Pages not Indexed after a successful Google Fetch
-
I am trying to understand why Google isn't indexing key content on my site.
www.BeyondTransition.com is indexed and new pages show up in a couple of hours.
My key content is 6 pages of information for each of 3,000 events (driven by MySQL on a WordPress platform).
These pages are reached via a search page; there is no direct navigation to them from the home page.
When I link to an event page from an indexed page, it doesn't show up in search results.
When I use Fetch as Googlebot in Webmaster Tools, the fetch is successful, but the page is then not indexed; or, if it does appear in results, the result points to the internal search page.
For example, http://www.beyondtransition.com/site/races/course/race110003/ has been fetched and submitted with links, but when I search for "BeyondTransition Ironman Cozumel" I get these results....
So what have I done wrong, and how do I go about fixing it? All thoughts and advice are appreciated.
Thanks
Denis
-
Thanks, Nick. I'll work through all of those points.
-
Not sure if it was a connection issue on my end or what, but that page takes a very long time to load, which could explain the lack of indexing of the pages linked from it.
Also, Google states that pages submitted with the Fetch as Googlebot tool are not guaranteed to be indexed, so there may be quite a delay on that. Are all pages included in your XML sitemap? An XML sitemap is the preferred way to notify Google of pages it may not otherwise find. Here is a link to more about XML sitemaps: https://www.google.com/support/webmasters/bin/answer.py?answer=156184&hl=en
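For reference, a minimal sitemap.xml entry for one of your event pages might look like the snippet below (the lastmod date and changefreq are placeholder values; generate real ones from your event database):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.beyondtransition.com/site/races/course/race110003/</loc>
    <lastmod>2012-10-09</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Once the file is live, submit its URL in Webmaster Tools under Sitemaps so Google knows where to find it.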
Even with an XML sitemap, Google may not immediately crawl many pages. Actually, indexing is rarely immediate. The frequency of crawling and speed of indexing has to do with many of the same factors as your ranking - quality, number of inbound links and pagerank, site performance, etc. If all your pages load quickly and you are in pretty good shape as far as links, etc, you could also try something to draw Google's attention to the new pages - like Tweeting a link or posting to Google+. That seems to "force" faster indexing in some cases.
I just checked your site with webpagetest.org and it is showing a load time of about 14 seconds. Tools.pingdom.com seemed to get hung up on some of the JavaScript and couldn't complete its test. Doing what you can to speed up the site and address any other "quality" issues will help with indexing, and with your performance in search engine results in general.
-
I'm not sure - I created this page yesterday as a map of all the races and added it to the bottom of the home page as 'site map'. I then submitted 'site map' to the index using Fetch in Webmaster Tools with the submit-links option. This morning it has been indexed, but after a quick sample, none of the links from it have been indexed (or appear in Google search results).
This suggests something is wrong with my page or page design - but what?
So a widget will help, but only once I've figured out the underlying problem.
-
Since it may not be practical to have every event linked through navigation, perhaps a widget that shows the last ten or so events would be good enough.
-
Hi Nick,
Thanks for the answer. I've got a WordPress plugin, but I don't think it captures everything, so I'm in the process of manually generating an XML sitemap - I think you have your finger on the answer to why the pages aren't crawled.
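Rather than writing all 3,000 entries by hand, I'm thinking of scripting it. A rough sketch of what I mean - the event IDs below are just examples following the race110003 URL pattern, and in practice the list would come from the MySQL database:

```python
# Sketch: build a sitemap.xml string from a list of event page URLs.
# The IDs here are illustrative; real IDs would be pulled from the database.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return sitemap.xml content listing the given page URLs."""
    entries = "\n".join(
        "  <url><loc>{}</loc></url>".format(escape(u)) for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries + "\n</urlset>"
    )

event_ids = ["race110003", "race110004"]  # example IDs only
urls = ["http://www.beyondtransition.com/site/races/course/{}/".format(i)
        for i in event_ids]
print(build_sitemap(urls))
```

The output could be saved as sitemap.xml at the site root and resubmitted whenever events are added.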
Navigation is on the list of things to do - it's working out the relative urgency.
I like the RSS idea - time for some research on how to do it.
-
You should use an XML sitemap to keep Google up to date with new pages; I could not find one for your site. If the event pages can only be found by using the search feature on your site, those pages will probably not be crawled and indexed. Fetch as Googlebot may work, but it probably will not be as fast as using a sitemap.xml file.
Would it be possible to have the event pages available through some kind of navigation in addition to being found by your site's search?
You might also consider setting up an RSS feed of the events and submitting it to FeedBurner and other RSS sites. That may be a little complicated, but it would also help speed up indexing.
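As a starting point, a bare-bones RSS 2.0 feed for new event pages could look something like this (the item title and link are illustrative, based on the Cozumel page mentioned earlier; new items would be appended as events are added):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>BeyondTransition Race Pages</title>
    <link>http://www.beyondtransition.com/</link>
    <description>Recently added event pages</description>
    <item>
      <title>Ironman Cozumel</title>
      <link>http://www.beyondtransition.com/site/races/course/race110003/</link>
    </item>
  </channel>
</rss>
```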