Would you rate-control Googlebot? How much crawling is too much crawling?
-
One of our sites is very large - over 500M pages. Google has indexed 1/8th of the site - and they tend to crawl between 800k and 1M pages per day.
A few times a year, Google will significantly increase their crawl rate - overnight hitting 2M pages per day or more. This creates big problems for us, because at 1M pages per day Google is consuming 70% of our API capacity, and the API overall is at 90% capacity. At 2M pages per day, 20% of our page requests are 500 errors.
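For what it's worth, Googlebot is documented to slow down when it starts receiving 503 or 429 responses, so one option is to cap its request rate at the edge and shed the excess deliberately, rather than letting the whole API tip over into 500s for everyone. A rough nginx sketch, assuming nginx fronts the pages (the rate is an illustrative assumption, roughly 1M requests/day):

```nginx
# Key Googlebot requests into a rate-limit bucket. Everyone else gets an
# empty key, and nginx skips rate accounting for empty keys.
map $http_user_agent $limit_bot {
    default       "";
    "~*Googlebot" "googlebot";
}

limit_req_zone $limit_bot zone=googlebot:1m rate=12r/s;  # ~1M requests/day
limit_req_status 503;  # Googlebot treats 503 as "back off and retry later"

server {
    location / {
        limit_req zone=googlebot burst=100;
        # ... normal page handling ...
    }
}
```

Only Googlebot is ever keyed into the zone here; regular visitors never hit the limit.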
I've lobbied for an investment in overhauling the API configuration to allow for more Googlebot bandwidth without compromising user experience. My tech team counters that it's a wasted investment, as Google will simply crawl up to whatever capacity we provide.
Questions to Enterprise SEOs:
*Is there any validity to the tech team's claim? I thought Google's crawl rate was based on a combination of PageRank and the frequency of page updates. That would imply there is some upper limit - one we perhaps haven't reached yet - at which the crawl rate would stabilize.
*We've asked Google to limit our crawl rate in the past. Is that harmful? I've always viewed a robust crawl rate as a good problem to have.
*Is 1.5M Googlebot API calls a day desirable, or something any reasonable Enterprise SEO would seek to throttle back?
*What about setting a longer refresh rate in the sitemaps? Would that reduce the daily crawl demand? We could increase it to a month, but at 500M pages Google could still have a ball at the 2M pages/day rate. (A sample sitemap entry is sketched below.)
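For reference on that last point, the sitemap levers are `<changefreq>` and `<lastmod>`, and Google treats both as hints rather than directives - so a longer refresh interval may trim crawl demand but won't cap it. A minimal entry (the URL is illustrative):

```xml
<!-- Inside the usual <urlset> wrapper -->
<url>
  <loc>https://www.example.com/widgets/12345</loc>
  <lastmod>2013-04-01</lastmod>
  <changefreq>monthly</changefreq>
</url>
```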
Thanks
-
I agree with Matt that there can probably be a reduction of pages, but that aside, how much of an issue this is comes down to which pages aren't being indexed. It's hard to advise without seeing the site - are you able to share the domain? If the site has been around for a long time, that seems like a low level of indexation. Is this a site where the age of the content matters - Craigslist, for example?
Craig
-
Thanks for your response. I get where you're going with that. (Ecomm store gone bad.) It's not actually an ecomm site, FWIW. And I do restrict parameters - the list is about a page and a half long. It's a legitimately large site.
You're correct - I don't want Google to crawl the full 500M. But I do want them to crawl 100M. At the current crawl rate we limit them to, it's going to take Google more than 3 months to get to each page a single time. I'd actually like to let them crawl 3M pages a day. Is that an insane amount of Googlebot bandwidth? Does anyone else have a similar situation?
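The back-of-envelope behind that 3-month figure, as a quick sketch:

```python
pages_wanted = 100_000_000    # the subset I actually want crawled
pages_per_day = 1_000_000     # roughly what Googlebot gets through today

print(pages_wanted / pages_per_day)  # 100.0 days -> one full pass takes 3+ months
```

At 3M pages/day the same pass would take about 33 days, which is why I'd like the extra headroom.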
-
Gosh, that's a HUGE site. Are you having Google crawl parameter pages with that? If so, that's a bigger issue.
I can't imagine the crawl issues with 500M pages. A site:amazon.com search only returns 200M, and ebay.com returns 800M, so your site sits somewhere between the two? (I understand both probably have far more pages - they're just not returned as indexed.)
You always WANT a full site crawl - but your techs do have a point. Unless there's an absolutely necessary reason to have 500M indexed pages, I'd also seek to cut that to what you want indexed. That sounds like a nightmare ecommerce store gone bad.
Related Questions
-
My last site crawl shows over 700 404 errors all with void(0 added to the ends of my posts/pages.
Hello, my last site crawl shows over 700 404 errors, all with void(0 added to the ends of my posts/pages. I have contacted my theme company, but I'm not sure what could have done this. Any ideas? The original posts/pages are still correct and working; it just looks like duplicates were created with void(0 appended to each URL. Questions: There's no way to undo this, correct? Do I have to set up a redirect for each of these? Will this hurt my rankings and domain authority? Any suggestions would be appreciated. Thanks, Wade
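If the pattern really is a literal "void(0" tacked onto otherwise-valid URLs, a single rewrite rule can 301 all 700 at once rather than one redirect per page. A hedged .htaccess sketch, assuming Apache and that exact URL shape (test it against a few of the broken URLs first):

```apache
RewriteEngine On
# 301 any URL that ends in "void(0" back to the clean path
RewriteRule ^(.*?)void\(0$ /$1 [R=301,L]
```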
Intermediate & Advanced SEO | neverenoughmusic.com
-
Does content revealed by a 'show more' button get crawled by Google?
I have a div on my website with around 500 words of unique content in it. When the page is first visited, the div has a fixed height of 100px, showing a couple of hundred words and fading out to white, with a "show more" button which, when clicked, increases the height to reveal the full content. My question is: does Google crawl the content in that div when it renders the page, or disregard it? It's all in the source code. Or worse, would they consider this cloaking or hidden content? It's only there to make the site more usable for customers, so I don't want to get penalised for it. Cheers
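For reference, a minimal sketch of the pattern described (element and class names are illustrative). Because all 500 words sit in the initial HTML and are only visually clipped by CSS, a crawler can read the full text without ever clicking the button:

```html
<style>
  .collapsed { max-height: 100px; overflow: hidden; }
</style>

<!-- The full copy is in the source; only its display is truncated -->
<div id="product-copy" class="collapsed">
  ...all 500 words of unique content...
</div>

<button onclick="document.getElementById('product-copy').classList.remove('collapsed')">
  Show more
</button>
```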
Intermediate & Advanced SEO | SEOhmygod
-
How much change should you make to your site in one go?
Hey everyone, so we are currently working on a new website and are in the final stages right now. We also have plans for a brand name change, and there is some internal debate over whether we should: a) roll out the new site now and hold off on the rebrand - let the redirects kick in and the site bed in, so to speak - then look at a domain name change when the dust settles, or b) roll out the new site with the domain name change too - an all-in change. A bit of background on the changes being made: the new website will have some structural changes, but the main blog content will remain the same - this is where we get the majority of our traffic. The blog will have a slight page-layout change, but the core content, structure, URLs, etc. will be exactly the same. The core website surrounding the blog will change, with 301 redirects from old, out-of-date content pages consolidated into fewer, more relevant pages. I hope I've explained enough here; if not, please let me know and I'll add more detail.
Intermediate & Advanced SEO | hotchilidamo
-
Magento E-Commerce Crawl Issues
Hi guys, first post here! I am responsible for a Magento e-commerce store, and there are a few crawl issues and potential solutions I am working on that I would like some advice on. Old product pages: the majority of our stock is seasonal, so when a product sells out it is usually not going to come back into stock. The standard approach on Magento sites, however, is to leave the page live but remove the product from the category pages. Users can still find these pages from the search engines, but they are orphaned (not linked to from anywhere else on the site), and it's not totally clear the products are out of stock (the page just doesn't show the size pulldown or "Add to Basket" button). There is no process in place to 301 redirect these pages either. My solution to this problem is to: 1. Change the design of these pages so a clear out-of-stock message is shown to users, and suggest related products to reduce bounce rates. I was also planning on linking to these products from an "Out of Stock" page on the site so they aren't orphaned - but is that required, do you think? 2. When I know for sure (e.g. after a month) that the product will not be coming back into stock (e.g. via a customer return or refund), 301 redirect the product page back to its category page. How do other users handle 301 redirects in Magento? I would like an easy-to-use system. Crawl errors identified in Google Webmaster Tools: in the last 2 weeks there has been a sharp increase in the number of soft-404 pages identified on the website. When I inspect these pages, they seem to be categories and subcategories that no longer have any products in them. However, I don't want to delete these pages, as new products might come in and go onto them - so how should I approach this? One suggestion I've thought of is to put related products on these pages. Any better ideas? Thanks, Graeme
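On the 301 question: one low-maintenance option is to keep the product-to-category mapping at the web server rather than inside Magento, so retiring a product is a one-line edit. A hedged nginx sketch with made-up URLs (the same mapping works as RewriteRules in .htaccess if Apache fronts the store):

```nginx
# Hypothetical map from retired product URLs to their old category pages
map $uri $retired {
    default                     "";
    /sun-hat-classic-2012.html  /accessories/hats;
    /linen-shirt-blue.html      /mens/shirts;
}

server {
    if ($retired) {
        return 301 $retired;
    }
    # ... normal Magento handling ...
}
```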
Intermediate & Advanced SEO | graeme1994
-
Will a disclaimer affect crawling?
Hello everyone! My German users will have to be shown a disclaimer to comply with German law. Now my question is: will a disclaimer affect crawling? What's the best practice here? Is there anything I should take special care over? And what's the best disclaimer technique - a plain HTML page, or something overlapping the site? Thank you all!
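For what it's worth, the pattern that usually keeps content crawlable is to serve the full page as normal and layer the disclaimer over it with JavaScript, rather than gating the HTML behind an interstitial page. A minimal sketch (the IDs and storage key are illustrative):

```html
<!-- The page content stays in the HTML for crawlers -->
<div id="disclaimer-overlay" hidden>
  <p>Legally required disclaimer text...</p>
  <button id="accept-disclaimer">Accept</button>
</div>

<script>
  // Show the overlay only to visitors who haven't accepted yet
  if (!localStorage.getItem("disclaimerAccepted")) {
    document.getElementById("disclaimer-overlay").hidden = false;
  }
  document.getElementById("accept-disclaimer").addEventListener("click", function () {
    localStorage.setItem("disclaimerAccepted", "1");
    document.getElementById("disclaimer-overlay").hidden = true;
  });
</script>
```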
Intermediate & Advanced SEO | NelsonF
-
If I had an issue with a friendly-URL module and lost all my rankings, will they return - now that the issue is resolved - the next time I'm crawled by Google?
I have "Magic SEO URLs" installed on my Zen Cart site, except that, for some reason no one can explain, the module's files were disabled, so my static links went back to dynamic (index.php?**********) etc. The issue has since been resolved, but in that time Google must have crawled my site and I lost all my rankings - I'm nowhere to be found in the top 50. Did this really cause as extravagant an SEO issue as my web developers told me? Can I expect my rankings to return the next time my site is crawled by Google?
Intermediate & Advanced SEO | Pete79
-
Controlling PageRank vs flat site architecture
Hey all. Here's the scenario: I have a pretty trusted site with a relatively high PR. The navigation menu has around 300 links, because it is a CSS menu that drills down into subcategories. Now, would restricting the number of links in this menu be beneficial? I am not worried about subcategory pages not being crawled or indexed, but I am concerned that subcategory pages will not receive as much PageRank if they are not linked to directly from the home page, lowering their ranking potential. Even new pages receive a PR of 5 if linked to from the home page. But I'm also thinking that toning down the menu size could be beneficial by funneling more PageRank to category pages and increasing the likelihood of ranking for some core head/middle terms. I have seen sites that externalize the menu in JavaScript files and disallow it in robots.txt to prevent too much PageRank from linking out, but SEO isn't really one-solution-fits-all in my experience, so I may run a test. Externalizing the menu may also increase the relevance of pages, because I won't have a bunch of content on the page that isn't relevant to that page's specific keywords. Anyone with experience in this arena? I would love to hear your input. Thanks
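For reference, the externalized-menu pattern mentioned above looks roughly like this (the file path is illustrative): the page ships an empty container, a script injects the 300 links client-side, and robots.txt blocks the script so crawlers never fetch it:

```html
<!-- In the page template: an empty nav filled in by a blocked script -->
<div id="main-nav"></div>
<script src="/js/menu.js"></script>
```

```
# robots.txt
User-agent: *
Disallow: /js/menu.js
```

Whether that trade-off is wise is exactly the open question above - blocked links pass nothing at all, not just less PageRank.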
Intermediate & Advanced SEO | JeremyNelson58
-
Working out exactly how Google is crawling my site if I have loooots of pages
I am trying to work out exactly how Google is crawling my site, including its entry points and path from there. The site has millions of pages and hundreds of thousands indexed. I have simple log files with a timestamp and the URL Googlebot was on. Unfortunately there are hundreds of thousands of entries even for one day, and as it is a massive site I am finding it hard to work out the spider's paths. Is there any way, using the log files and Excel or other tools, to work this out simply? Also, I was expecting the bot to go through each level almost instantaneously, e.g. main page -> category page -> subcategory page (with near-identical timestamps), but this does not appear to be the case. Does the bot follow a path right down to the deepest level it can reach (or is allowed to) for that crawl, and then return to the higher-level category pages at a later time? Any help would be appreciated. Cheers
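On the "log files and Excel or other tools" point: a short script can collapse hundreds of thousands of entries into a per-hour, per-depth summary, which makes the crawl pattern visible at a glance. A sketch in Python, assuming a hypothetical "date time URL" line format - adjust the split to the real log layout:

```python
from collections import Counter

# Assumed line format: "2012-05-01 13:45:02 /some/path"
depth_by_hour = Counter()

with open("googlebot.log") as log:
    for line in log:
        try:
            date, time, url = line.split()
        except ValueError:
            continue  # skip malformed lines
        # Depth = number of path segments: "/" -> 0, "/cat" -> 1, "/cat/sub" -> 2
        depth = 0 if url == "/" else url.strip("/").count("/") + 1
        depth_by_hour[(time[:2], depth)] += 1

for (hour, depth), hits in sorted(depth_by_hour.items()):
    print(f"{hour}:00  depth {depth}: {hits} hits")
```

In general Googlebot crawls from a prioritized queue rather than walking the tree top-down, so interleaved levels and scattered timestamps are expected; a summary like this usually shows that directly.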
Intermediate & Advanced SEO | soeren.hofmayer