Should I set a max crawl rate in Webmaster Tools?
-
We have a website with around 5,000 pages, and for the past few months we've had our crawl rate set to maximum (we'd just started paying for a top-of-the-range dedicated server at the time, so performance wasn't an issue).
Google Webmaster Tools alerted me this morning that the crawl-rate setting has expired, so I'll have to set the rate manually again. In terms of SEO, is having the maximum rate a good thing?
I found this post on Moz, but it dates from 2008. Any thoughts on this?
-
At first I assumed that by manually setting the crawl rate to the maximum, Google would crawl my website faster and more frequently. Our website has tens of thousands of pages, so I didn't want Google missing any of it or taking a long time to index new content. New products are added to the website daily, and others are removed or changed.
I'll let Google decide
-
Yep, they're a little vague here! But the answer is: Google will crawl your site at whatever rate it wants (it's probably crawling Amazon 24/7), unless you limit how much it can crawl in Google Webmaster Tools. In that case, Google will still crawl your site at whatever rate it wants, unless that rate is higher than the limit you set, in which case it will throttle itself to your limit.
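In other words, the setting is only a ceiling, never a floor. A minimal sketch of that behavior (the function name and numbers are illustrative, not Google's actual algorithm):

```python
from typing import Optional

def effective_crawl_rate(googles_desired_rate: float,
                         manual_cap: Optional[float]) -> float:
    """Return the rate (requests/sec) Googlebot would actually use.

    Google picks its own rate; a manual cap only matters when it is
    lower than what Google wanted to do anyway.
    """
    if manual_cap is None:  # "Let Google optimize" -- no ceiling at all
        return googles_desired_rate
    return min(googles_desired_rate, manual_cap)

# If Google wants 2.0 req/s but your cap is 0.2, you get 0.2.
# If Google only wants 0.05 req/s, a 0.2 cap changes nothing.
```

So raising the cap to its maximum can't make Google crawl faster than it already wants to; it can only stop you from slowing it down.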
If you're anxious for Google to crawl your site more because a) you have something that's changed and you want Google to have it in their index, or b) because you're hoping it'll affect your rankings:
a) If there's specific information that you want Google to update its index with, submit the URL of the new or changed page via "Fetch as Googlebot" and then, once you fetch it, hit the "Submit to index" button to the right. I work on a DA 58 site, and fetching something as Googlebot updates the index within an hour.
b) How much Google crawls your site has to do with how important your site is; forcing Google to crawl your site more will not make it think your site is more important.
Hope this helps!
Kristina
-
Is selecting "Limit Google's maximum crawl rate" and then manually moving the rate to the highest setting (0.2 requests per second / 5 seconds between requests) a higher rate than selecting "Let Google optimize for my site (recommended)"? Google doesn't really expand on this! I want them to crawl at the very maximum, but they don't tell us how many requests per second, or how many seconds between requests, the optimized option uses.
-
You don't need to. Just let Google crawl at will. The only reason you would want to limit the crawl rate is if Googlebot is causing performance issues on your server (too much traffic at once). If you're not having any issues, then let Google crawl as many pages as it can.
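If you want to check whether Googlebot traffic is actually a load problem before capping anything, one rough approach is to count its hits per second in your access log. A minimal sketch, assuming the common Apache/nginx combined log format (the log path in the comment is hypothetical):

```python
import re
from collections import Counter

# Matches the timestamp (to the second) in a combined-format log line, e.g.
# 66.249.66.1 - - [10/Oct/2013:13:55:36 -0700] "GET /product HTTP/1.1" ...
TIMESTAMP = re.compile(r'\[(\d{2}/\w{3}/\d{4}:\d{2}:\d{2}:\d{2})')

def googlebot_hits_per_second(log_lines):
    """Count requests per second from lines whose user agent says Googlebot.

    Note: user-agent strings are easy to spoof; if it matters, verify
    real Googlebot IPs with a reverse-DNS lookup as well.
    """
    hits = Counter()
    for line in log_lines:
        if "Googlebot" in line:
            match = TIMESTAMP.search(line)
            if match:
                hits[match.group(1)] += 1
    return hits

# Example usage (hypothetical log path):
# with open("/var/log/nginx/access.log") as f:
#     print(googlebot_hits_per_second(f).most_common(5))
```

If the busiest seconds are nowhere near what your server can handle, there's no reason to set a cap at all.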