When do you use 'Fetch as Google' in Google Webmaster Tools?
-
Hi,
I was wondering when and how often you use 'Fetch as Google' in Google Webmaster Tools, and whether you submit individual pages or the main URL only.
I've googled it but only got more confused. I'd appreciate it if you could help.
Thanks
-
I'd hazard that if the new product had been in the sitemap, it would also have appeared in the SERPs. We submit sitemaps every day and products are in the index within hours.
I guess the GWMT manual submission is fine if you need to manually fix some pages, but then it raises the question of why your SEO efforts couldn't make those pages visible to bots in the first place (via link structure or sitemaps).
-
Thanks Gerd, it's a bit more clear now. Appreciate your help.
-
Thanks Frank, appreciate your help
-
Thank you so much for your reply. It's a bit clearer to me now what to do. Appreciate your help.
-
Sida, what I meant is that I use the Google Webmaster Tools function "Fetch as Google" only as a diagnostic function, to see how my website responds to a request from Googlebot.
It seems that people fetch URLs via the GWMT "Fetch as Google" and then use the function to submit them to the index. I don't think that's a good idea: any new content should either be discoverable (via SEO) or submitted to Google automatically via a sitemap (hinted at in robots.txt).
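For reference, the robots.txt "hint" mentioned above is just a `Sitemap:` line pointing at the sitemap file; the domain and path below are placeholders, not a specific site from this thread:

```
User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```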
-
Thanks Gerd. Would you mind clarifying a bit more what you mean by 'diagnostic tool'? And if you could recommend one as well, that would be fantastic.
-
Use it as a "diagnostic tool" to check how content or error pages are retrieved by the bot. I look at it specifically from a content and HTTP-status perspective.
I would not use it to submit URLs; for that you should use a sitemap file instead. Think of "Fetch as Google" as a troubleshooting tool, not as a way to submit pages to an index.
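A rough local equivalent of that diagnostic check is to request a URL with Googlebot's user-agent string and look at the HTTP status you get back. This is only a sketch: the user-agent string and example URL are assumptions, and real Googlebot does much more (rendering, redirect handling, etc.) than a plain fetch.

```python
# Sketch of a "fetch as Googlebot" style check: request a URL with a
# Googlebot user-agent and report the HTTP status. Not equivalent to
# the real GWMT feature, which also renders the page.
from urllib.request import Request, urlopen
from urllib.error import HTTPError

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch_status(url: str, timeout: float = 10.0) -> int:
    """Return the HTTP status code seen when fetching `url` as Googlebot."""
    req = Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    try:
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as err:  # 4xx/5xx responses still carry a status code
        return err.code

def diagnose(status: int) -> str:
    """Map a status code to a rough crawl diagnosis."""
    if 200 <= status < 300:
        return "ok"
    if status in (301, 302, 307, 308):
        return "redirect"
    if status == 404:
        return "not found"
    if status >= 500:
        return "server error"
    return "check manually"

# Usage (requires network access):
#   code = fetch_status("https://www.example.com/")
#   print(code, diagnose(code))
```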
-
Here's an oh-by-the-way.
One of our manufacturers came out with a product via a slow roll literally within the last five days. They have not announced its release to the retailers. I happened to stumble on it while visiting their site to update products.
I did a search on the term and found I wasn't the only one unaware of it, so I scrambled to add the product to the site, promote it, and submit it to the index late Tuesday.
It's Thursday and it's showing in SERPs.
Would it have appeared that quickly if I hadn't submitted it via fetch? I don't know for sure, but I'm inclined to think not. Call me superstitious.
Someone debunk the myth if you can. One less thing for me to do.
-
If I add a lot of products/articles I just do a sitemap re-submit, but if I only add one product or article I just wait until the bots crawl to those links. It usually takes a couple of days before it gets indexed. I never really used Fetch as Google unless I made changes to the structure of the website.
Hope this helps.
-
I submit every product and category I add.
Do I have to? No. Is it necessary? No - we have an XML sitemap generator. Google is like Big Brother: he will find you. Fetch is a tool that you can use or not use.
Will Google find it faster and will you show up more quickly in search results if you submit it? I don't know.
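The XML sitemap that keeps coming up in this thread is a simple format, so it's easy to see what a generator produces. Here is a minimal sketch; the URLs are placeholders, and a real generator would pull them from the store's database and usually include per-URL fields like `<lastmod>` as well:

```python
# Minimal XML sitemap generator sketch (sitemaps.org 0.9 schema).
# URLs below are placeholders for illustration only.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a sitemap.xml string for the given list of page URLs."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

sitemap = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/products/new-widget",
])
print(sitemap)
```

Once the file is uploaded (e.g. to `/sitemap.xml`), it can be submitted in GWMT or referenced from robots.txt, and re-submitting it is what the posters above do when adding batches of products.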
-
Thank you AWC, I've read that article already, but I'm still not sure how often this feature should be used. I should be more specific: if you have an ecommerce website and add a product every 2-3 days, would you submit the link every time you add a new item? When you publish a blog article on your website, would you submit it immediately?
-
I think GWT explains it very well.
https://support.google.com/webmasters/answer/158587?hl=en
I typically use it to submit new pages to the index, although it's probably not necessary if you have an XML sitemap. Not certain on that one.
More tech savvy folks probably use it to also check the crawlability and "health" of pages.