Has anyone else gotten strange WMT errors recently?
-
Yesterday, one of my sites got this message from WMT:
"Over the last 24 hours, Googlebot encountered 1 errors while attempting to retrieve DNS information for your site. The overall error rate for DNS queries for your site is 100.0%."
I did a fetch as Googlebot and everything seems fine. Also, the site is not seeing a decrease in traffic.
This morning, a client for which I am doing some unnatural links work emailed me about a site of his that got this message:
"Over the last 24 hours, Googlebot encountered 1130 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%."
His robots.txt looks fine to me.
Is anyone else getting messages like this? Could it be a WMT bug?
-
DNS servers are just like any other server, Marie - they can have outages, downtime, and configuration problems. If Googlebot visited while your DNS server was burping, it might have received no response, hence the error warning. By the time you checked, the server may have settled down.
There are a number of best practices for good DNS hygiene, but my primary one is to monitor the uptime of your DNS the same way you do the uptime of your website. I use my paid subscription to Pingdom Tools to do this as one of my checks, but I'm sure many other uptime monitoring tools can do it as well.
The reason I monitor is that it can be a really helpful early warning system for potential upcoming severe problems (and can help explain otherwise unexplained site outages). With one client, we saw a steadily increasing number of errors over a few days (over 40 outages on the last day), leading us to change DNS hosting before things could fail completely and leave us in the lurch.
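To illustrate the kind of check a monitoring service runs under the hood, here's a minimal sketch of a DNS probe in Python. The hostname and timeout are placeholders, and this only tests resolution from one location - a real service like Pingdom probes from many locations, keeps history, and alerts you:

```python
# Minimal DNS uptime probe (sketch). "example.com" and the 3-second
# timeout are placeholder assumptions -- substitute your own domain
# and thresholds.
import socket
import time

def check_dns(hostname, timeout=3.0):
    """Attempt one DNS lookup; return (succeeded, elapsed_seconds)."""
    socket.setdefaulttimeout(timeout)
    start = time.monotonic()
    try:
        socket.gethostbyname(hostname)
        return True, time.monotonic() - start
    except socket.gaierror:
        # Resolution failed: server down, misconfigured, or timed out
        return False, time.monotonic() - start

if __name__ == "__main__":
    ok, elapsed = check_dns("example.com")
    print(f"DNS {'OK' if ok else 'FAILED'} in {elapsed:.3f}s")
```

Run something like this every minute from a scheduler and log the failures, and you get exactly the early-warning trend data described above (e.g. the climb to 40+ outages a day).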
In addition, I always recommend against having the DNS hosted on the same server as the website, as would happen with cPanel DNS hosting, for example. The reason being: if you have severe, prolonged server issues, you can't get at your DNS to point it quickly somewhere else temporarily (even if just to host an explanatory error message).
I also like to ensure the DNS is hosted somewhere with good geographic redundancy, so even if one nameserver goes out, there are still multiple backups to keep things rolling. No matter how good your website's uptime is, if your DNS dies, you're still offline.
My guess is the DNS server was having temporary issues that resolved by the time you checked it. I'd want to be sure that isn't happening on a regular basis (relying on Google to report issues isn't nearly accurate or timely enough).
As far as the robots.txt - do you have uptime monitoring on that site? I can't count the number of new clients who thought things were fine with their website, when in fact they were having constant short outages that went unnoticed as they weren't on their own site constantly enough to catch it. I always recommend a system that checks at 1-minute intervals for just this reason. If you don't have independent verification that the site was fully up, you can't really discount the WMT warnings safely.
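For the robots.txt side, the check is just as simple to sketch. This is a rough Python example with a placeholder URL - a real monitor would run it every minute, from several locations, and alert only on consecutive failures to filter out blips:

```python
# Availability check for robots.txt (sketch). The base URL is a
# placeholder assumption -- point it at your own site.
from urllib.request import urlopen
from urllib.error import URLError, HTTPError

def robots_txt_status(base_url, timeout=5.0):
    """Fetch /robots.txt; return (http_status, reachable)."""
    url = base_url.rstrip("/") + "/robots.txt"
    try:
        with urlopen(url, timeout=timeout) as resp:
            return resp.status, True
    except HTTPError as e:
        return e.code, True   # server answered, but with an error status
    except URLError:
        return None, False    # DNS failure, timeout, or refused connection

if __name__ == "__main__":
    status, reachable = robots_txt_status("https://www.example.com")
    print(f"robots.txt -> status={status}, reachable={reachable}")
```

If a check like this logged 5xx responses or unreachable results around the time of the WMT message, you'd know the warning was real rather than a reporting bug.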
Lemme know if you want more info on uptime monitoring services & methods.
Paul
-
Yikes. That would not be good!
-
Wait until they tell you that they're taking your AdSense account down in 72 hours... and you know they have an algo problem... and when you point that out, a noob employee who doesn't know the rules you display ads under tells you that you're down to 48 hours.
-
Thx. All checks out well on the DNS check. I'm calling this a bug.
-
I know it! I've had much higher traffic spikes than anything I've seen recently, and still they sent this message. It's bizarre! But definitely not one that worries me. I'm like you - I don't get excited when I see a message in that inbox...
Anyway just thought it fit because it was so strange and seemingly unnecessary.
-
If you checked and all is fine, it may be a temporary bug; that happens from time to time.
The 100% rate could be because there was only one crawl attempt, hence a 100% error rate (1 of 1) - just wait for the next crawl. In the meantime, check your domain with a DNS checker - you can use www.dnsstuff.com, www.intodns.com, or dnscheck.pingdom.com/ - to make sure everything is working correctly, or to see if there's anything you need to take to your hosting provider.
-
Haha Jesse! I'd rather have that message. That is a weird one. I have had things go super viral and I've never had a message telling me of an INCREASE in traffic. Some of them have been increased Google searches too...not just direct or Facebook visits.
I have had a message that there was a decrease in traffic for my top URL once. This was when one of my sites had a slight Panda hit.
I think the messages are very random.
Whenever I see a (1) next to messages in WMT my heart races a little. It's usually a good thing because I am waiting to hear back from a reconsideration request for a client though.
-
Well I got this message recently:
Search results clicks for http://www.---------- have increased significantly.
"This message is not indicative of any problem in your site. It is simply to inform you that the number of clicks that one of your pages receives has increased recently. If you have just added new content, this may indicate that it has become more popular on Google. The number of clicks that your site receives from Google can change from day to day for a variety of factors, including automatic algorithm updates."
I found it strange because everything looks about normal. Sure, we had a bit better day than usual, but just barely. Nothing I'd even blink twice at.
It's strange because this is only the second time I've ever received a message in GWT. But hey, I'm not complaining about this one.
Probably unrelated to what you're describing but just thought I'd share.
Good luck!