503 Error or 200 OK?
-
So, in a Moz crawl and a Screaming Frog crawl, I'm getting some 503 Service Unavailable responses on some pages.
So I go to the pages in question, and the MozBar is showing a 200 OK. The SEOBook HTTP status checker (http://tools.seobook.com/server-header-checker/) also shows a 200 OK.
What gives?
The only reason I'm looking at this is because rankings plummeted a couple of weeks ago.
Thanks!
UPDATE
So, I decided to use the MozBar to set the user agent to Googlebot, and when I tried to access the pages in question I received this message. I don't think this is an issue... anyone else have much experience here?
Your access to this site has been limited
Your access to this service has been temporarily limited. Please try again in a few minutes. (HTTP response code 503)
Reason: Fake Google crawler automatically blocked
Important note for site admins: If you are the administrator of this website note that your access has been limited because you broke one of the Wordfence firewall rules. The reason your access was limited is: "Fake Google crawler automatically blocked".
If this is a false positive, meaning that your access to your own site has been limited incorrectly, then you will need to regain access to your site, go to the Wordfence "options" page, go to the section for Firewall Rules and disable the rule that caused you to be blocked. For example, if you were blocked because it was detected that you are a fake Google crawler, then disable the rule that blocks fake google crawlers. Or if you were blocked because you were accessing your site too quickly, then increase the number of accesses allowed per minute.
If you're still having trouble, then simply disable the Wordfence firewall and you will still benefit from the other security features that Wordfence provides.
If you are a site administrator and have been accidentally locked out, please enter your email in the box below and click "Send". If the email address you enter belongs to a known site administrator or someone set to receive Wordfence alerts, we will send you an email to help you regain access. Please read our FAQ if this does not work.
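The MozBar experiment above can also be reproduced from a script: request the same URL with a spoofed Googlebot User-Agent and with a normal browser User-Agent, then compare status codes. This is a minimal standard-library sketch; the URL is a placeholder for the affected page.

```python
import urllib.request
import urllib.error

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def status_for(url, user_agent):
    """Return the HTTP status code the server sends for this User-Agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # urllib raises on 4xx/5xx responses; the code is on the exception.
        return e.code

# Usage (requires network access; substitute the affected page):
# status_for("https://www.example.com/", BROWSER_UA)    # browser sees 200
# status_for("https://www.example.com/", GOOGLEBOT_UA)  # a WAF may send 503
```

If the browser UA gets a 200 while the Googlebot UA gets a 503, that confirms a firewall rule, not the page itself, is responsible.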
-
The problem is the plugin you are using, Wordfence. It is probably flagging the crawls from Screaming Frog and Moz as DoS attacks because of the volume of requests coming from the same IPs.
You could either see if the plugin supports IP whitelisting, or remove it and use CloudFlare instead, which is free and offers an even more robust security option plus an included CDN.
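For background on why the MozBar test tripped the rule: Wordfence's exact implementation isn't shown here, but the documented way to verify a real Googlebot is forward-confirmed reverse DNS — the requesting IP's reverse lookup must end in googlebot.com or google.com, and the forward lookup of that hostname must point back to the same IP. A toolbar spoofing only the User-Agent string fails this check. A sketch of that verification:

```python
import socket

def hostname_is_google(hostname):
    """True if a reverse-DNS hostname belongs to Google's crawler."""
    return hostname.endswith(".googlebot.com") or hostname.endswith(".google.com")

def is_real_googlebot(ip):
    """Forward-confirmed reverse DNS check (requires DNS access)."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse lookup
        if not hostname_is_google(hostname):
            return False
        # Forward lookup must round-trip to the original IP.
        forward_ips = socket.gethostbyname_ex(hostname)[2]
        return ip in forward_ips
    except (socket.herror, socket.gaierror):
        return False
```

Since the suffix check requires the dot, a lookalike such as `fakegooglebot.com` does not pass.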