Seeking help correcting a large number of 404 errors; 95% traffic halt
-
Hi, the following GWT screen tells part of the story:
site: http://bit.ly/mrgdD0
http://www.diigo.com/item/image/1dbpl/wrbp
On about Feb 8 I decided to fix a large number of 'duplicate title' warnings being reported in GWT's "HTML Suggestions". These were for URLs which differed only in parameter case and which had canonical tags, but were still reported as duplicates in GWT.
My traffic had been steady at about 1000 clicks/day.
At midnight on 2/10, Google traffic completely halted, down to 11 clicks/day.
I submitted a reconsideration request and was told 'no manual penalty'.
Also, the sitemap indexes in GWT showed 'pending' around the clock starting then.
By about the 18th, the 'duplicate titles' count had dropped to about 600. The next day, traffic hopped right back to about 800 clicks/day for a week, then stopped again a week later, on the 26th, down to 10/day.
I then noticed that GWT was reporting 20K page-not-found errors; this has now grown to 35K such errors!
I realized that bogus internal links were being generated because I had failed to disable PHP warning messages, so I disabled PHP warnings and fixed what I thought was the source of the errors.
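(For reference, the change amounted to the standard production php.ini settings: warnings go to the error log, never into the rendered HTML, where a stray file path in a warning can be picked up by crawlers as a relative link.)

```ini
; php.ini - standard production error settings
display_errors = Off   ; never print warnings/notices into the rendered page
log_errors = On        ; keep them in the server's error log instead
```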
However, the not-found count continues to climb -- and I don't know where these bad internal links are coming from, because the GWT report lists these link sources as 'unavailable'.
I've been through a similar problem before; last year it took Google four months to digest all the bogus pages and recover. If I have to wait that long again I will lose a lot of $$.
Assuming that the large number of internal 404 errors is the reason for the sudden shutoff:
How can I verify the source of these internal links, given that Google says the source pages are 'unavailable'?
Most critically, how can I do a 'RESET' and have Google re-spider my site, or block the signature of these URLs, in order to get rid of these errors ASAP?
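(One server-level option, if the URL signature can be pinned down, would be answering those requests with 410 Gone, which Google generally treats as a stronger removal signal than a 404. A hypothetical nginx fragment, with an invented pattern:)

```nginx
# pattern is illustrative only -- replace with the real bogus-URL signature
location ~ ^/warning-fragment/ {
    return 410;
}
```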
thanks
-
Hello Rand, I've been facing a similar problem with my site. I'd really appreciate your response here - http://www.seomoz.org/q/help-fixing-the-traffic-drop-that-started-on-4-september-2012.
-
I wouldn't feel too confident that the numbers and dates Google's showing you are precise or accurate. In fact, we've seen times when GWMT is considerably off. I'd watch how Google crawls your site and look at search traffic to your pages - those are likely leading indicators that things are/will be fixed.
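If you have raw access logs, that watching can be scripted. Here's a rough sketch (assumes the common Apache/nginx "combined" log format and that the crawler identifies itself as Googlebot) that lists every URL Googlebot hit that came back 404, plus the referer when one was sent:

```python
import re

# Combined log format:
# ip - - [date] "METHOD /path HTTP/1.x" status bytes "referer" "user-agent"
LINE_RE = re.compile(
    r'"[A-Z]+ (?P<path>\S+) HTTP/[^"]*" (?P<status>\d{3}) \S+ '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def googlebot_404s(lines):
    """Yield (path, referer) for every Googlebot request that returned 404."""
    for line in lines:
        m = LINE_RE.search(line)
        if m and m.group("status") == "404" and "Googlebot" in m.group("agent"):
            yield m.group("path"), m.group("referer")

# Illustrative log line, not real data
sample = [
    '1.2.3.4 - - [01/Mar/2012:10:00:00 +0000] "GET /bogus/page HTTP/1.1" '
    '404 512 "http://example.com/source" "Mozilla/5.0 (compatible; Googlebot/2.1)"'
]
print(list(googlebot_404s(sample)))
```

Run it over the real access log (e.g. `open("access.log")` instead of `sample`) and the referer column is exactly the "source of these internal links" GWT is refusing to show.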
-
Thanks for the replies, guys. I had run Xenu on the site and it found no broken links... but the GWT error count continues to climb, and as of today Google has released a MUCH improved timeline view for the error count. Problem is, it's still showing 58K errors as of yesterday, and climbing, long after I fixed them, and it won't show me where it thinks the source is...
These errors are all on internal pages, BTW.
Here's the new Google view:
http://awesomescreenshot.com/0ef1gy6c7
The new GUI also includes a way to mark errors 'fixed' -- one by one! I need to mark 60 thousand at once!
Also, I can see the date these errors started appearing, and it just doesn't make sense, given that it's the same day my traffic started reappearing as well.
-
I agree with Rand's suggestions. I just ran a Screaming Frog crawl of the whole site (10,233 links, 8,997 URLs) and got no 404s, so I think it's pretty safe to assume you've fixed the 404 issue. Here's the output of the crawl in case you'd like it for reference: http://www.sendspace.com/file/7zui0v
I'd say:
- Definitely clean up and resubmit your XML sitemap
- Double-check your backlink profile with Open Site Explorer and MajesticSEO to be sure there aren't sites linking to URLs that no longer exist. If you find any, make sure to 301 redirect them: take all the target URLs and dump them into Screaming Frog in list mode. All the links from OSE point to your homepage, so they aren't an issue; I don't have access to Majestic right now, so I couldn't run those for you.
- You can now submit pages in Google Webmaster Tools as well, in the Fetch as Googlebot section, so consider submitting some of the new pages the site generates, in addition to your reconsideration request, to help Google re-crawl and discover that the 404s are gone.
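A quick way to pre-sort that backlink target list before the Screaming Frog run is to script the status checks. This is a rough Python sketch, not a definitive process; the URL list is a placeholder and the status-to-action mapping is just plain HTTP semantics:

```python
import http.client
from urllib.parse import urlsplit

def triage(status):
    """Map an HTTP status code to the action a backlink target URL needs."""
    if status == 200:
        return "ok"
    if status in (301, 308):
        return "already redirected"
    if status in (302, 307):
        return "make the redirect permanent (301)"
    if status in (404, 410):
        return "needs a 301 to a live page"
    return "investigate"

def status_of(url, timeout=10):
    """HEAD one URL and return its raw status code (redirects not followed)."""
    parts = urlsplit(url)
    conn = http.client.HTTPConnection(parts.netloc, timeout=timeout)
    path = parts.path or "/"
    if parts.query:
        path += "?" + parts.query
    conn.request("HEAD", path)
    try:
        return conn.getresponse().status
    finally:
        conn.close()

if __name__ == "__main__":
    # hypothetical target list, e.g. exported from OSE or Majestic
    for url in ["http://example.com/old-page"]:
        print(url, "->", triage(status_of(url)))
```

Anything that comes back "needs a 301 to a live page" goes straight into the redirect map.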
Good luck man and please let us know if nothing changes after you implement these fixes.
-Mike
-
Hi Mark - wow, sounds really rough. I've got a few suggestions:
- First off, you need to make 100% sure that you've actually fixed the issue: that the internal links point to the right places AND that any old URLs which may have had internal/external links either rel=canonical or 301 redirect to the correct, updated locations.
- You might try using a few tools to verify this, including the SEOmoz Crawl Test http://pro.seomoz.org/tools/crawl-test and Screaming Frog: http://www.screamingfrog.co.uk/seo-spider/
- When you are ready, submit new XML Sitemaps to Google with the proper URLs. Make sure you've deleted/removed your old ones.
- You can also send the reconsideration request again, indicating that while you're aware this isn't a penalty, you have realized some technical/navigation issues on the site and believe you've now fixed these.
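On the sitemap point, if your CMS can hand you the list of known-good URLs, regenerating a clean file is simple. A minimal sketch using only the Python standard library (the URLs are placeholders):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return sitemap XML (as a string) for a list of canonical URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

# hypothetical URL list -- in practice, pull these from your CMS/database
xml = build_sitemap(["http://example.com/", "http://example.com/about"])
print(xml)
```

Write the result to sitemap.xml, delete the stale sitemaps, and resubmit in GWT so Google isn't re-crawling from an index of dead URLs.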
Hope this helps and wish you the best of luck!
Related Questions
-
NGINX 301 configuration - is it correct?
I'm totally not an expert in technical SEO, but I worry that my server admin isn't either. Below is his vhost configuration; can anyone check it? Is it correct and SEO friendly?
server {
    listen *:80;
    server_name domainaddress.pl domainaddress.com.pl;
    root /home/www/domainaddress.pl/web;
    index index.html index.htm key-words.php index.php index.cgi index.pl index.xhtml;
    location /key {
        rewrite ^/key-words/$ http://domainaddress.pl/ permanent;
        rewrite ^/key-words.php$ http://domainaddress.pl/ break;
    }
    location / {
        if ($http_host ~ "^www.domainaddress.pl"){
            rewrite ^(.*)$ http://domainaddress.pl/$1 permanent;
        }
        rewrite ^/key-words.php$ http://domainaddress.pl/ permanent;
    }
}
-
What about backlinks from sites that are now 404?
Hi all, what should we do about backlinks whose source sites are now 404? Open Site Explorer shows about 1,000 links, and when I check, many of them are 404: they were spammy links we had, but now those sites are gone. I am doing a link-profile check and cleaning up all spammy links. Should I take any action on them, since Open Site Explorer and Google still show these links in the searches? Should we list these URLs as disallowed in Google Webmaster Tools? Thanks
-
Getting a 404 error when opening the cache link of my site
My site is hazanstadservice.se, and when I try to open it to check the cache date I get a 404 error from Google. I don't know why. The cache page URL is http://webcache.googleusercontent.com/search?q=cache:j99uW96RuToJ:www.hazanstadservice.se/+&cd=1&hl=en&ct=clnk.
-
Can you 404 any form of URL?
Hi SEOmozzers,
http://ex.com/user/login?destination=comment%2Freply%2F256%23comment-form
http://ex.com/user/login?destination=comment%2Freply%2F258%23comment-form
http://ex.com/user/login?destination=comment%2Freply%2F242%23comment-form
http://ex.com/user/login?destination=comment%2Freply%2F257%23comment-form
http://ex.com/user/login?destination=comment%2Freply%2F260%23comment-form
http://ex.com/user/login?destination=comment%2Freply%2F225%23comment-form
http://ex.com/user/login?destination=comment%2Freply%2F251%23comment-form
http://ex.com/user/login?destination=comment%2Freply%2F176%23comment-form
These are duplicate content, and the canonical version is http://www.ex.com/user (the login page of the website). Since multiple other duplicates have mostly been resolved with 301s, I figured all the 'login' URLs above should be 404'd, since they don't carry any authority, and 301ing them wouldn't be the best solution, since too many 301s can slow down the website.
But a member of the dev team said: "Looks like all the URLs requested to 404 are actually the same page, http://ex.com/user/login. The only part of the URL that changes is the variables after the '?'. I don't think you can (or it's highly not recommended to) make 404 pages display for variables in a URL."
So my question is: I'm not sure what he means by that, and is it really better not to 404 these? Thanks
-
WordPress & Page Numbers
Hi, I am working on a large WP site for a client and have an issue with duplicate content and page numbers. I am using the Yoast SEO plugin but can't seem to resolve it. Let me give an example: if I go to a popular category, for example F1, there are over 10 pages of content for the category, and although the URL changes, the title and meta description stay the same. Now, if I were using a template for the title and description I could add the page-number variable, but as I am overwriting the template with SEO-specific category information I can't use variables, and hence the problem! This is such a common problem that I know somebody will have an answer! Thanks
-
Most Common Errors & Warnings
Hello there, I would like to ask for some basic tips regarding the common errors and warnings found on my site. The list:
Title Element Too Long
Duplicate Page Content
Duplicate Page Title
How can I fix these? Any help would be greatly appreciated. Regards
-
Fixing Missing MetaTag Errors
Hey all, I just had a crawl test done on my site (created using WordPress) and I received a ton of missing meta description errors to fix. The odd thing is that I use the "All in One" SEO tool, and the actual pages and posts on the site do have meta descriptions. However, I noticed that for every post an RSS feed is automatically generated, and it's this feed link that is missing the meta description. Most of the errors display "Comments on" with a /feed at the end of the URL. I am totally clueless about how to resolve these errors, as I haven't installed any WP plugins that generate feeds automatically. Has anyone encountered this problem before, or does anyone know how to fix it? The site URL is http:// GovernmentGrantsAustralia . org (I have left spaces above to avoid being a link dropper 🙂). Would really appreciate if anyone can help!
FYI: I just found this link after digging through all the Q&A history. I tried it, but am not sure if it has worked, as I still see the errors on my SEOmoz report. The link is: http://www.seomoz.org/qa/view/41413/wordpress-missing-meta-description-tag-comments
Hope someone can help me figure this one out! Thanks, Justin
-
Expired traffic and 301 value
Hi folks, here is our situation. We have an old brand domain, www.asia-hotels.com, that was redirecting to www.asiahotels.com. By mistake, we let that domain expire, and only noticed the drop a month later. We lost all our pages, and this lasted several weeks. I'm not sure of the exact date, but it was approximately around the 24th of December. What a merry Xmas! 😞
Since then we have repurchased the domain, put back all the pages as they were, and reinstated all the 301 redirects as they were. Since that date we haven't seen any uplift in our visits or visibility score.
Did we do something wrong with our 301 redirects? I know for sure we used the ISAPI rewrite mod for the non-www domain, although I am not entirely sure how the www version has been handled. Is there something we should do at the DNS level to flag that the site is back? Should we present a reconsideration request? Any help would be greatly welcomed. Thanks for your help. Cheers, Freddy
More info: I placed a bit more detail and the visits graph on my blog: http://www.inmarketingwetrust.com.au/seo-effect-of-domain-expiry-on-301-redirects/
I am not sure if this is because some information is cached, but when I looked at the site on Open Site Explorer, the data still shows the sites as non-redirected: http://www.opensiteexplorer.org/asia-hotels.com/www.asia-hotels.com/a!comparison