60,000 404 errors
-
Do 404 errors on a large scale really matter? I'm aware that I now have over 60,000 and was wondering whether the community thinks I should address them by putting 301 redirects in place.
Thanks
-
Hi there
Check your sitemap and update your internal links. This usually takes care of a major portion of the problem.
From there, check your backlinks and make sure you update those that are relevant to your site and remove those that are not.
I would take a look at the 404 list - see which pages you could easily redirect to relevant pages the user would still find valuable, and serve a custom 404 for the others. You could also return a 410 for some pages, but I would use that sparingly, because it's rarely necessary.
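That triage (301 where a close equivalent exists, 410 where content is gone for good, custom 404 otherwise) can live in a simple lookup. A minimal sketch - all the paths here are hypothetical examples, not from the question:

```python
# Redirect decision table: 301 where a close equivalent exists,
# 410 where the content was removed on purpose, and a custom 404
# for everything else. All paths below are hypothetical examples.

REDIRECT_MAP = {
    "/old-category/blue-widgets": "/widgets/blue",  # moved content -> 301
    "/old-category/red-widgets": "/widgets/red",
}

GONE = {
    "/discontinued-line",  # removed deliberately, never coming back -> 410
}

def handle_missing_url(path):
    """Return an (http_status, redirect_target) pair for a dead URL."""
    if path in REDIRECT_MAP:
        return 301, REDIRECT_MAP[path]
    if path in GONE:
        return 410, None
    return 404, None  # fall through to the custom 404 page
```

The point is just to make the 301/410/404 decision explicit and reviewable rather than scattering it across server config.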
Lastly, here's a great resource from Matt Cutts on SEW that helps SEOs handle eCommerce 404s.
Hope this helps! Good luck!
-
I had similar issues on large e-commerce websites, where these pages were not in the navigation but were still in Google's index, so Webmaster Tools reported tens of thousands of 404s.
Keeping this many 301 redirects would have put a large load on the server, so we made sure the navigation and site search no longer linked to these pages, and Google later removed them from its index.
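When the dead URLs share a common structure, one pattern-based rule can also replace thousands of one-off 301 entries, which keeps the redirect table small. A rough sketch, assuming (hypothetically) the old URLs followed a `/product/<id>/...` pattern:

```python
import re

# One pattern rule instead of thousands of individual 301 entries.
# The /product/<id> URL structure here is a hypothetical example.
PATTERN_REDIRECTS = [
    (re.compile(r"^/product/(\d+)(?:/.*)?$"), r"/catalog/item-\1"),
]

def match_redirect(path):
    """Return the 301 target for a path, or None if no rule matches."""
    for pattern, target in PATTERN_REDIRECTS:
        if pattern.match(path):
            return pattern.sub(target, path)
    return None
```

Most web servers support the same idea natively (e.g. regex-based rewrite rules), so the mapping never has to be enumerated URL by URL.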
On the other hand, Peter is right: it depends on the ratio of live to broken pages. If you have 100k pages and 60k of them are 404s, Google will most likely deprioritize crawling your site and indexing new pages. You need a very dodgy website to trigger a Panda penalty, though. 404s happen all the time, and Google is quite patient in giving site admins time to fix them.
-
Yes - large-scale 404s can trigger Panda:
http://themoralconcept.net/pandalist.html
But don't 301 them. Just fix the links on the source pages that lead to the 404s - replace them with links to pages similar to the missing ones.
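Finding which source pages still link to dead URLs is easy to script. A rough sketch using only the standard library, where the dead-URL set is assumed to come from your 404 report (hypothetical paths):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href values from anchor tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def broken_links(html, dead_urls):
    """Return the links in `html` that appear in the 404 report set."""
    parser = LinkCollector()
    parser.feed(html)
    return [link for link in parser.links if link in dead_urls]
```

Run something like this over each crawled page and you get a per-page list of links to fix instead of a bare 404 count.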