404 Errors
-
Do 404 errors really have a lot of impact on rankings and the overall authority of the site with Google? Say you have a site where all the pages have moved apart from the home page, which is exactly the same as before the move, so most of your pages are showing 404 errors.
-
Hi Adul,
Just to follow up on this in case you're wondering why the answer is being downvoted. Blocking the pages that 404 in robots.txt will only stop Google seeing the 404, because it can no longer reach the pages. Users will still get a 404, so this isn't ideal. Also, if you don't 301 redirect the old pages to the new ones, you lose any equity those pages built up over the years.
Hope that helps,
Craig
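To make Craig's point concrete, here's a minimal sketch of the server-side logic: a redirect map that answers old URLs with a 301 to their new homes instead of letting them fall through to a 404. The paths and map here are hypothetical, for illustration only.

```python
# Hypothetical old-URL -> new-URL map; a real one would come from
# the old site's sitemap or a crawl export.
REDIRECT_MAP = {
    "/old-about.html": "/about/",
    "/old-services.html": "/services/",
}

def resolve(path):
    """Return the (status, location) the server should send for a path."""
    if path in REDIRECT_MAP:
        # 301 tells Google the move is permanent, so the old page's
        # equity passes to the new URL instead of being lost on a 404.
        return 301, REDIRECT_MAP[path]
    # No mapping: the page really is gone.
    return 404, None
```

Users and bots that hit a mapped old URL get sent straight to the new page; everything else still 404s honestly.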
-
Go to Google Webmaster Tools > Crawl > Crawl Errors and download all the error pages. Open the list in Excel, then paste the URLs into your robots.txt like this:
User-agent: *
Disallow: /page1.html
Disallow: /page2.html
-
If no one can access your site except for the home page, that is pretty bad.
As to rankings, look at it from a broad perspective. A user clicks a link in search results. That link goes to a 404. They immediately go back and find someone else's site or link to click on. Another user clicks another link for the same broken site, gets a 404 error, and does the same thing. Googlebot comes along and sees that the site in question has very low on-page time, that users frequently leave and go somewhere else, and that a large quantity of the pages don't work.
If you were Google, would you give that site much weight or credit? Or would you hand it to a site that works? I don't think they openly state that 404 errors can hurt you or your ranking. IMO they do; it's just not as transparent as the other things they tell you to do to improve your ranking.
-
OP, your case is an extreme one in that every page on the site but the homepage 404s. That means you moved but didn't do any 301 redirects, so that's an issue.
But generally, 404s have no impact on your site's ranking, and that's been stated on the record multiple times.
-
Hi, 404 errors are pretty bad from a user experience standpoint, and so Google does not like them. During domain migrations, the most important thing is to keep the number of 404 errors as low as possible, if not zero.
When pages are moved, you should set up one-to-one (page-to-page) server-side 301 permanent redirects from the old pages to their corresponding new locations, so that the old pages do not end up as 404 errors. With the 301s in place, Google will know that the old pages are no longer in force and that they have been replaced by the new destinations. This will also get the old URLs replaced by the new ones in search engine indices.
So to conclude, 404 errors are bad from both the users' and the search engines' point of view.
Hope it helps, my friend.
Best regards,
Devanur Rafi
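The one-to-one mapping Devanur describes can be generated rather than hand-typed. As a sketch, this turns an old-to-new page map into Apache `Redirect 301` rules (mod_alias syntax); the paths and domain are made up for illustration.

```python
def redirect_rules(mapping):
    """Build one Apache mod_alias 'Redirect 301' rule per moved page."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())
    )

# Hypothetical old-to-new mapping for the migrated site.
rules = redirect_rules({
    "/page1.html": "https://www.example.com/new-page-1/",
    "/page2.html": "https://www.example.com/new-page-2/",
})
print(rules)
```

Paste the output into the old site's Apache config (or .htaccess) so every old page 301s to its exact replacement rather than the homepage.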
Related Questions
-
Href Lang Errors
Hi, I'm getting hreflang errors in my site audit for having no self-referential tags. I can't see anything wrong? Can anyone enlighten me 🙂 From this page: http://www.key.co.uk/en/key/0-4-tonne-wll-6mm-single-leg-wire-rope-sling-580a078n
Intermediate & Advanced SEO | BeckyKey
-
Soft 404 error for a big, longstanding 301-redirected page
Hi everyone, Years ago, we acquired a website that had essentially 2 prominent homepages - one was like example.com and the other like example.com/htm... They served the same purpose basically, and were both very powerful, like PR7 and often had double listings for important search phrases in Google. Both pages had amassed considerable powerful links to them. About 4 years ago, we decided to 301 redirect the example.com/htm page to our homepage to clean up the user experience on our site and also, we hoped, to make one even stronger page in serps, rather than two less strong pages. Suddenly, in the past couple weeks, this example.com/htm 301-ed page started appearing in our Google Search Console as a soft 404 error. We've never had a soft 404 error before now. I tried marking this as resolved, to see if the error would return or if it was just some kind of temporary blip. The error did return. So my questions are:
Intermediate & Advanced SEO | Eric_R
1. Why would this be happening after all this time?
2. Is this soft 404 error a signal from Google that we are no longer getting any benefit from link juice funneled to our existing homepage through the example.com/htm 301 redirect? The example.com/htm page still has considerable (albeit old) links pointing to it across the web. We're trying to make sense of this soft 404 observation and any insight would be greatly appreciated. Thanks!
Eric
-
How to properly 404 pages from a subdomain
So I am working on a site that had a subdomain that attracted a lot of spammy links. I researched the backlinks to this subdomain, and there were no beneficial links at all. I am thinking the best thing is to 404 this subdomain. What is the best way to do this? Should I just edit the DNS settings so that this subdomain does not point to the root domain? Or is there something that should be done in Webmaster Tools? Thanks in advance!
Intermediate & Advanced SEO | evan89
-
Best way to fix 404 crawl errors caused by Private blog posts in WordPress?
Going over the Moz crawl error report and WMT's crawl errors for a new client site, I found 44 high-priority crawl errors = 404 Not Found. I found that those 44 blog pages were set to Private mode (WordPress theme), causing the 404 issue.
Intermediate & Advanced SEO | SEOEND
I was reviewing the blog content for those 44 pages to see why those 2010 blog posts were set to private mode. Well, I noticed that all 44 blog posts were pretty much copied from other external blog posts. So I'm thinking the previous agency placed those pages under private mode to avoid getting hit for duplicate content issues. All blog posts after 2011 looked like unique, non-scraped content. So my question to all is: what is the best way to fix the issue caused by these 44 pages?
A. Remove those 44 blog posts that used verbatim scraped content from other external blogs.
B. Update the content on each of those 44 blog posts, then set them to Public mode instead of Private.
C. ? (open to recommendations)
I didn't find any external links pointing to any of those 44 blog pages, so I was considering removing those blog posts. However, I'm not sure if that will affect the site in any way. Open to recommendations before making a decision...
Thanks
-
404 ? 301 ? What is your opinion ?
Hi, I have a classifieds website and I am wondering about the life of a page with an ad. An announcement has a limited life, so: should it be a 404 page? A 301 redirect to the section? Or leave the content without redirection? What is your opinion? Sorry for my English, I'm French 😉 Thanks. A.
Intermediate & Advanced SEO | android_lyon
-
Wordpress error
On our Google Webmaster Tools I'm getting a Severe Health Warning regarding our robots.txt file, which reads:
User-agent: *
Crawl-delay: 20
User-agent: 008
Disallow: /
I'm wondering how I can fix this and stop it happening again. The site was hacked about 4 months ago but I thought we'd managed to clear things up. Colin
Intermediate & Advanced SEO | NileCruises
-
403, 301, 302, 404 errors & possible google penalty
William Rock ran a Xenu site scan on nlpca(dot)com and mentioned the following: "...ran a test with Xenu site scan and it found a lot of broken links with 403, 301, 302, 404 errors." Other items found: broken page-local links (also named 'anchors' or 'fragment identifiers'): http://www.nlpca.com/DCweb/Interesting_NLP_Sites.html#null (anchor occurs multiple times), http://www.nlpca.com/DCweb/Interesting_NLP_Sites.html#US (not found). Could someone give us an output of that list, and which of these errors do we need to clean up for SEO purposes? Thank you.
Intermediate & Advanced SEO | BobGW
-
Managing 404 errors
What is the best way to manage 404 errors for pages that are no longer on the server? For example, a client deletes the old site from the server and replaces it with a new site. Webmaster Tools is reporting 100+ 404 errors from the old site. I've blocked the 404 pages with robots.txt, requested removal in Google Webmaster Tools, and created a custom 404 page - http://www.tvsphoto.com/missingurlexample. Is there anything else I can do?
Intermediate & Advanced SEO | SEOProPhoto
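As the answers above note, 301-redirecting old URLs to their closest equivalents on the new site preserves users and equity better than blocking them in robots.txt. If you do build a redirect map after a replatform, it helps to flatten chains so each old URL hops straight to its final destination instead of passing through several 301s. A sketch, with hypothetical paths:

```python
def flatten(redirects):
    """Collapse chains in an old -> new URL map so every source URL
    points directly at its final destination (no multi-hop 301s)."""
    flat = {}
    for src in redirects:
        seen, dst = {src}, redirects[src]
        while dst in redirects:  # follow the chain to its end
            if dst in seen:
                raise ValueError(f"redirect loop at {dst}")
            seen.add(dst)
            dst = redirects[dst]
        flat[src] = dst
    return flat
```

For example, if /old-gallery once redirected to /gallery, which later moved to /photos, flattening makes /old-gallery point straight at /photos.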