404 Errors
-
Do 404 errors really have a lot of impact on rankings and the overall authority of a site with Google? Say you have a site where all the pages have moved apart from the home page, which is exactly the same as before the move, so most of your pages are showing 404 errors.
-
Hi Adul,
Just to follow up on this in case you're wondering why the answer is being downvoted: blocking the pages that 404 in robots.txt will only stop Google from getting a 404, because it can no longer reach those pages. Users will still get a 404, so this isn't ideal. Also, if you don't 301 redirect the old pages to their new locations, you lose any equity those pages built up over the years.
Hope that helps,
Craig
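If the site runs on Apache, the 301 redirects Craig describes can be set up in .htaccess along these lines. This is only a sketch: the domain and paths below are hypothetical placeholders, not the OP's real URLs.

```apache
# .htaccess on the old site: map each old URL to its new home with a 301.
# Domain and paths are placeholder examples.
Redirect 301 /old-page.html https://www.example.com/new-page.html
Redirect 301 /old-section/article.html https://www.example.com/new-section/article.html
```

Each old URL gets its own line pointing at the page that replaced it, so both users and Googlebot land on the new location instead of a 404.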
-
Go to Google Webmaster Tools > Crawl > Crawl Errors and download the full list of affected pages. Open the list in Excel, then block each page in your robots.txt file, for example:
User-agent: *
Disallow: /page1.html
Disallow: /page2.html
-
If no one can access your site except for the home page, that is pretty bad.
As to rankings, look at it from a broad perspective. A user clicks a link in search results. That link goes to a 404. They immediately go back and find someone else's site or link to click on. Another user clicks another link for the same broken site, gets a 404 error, and does the same thing. Googlebot comes along and sees that the site in question has a very low on-page time, that users frequently leave and go somewhere else, and that a large quantity of the pages don't work.
If you were Google, would you give that site much weight or credit? Or would you hand it to a site that works? I don't think they openly express that it can hurt you, or that they will hurt your ranking for having 404 errors. IMO they do, it's just not as transparent as the rest of the things they state to do to improve your ranking.
-
OP, your case is an extreme one in that every page on the site but the homepage 404s. That means you moved but didn't do any 301 redirects, so that's an issue.
But generally, 404s have no impact on your site's ranking and that's been stated on record multiple times.
-
Hi, 404 errors are pretty bad from a user-experience standpoint, and so Google does not like them. During domain migrations, the most important thing is to keep the number of 404 errors as low as possible, ideally zero.
When pages are moved, you should set up one-to-one (page-to-page) server-side 301 permanent redirects from the old pages to their corresponding new locations, so that the old pages do not end up as 404 errors. With the 301s in place, Google will know that the old pages are no longer in force and have been replaced by their new destinations. This will also replace the old URLs with the new ones in search engine indices.
So to conclude, 404 errors are bad from both the user's and the search engine's point of view.
Hope it helps my friend.
Best regards,
Devanur Rafi
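For a large migration, writing those one-to-one redirects by hand is error-prone. A small script can generate the directives from a list of old and new URLs. This is just a sketch under the assumption of an Apache target; the function name and the example URL pairs are made up for illustration:

```python
def redirect_rules(mapping_rows):
    """Turn (old_path, new_url) pairs into Apache Redirect directives."""
    return ["Redirect 301 {} {}".format(old, new) for old, new in mapping_rows]

# Hypothetical example pairs; in practice they would come from a CSV
# export of old and new URLs (e.g. read with the csv module).
rows = [
    ("/old-page.html", "https://www.example.com/new-page.html"),
    ("/about-us.html", "https://www.example.com/about"),
]
for rule in redirect_rules(rows):
    print(rule)
```

The printed lines can be pasted straight into .htaccess, which makes it practical to keep the redirect map in a spreadsheet and regenerate it whenever the mapping changes.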
Related Questions
-
404 or 410 status code after deleting a real estate listing
Hi there, We manage a website which generates an overview and detail pages of listings for several real estate agents. When these listings have been sold, they are removed from the overview and the detail pages. These listings then appear as Not Found in the crawl error overview in Google Search Console. The pages currently return 404s; would changing this to 410s solve the problem? And if not, what fix could take care of it?
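On the 404-versus-410 point: a 410 ("Gone") tells Google the removal was deliberate, and such URLs tend to be dropped from the index a little faster than plain 404s. The status-picking logic is simple to sketch; the listing paths below are made-up examples, and in a real system the lookup would hit the listings database rather than in-memory sets:

```python
from http import HTTPStatus

def listing_status(path, active_paths, sold_paths):
    """Choose the HTTP status a listing URL should return."""
    if path in sold_paths:
        return HTTPStatus.GONE       # 410: existed, deliberately removed
    if path in active_paths:
        return HTTPStatus.OK         # 200: listing still live
    return HTTPStatus.NOT_FOUND      # 404: never existed / unknown URL

# Hypothetical paths for illustration:
active = {"/listings/canal-house-42"}
sold = {"/listings/villa-7"}
result = listing_status("/listings/villa-7", active, sold)  # HTTPStatus.GONE (410)
```

Either status clears the URL from the index eventually; 410 just states the intent explicitly.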
Intermediate & Advanced SEO | MartijntenCaat
-
LinkedIn 999 HTTP Errors
I am working on a website, https://linkedinforbusiness.net, and a ton of 999 HTTP errors have only now surfaced. Judging from the "Request denied" error in the log, I would venture that LinkedIn means to block the broken-link checker's attempts to check those links. It might be because the site has a lot of LinkedIn links; maybe they find it suspicious that the server is sending a lot of requests for their links. How do you see this? Any fixes? What level of harm do you think it brings to the site? I have removed all similar links to LinkedIn from my own site to avoid this (https://www.hillwebcreations.com). However, this isn't so easily done for LinkedIn For Business, as her work is all about helping businesses and individuals optimize their use of LinkedIn.
Intermediate & Advanced SEO | jessential
-
How to find all 404 deadlinks - webmaster only allows 1000 to be downloaded...
Hi Guys, I have a question... I am currently working on a website that was hit by a spam attack. The website was hacked and thousands of adult censored pages were created on the WordPress site. The hosting company cleared all of the dubious files, but this has left thousands of dead 404 pages. We want to fix the dead pages, but Google Webmaster Tools only shows and allows you to download 1000, and there are a lot more than that. Does anyone know of any good tools that allow you to identify all 404 pages? Thanks, Duncan
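Desktop crawlers such as Screaming Frog or Xenu's Link Sleuth are the usual answer here, since they crawl the site directly rather than relying on Google's capped export. The core of such a crawl is simple enough to sketch. In this sketch `fetch` is a hypothetical stand-in for the real HTTP request (it returns the status code and the links found on the page), injected so the traversal logic stays clear:

```python
from urllib.parse import urljoin

def find_404s(start_url, fetch):
    """Breadth-first crawl from start_url.
    fetch(url) -> (status_code, [hrefs found on that page]).
    Returns every URL that answered 404, in discovery order."""
    seen, queue, dead = set(), [start_url], []
    while queue:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        status, links = fetch(url)
        if status == 404:
            dead.append(url)
            continue
        queue.extend(urljoin(url, href) for href in links)
    return dead
```

In practice `fetch` would wrap an HTTP client and an HTML parser; the point is that a crawl of your own site has no 1000-row ceiling.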
Intermediate & Advanced SEO | CayenneRed89
-
Sitemap error
Hey Guys, Every time I run the tester through Google Webmaster Tools, I keep getting an error that tells me "Your Sitemap appears to be an HTML page. Please use a supported sitemap format instead." Any idea how to go about fixing this without changing the site around? https://www.zenory.co.nz/sitemap I have seen competitors' sitemaps that look similar to mine. Cheers
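For reference, the error means Google expects the submitted sitemap URL to serve raw XML in the sitemaps.org format, not an HTML page of links (an HTML sitemap page for visitors is fine, but it can't be the one submitted to Webmaster Tools). A minimal valid file looks like this, with the URL taken from the question:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.zenory.co.nz/</loc>
  </url>
</urlset>
```

One `<url>`/`<loc>` entry per page, served with an XML content type, is all the protocol requires.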
Intermediate & Advanced SEO | edward-may
-
URL Capitalization Inconsistencies Registering Duplicate Content Crawl Errors
Hello, I have a very large website that has a good amount of "Duplicate Content" issues according to Moz. In reality, though, it is not a problem with duplicate content, but rather a problem with URLs. For example: http://acme.com/product/features and http://acme.com/Product/Features both land on the same page, but Moz sees them as separate pages and therefore assumes they are duplicates. We have recently implemented a solution to automatically de-capitalize all characters in the URL, so when you type acme.com/Products, the URL automatically changes to acme.com/products, but Moz continues to flag multiple "Duplicate Content" issues. I noticed that many of the links on the website still have uppercase letters in the URL, even though the URL changes to all lowercase when clicked. Could this be causing the issue? What is the best way to remove the "Duplicate Content" issues that are not actually duplicate content?
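The remaining uppercase internal links are the likely cause: a crawler follows the literal href, records /Product/Features as its own URL, and flags it, regardless of what happens in the browser afterwards. Fixing the hrefs themselves, and making sure the lowercasing is a server-side 301 rather than a client-side rewrite, should clear the flags. The redirect decision itself is trivial to sketch; the paths here are the hypothetical ones from the question:

```python
def lowercase_redirect(path):
    """Return (301, lowercased_path) when a redirect is needed, else None."""
    lowered = path.lower()
    return (301, lowered) if lowered != path else None

print(lowercase_redirect("/Product/Features"))  # (301, '/product/features')
print(lowercase_redirect("/product/features"))  # None
```

With a 301 in place, crawlers that do follow the redirect consolidate both spellings onto the lowercase URL instead of treating them as duplicates.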
Intermediate & Advanced SEO | Scratch_MM
-
Thousands of 404 Pages Indexed - Recommendations?
Background: I have a newly acquired client who has had a lot of issues over the past few months. He had a major issue with broken dynamic URLs that would start infinite loops due to redirects and relative links. His previous SEO didn't pay attention to the sitemaps created by a backend generator, and it caused hundreds of thousands of useless pages to be indexed. These useless pages all brought up a 404 page that didn't send a 404 server response (it sent a 200), which created a ton of duplicate content and bad links (relative linking). Now here I am, cleaning up this mess. I've fixed the 404 page so it returns a 404 server response. Google Webmaster Tools is now returning thousands of "not found" errors, a great start. I fixed all site errors that caused infinite redirects, cleaned up the sitemap, and submitted it. When I search site:www.(domainname).com I still get an insane number of pages that no longer exist. My question: how does Google handle all of these 404s? My client wants all the bad pages removed now, but I don't have that much control over it; getting Google to remove pages that return a 404 is a slow process, and he is still continuously dropping in rankings. Is there a way of speeding up the process? It's not reasonable to enter tens of thousands of pages into the URL Removal Tool. I want to clean house and have Google index only the pages in the sitemap.
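One check worth automating once a fix like this ships: request a URL that cannot exist and confirm the server now answers with a real 404 status, since the 200-with-error-page pattern described above is exactly a "soft 404". A sketch, with the HTTP call injected as a hypothetical `get_status` function so the logic is self-contained:

```python
import uuid

def is_soft_404(base_url, get_status):
    """get_status(url) -> int HTTP status code.
    Probe a random URL that cannot exist; a 200 answer means the
    site is still serving soft 404s."""
    probe = "{}/{}".format(base_url.rstrip("/"), uuid.uuid4().hex)
    return get_status(probe) == 200
```

In practice `get_status` would wrap an HTTP HEAD request; running this probe after every deploy catches regressions before Google re-indexes thousands of junk URLs.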
Intermediate & Advanced SEO | BeTheBoss
-
Magento Hidden Products & Google Not Found Errors
We recently moved our website over to the Magento eCommerce platform. Magento has functionality to make certain items not visible individually, so you can, for example, take 6 products and turn them into 1 product where a customer can choose their options. You then hide all the individual products, leaving only that one product visible on the site and reducing duplicate content issues. We did this. It works great, and the individual products don't show up in our site map, which is what we'd like. However, Google Webmaster Tools has all of these individual product URLs in its Not Found crawl errors. For example: White t-shirt URL: /white-t-shirt Red t-shirt URL: /red-t-shirt Blue t-shirt URL: /blue-t-shirt All of those are not visible on the site and the URLs do not appear in our site map, but they are all showing up in Google Webmaster Tools. Configurable t-shirt URL: /t-shirt This product is the only one visible on the site, does appear in the site map, and shows up in Google Webmaster Tools as a valid URL. Do you know how Google found the individual products if they aren't in the site map and aren't visible on the website? And how important do you think it is that we fix all of these hundreds of Not Found errors to point to the single visible product on the site? I would think it is fairly important, but don't want to spend a week of man power on it if the returns would be minimal. Thanks so much for any input!
Intermediate & Advanced SEO | Marketing.SCG
-
404 Redirecting to the home page
One of my clients, who manages their own server and website, recently moved servers, which broke their custom 404 page. Instead of fixing this or putting the site back on the old server, they redirected the 404s to the home page. I've been working for a month or two on getting their 404s appropriately handled and their old URLs redirected with 301s. I read the HTTP Status Codes best practices, but it only discusses usability. What technical SEO backlash can happen?
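The main backlash is the soft-404 pattern: redirecting every missing URL to the home page hands Google thousands of URLs that all resolve to the same content, and hides which URLs are genuinely gone, so neither users nor crawlers get an honest signal. If the server is Apache, the fix is to serve the custom error page locally so the 404 status is preserved; this is a sketch, and the error-page path is a hypothetical example:

```apache
# A local path keeps the 404 status code on the response.
# A full URL (https://...) here would make Apache redirect instead,
# so the client would see a redirect and then a 200 -- avoid that.
ErrorDocument 404 /errors/not-found.html
```

The custom page then improves usability while the status code stays correct for search engines.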
Intermediate & Advanced SEO | triveraseo