Are 17000+ Not Found (404) Pages OK?
-
Very soon, our website will undergo a rapid change that will result in us removing 95% or more of our old pages (right now, our site has around 18,000 pages indexed).
We're changing into something different (B2B from B2C), so our site design, content, etc. will change.
Even our blog section will have more than 90% of its content removed.
What would the ideal scenario be?
- Remove all pages and let those links be 404 pages
- Remove all pages and 301 redirect them to the home page
- Remove all unwanted pages and 301 redirect them to a separate page explaining the change (although it wouldn't be that relevant, since our audience has completely changed). I doubt this would be ideal, since at some point we'd need to remove this page as well and do yet another redirection.
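The three options above don't have to be all-or-nothing; they can be collapsed into a per-URL decision. Here's a minimal sketch in Python with entirely hypothetical thresholds and field names (tune them to your own link and traffic data):

```python
# Rough per-URL triage sketch. "inbound_links" means external links from
# non-scraper sites and "visits" means recent organic landing-page visits;
# both are placeholders for whatever your own exports actually contain.

def triage(url, inbound_links, visits, still_relevant):
    """Return ('keep', url), ('301', target), or ('404', None) for an old URL."""
    if still_relevant:
        return ("keep", url)        # content still fits the new B2B site
    if inbound_links >= 3 or visits >= 5:
        # Worth preserving some equity; ideally redirect to the closest
        # relevant new page, with the home page only as a last resort.
        return ("301", "/")
    return ("404", None)            # let it die with a helpful custom 404 page

print(triage("/old-widget", 0, 1, False))  # → ('404', None)
```

The thresholds (3 links, 5 visits) are arbitrary illustrations, not recommendations; the point is to make the cleanup repeatable rather than page-by-page guesswork.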
-
Mohit,
Tom's advice will help you determine which pages are worth redirecting and which should just go to a 404 page (which should be customized instead of the browser/host default, and should also return a 404 response code in the HTTP header!). My guess is that pages with links only from scraper sites aren't going to pass the tests laid out by Tom and thus would just go to a 404 page. However, any that have decent external links would fit the criteria and would be candidates for a 301 redirect.
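That "return a real 404 in the header" point trips a lot of sites up: a pretty error page served with a 200 status is a soft 404. A minimal sketch with Python's stdlib `http.server` (the HTML is a placeholder; a real site would only do this for unknown paths):

```python
# Sketch: a custom 404 page that also sends a genuine 404 status code,
# rather than a "soft 404" where the friendly page comes back as 200 OK.
from http.server import BaseHTTPRequestHandler, HTTPServer

CUSTOM_404_HTML = (b"<html><body><h1>Page not found</h1>"
                   b"<p>Try browsing our new pages instead.</p></body></html>")

class Custom404Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # For this sketch every path 404s; a real server would route known
        # URLs normally and fall through to this handler for unknown ones.
        self.send_response(404)                  # real 404 in the HTTP header
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(CUSTOM_404_HTML)        # friendly page in the body

    def log_message(self, *args):                # keep the console quiet
        pass
```

You can verify any page the same way from the outside: fetch it with curl and check that the status line says 404, not 200.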
-
Just to add a little to this great reply...
Here is how I would determine if it was worth my time to keep some of the old pages.
If the industry is the same but the end user is different, I would make EVERY attempt to keep those old pages. AuthorRank will matter in the future; if you can attribute that content to a particular rel=publisher, then I think it will be totally worth the time.
If, however, the information has nothing to do with the industry, then I wouldn't even consider taking the time to figure all of this out. I would have a kick-ass 404 page to help people find your new stuff, though.
Remember too that when you 301 redirect, you do in fact lose some "link juice" (I really hate that phrase). So if the incoming links are of little to no value, a 301 will pass on even less.
-
Hi Tom, thank you for your advice.
The thing is, we don't want to retain the users. They aren't going to serve our cause anymore. (We used to spend thousands of dollars every month on server costs just to keep up with the load. Now we are cutting that down, so unwanted users are not really something we want, as they would increase the load.)
I'll surely follow your advice on OSE. The thing is, we have a lot of links to those pages from scraper sites. I'm not sure they're worth keeping, though.
-
Hi there
17,000 is quite a lot. I would look at maybe redirecting some of the URLs and I would do this based on certain criteria.
First of all, it helps to have a complete list of your current URLs. Screaming Frog is a great tool for this and is free.
Once you have your URLs, go into your analytics data and see which pages are attracting users. Take a sample size of about 2-3 months. If you're using Google Analytics, click on Traffic Sources -> Sources -> All Traffic on the left-hand side.
When the dashboard loads, next to "Primary Dimension" click Other, and from the drop-down menu choose Traffic Sources, then Landing Page.
Any page with more than 5 or 10 visitors could be worth redirecting. If these are pages visitors frequently use to get to your site, redirecting them helps avoid interrupting their user journey; a 404 might put them off and send them elsewhere.
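If you'd rather script that cut-off than eyeball 18,000 rows, a few lines of Python over the exported landing-page report will do it. The column names and the 5-visit threshold here are assumptions; match them to whatever your analytics export actually contains:

```python
import csv
import io

# Hypothetical landing-page export; a real Google Analytics CSV will have
# different column names, so adjust the keys below to match.
EXPORT = """Landing Page,Visits
/old-guide,42
/b2c-pricing,3
/popular-post,11
"""

THRESHOLD = 5  # pages at or above this are 301-redirect candidates

rows = csv.DictReader(io.StringIO(EXPORT))
candidates = [r["Landing Page"] for r in rows if int(r["Visits"]) >= THRESHOLD]
print(candidates)  # → ['/old-guide', '/popular-post']
```

With a real export you'd read the file with `open()` instead of the inline string, but the filter is the same.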
Next, I'd look at what pages you might want to save to keep your SEO "strength". Put your URL into Open Site Explorer and, once it's done, click on "Top Pages". We're interested in the "Inbound Links" column here. Export the file to CSV, then sort the URL list in Excel by the Inbound Links total. You can filter out the pages with fewer links here; for instance, you could remove the pages with 3 inbound links or fewer. It's a general way of doing things and isn't foolproof, but you'll be left with a list of pages that could be passing decent PageRank/link equity. Manually check those pages and their backlinks, and if you think they're worth keeping, make sure you put in a 301 redirect.
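The same Excel filtering step can be scripted if the list is large. A minimal sketch, assuming a "URL" and an "Inbound Links" column (the real Top Pages export may name them differently):

```python
import csv
import io

# Hypothetical Top Pages export from Open Site Explorer; check your actual
# CSV header and adjust the column names if they differ.
EXPORT = """URL,Inbound Links
/linked-resource,27
/thin-page,1
/old-tool,4
"""

MIN_LINKS = 4  # drop pages with 3 inbound links or fewer, per the advice above

keep = [r["URL"] for r in csv.DictReader(io.StringIO(EXPORT))
        if int(r["Inbound Links"]) >= MIN_LINKS]
print(keep)  # → ['/linked-resource', '/old-tool']
```

The surviving URLs in `keep` are the ones worth a manual backlink check before you write the 301 rules.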
Anything that doesn't match either of these criteria I would leave as a 404. You may be left with a lot, but Google knows that 404s are a normal part of the web and won't penalise you for them. Check out this webmasters blog link.
Hope this helps with your decision making!