How to Destroy Old 404 Pages
-
Hello Mozzers,
So I just purchased a new domain and, to my surprise, it has a domain authority of 13 right out of the box (what luck!). I needed to investigate. To make a long story short, the domain used to be home to a music blog with hundreds of pages, all of which are missing now. I have about 400 pages on my hands that return a 404. What is the best method for eliminating these pages?
Does deleting the Crawl Errors in Google Webmaster Tools do anything?
Thanks
-
What a thorough response! I'm in the Option B scenario. The old content has nothing to do with my site so I don't need to redirect the old URLs. I will just wait out Google crawling those 404s.
Thanks!
-
You have a few options here. Option A applies if you are going to build a site with similar topic-based content to the old one and you want to carry over as much of that old domain authority as possible.
-
Export those 404 errors from GWT into a spreadsheet. This gives you a corpus of URLs to work with.
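Once you have that export, it helps to confirm what each URL actually returns today, since some "errors" may have since been fixed or redirected. A minimal sketch using only the Python standard library (the `crawl_errors.csv` filename is just an assumed name for the GWT export):

```python
from collections import Counter
from urllib.error import HTTPError
from urllib.request import Request, urlopen

def check_status(url, timeout=10):
    """Return the HTTP status code a URL currently answers with."""
    req = Request(url, method="HEAD", headers={"User-Agent": "404-audit"})
    try:
        return urlopen(req, timeout=timeout).status
    except HTTPError as err:
        # 4xx/5xx responses raise, but the status code is what we want
        return err.code

def summarize(status_by_url):
    """Count how many URLs fall under each status code."""
    return Counter(status_by_url.values())

# Usage against the exported list (assumed one-column CSV):
#   import csv
#   with open("crawl_errors.csv") as f:
#       urls = [row[0] for row in csv.reader(f) if row]
#   print(summarize({u: check_status(u) for u in urls}))
```

Note that `urlopen` follows redirects, so the summary reflects the final status of each URL, which is what matters for deciding which pages still 404.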
-
Go into Bing Webmaster Tools, which lets you browse what Bing has (and had) indexed. What is nice here is that Bing will tell you which URLs (even old 404s) have links pointing to them.
-
Run your links through Open Site Explorer. In addition to OSE's own metrics, you can pull linking data plus Facebook and Twitter data for the old URLs.
-
If need be, run the more important dead URLs through the Wayback Machine (http://archive.org/web/web.php); you can even see what the actual content on the old URLs used to be.
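For a large batch of dead URLs, the Wayback Machine's availability API can be queried programmatically instead of checking each URL by hand. A small sketch, assuming the public `archive.org/wayback/available` endpoint (the example URL is hypothetical):

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API = "https://archive.org/wayback/available"

def availability_query(url, timestamp=None):
    """Build an availability-API query for a dead URL."""
    params = {"url": url}
    if timestamp:
        # e.g. "2012" asks for the snapshot closest to that year
        params["timestamp"] = timestamp
    return API + "?" + urlencode(params)

def closest_snapshot(response):
    """Extract the closest archived snapshot URL from the API's JSON reply."""
    snap = response.get("archived_snapshots", {}).get("closest")
    return snap["url"] if snap and snap.get("available") else None

def lookup(url):
    """Fetch the closest archived copy of a URL, or None if none exists."""
    with urlopen(availability_query(url), timeout=10) as r:
        return closest_snapshot(json.load(r))
```

Running `lookup()` over the exported 404 list tells you which old pages have an archived copy worth reviewing.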
-
After doing all of this, you should be able to see fairly quickly whether the site had any authority pages that have now expired, and the Wayback Machine tells you what those pages were about.
-
For the authority pages, create new pages on the new site that cover the same topic, i.e. are semantically related to the old pages.
-
301 the old authority pages to the new authority pages.
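With a mapping of old authority pages to their new counterparts, the redirect rules can be generated rather than typed by hand. A sketch that emits Apache `Redirect 301` lines for an .htaccess file (the paths and domain are hypothetical; nginx and other servers use different directives):

```python
def htaccess_redirects(mapping):
    """Turn an {old_path: new_url} mapping into Apache Redirect 301 lines.

    Assumes the old URLs are paths on the same host being redirected.
    """
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())
    )

# Example: each old authority page points at its topically-matching new page
rules = htaccess_redirects({
    "/old-music-review": "https://example.com/new-related-topic",
})
```

Keeping the mapping in one place also makes it easy to audit later that every redirect still lands on a live page.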
-
The rest of the URLs you can just let 404. They will continue to 404 several times until Google drops them. I would leave them in GWT; over time they should drop out as Google starts to ignore those pages, though this may take a few months. You can then just check GWT for any new 404s that show up from the new site and deal with those.
One thing to note on all of this: you may have to let the old sitemap 404 rather than redirecting it.
http://moz.com/blog/how-to-fix-crawl-errors-in-google-webmaster-tools
"One frustrating thing that Google does is it will continually crawl old sitemaps that you have since deleted to check that the sitemap and URLs are in fact dead. If you have an old sitemap that you have removed from Webmaster Tools, and you don’t want being crawled, make sure you let that sitemap 404 and that you are not redirecting the sitemap to your current sitemap."
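To confirm the retired sitemap behaves as described, you can fetch it without following redirects: a 404 is what you want, while a 3xx means it is still being redirected. A minimal sketch using only the standard library (query strings are ignored for simplicity):

```python
from http.client import HTTPConnection, HTTPSConnection
from urllib.parse import urlsplit

def raw_status(url):
    """Fetch a URL's status code without following redirects."""
    parts = urlsplit(url)
    conn_cls = HTTPSConnection if parts.scheme == "https" else HTTPConnection
    conn = conn_cls(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    status = conn.getresponse().status
    conn.close()
    return status

def sitemap_retired_ok(status):
    """True when the old sitemap is doing what Google expects (a 404);
    False when it is being redirected (3xx) or still served (200)."""
    return status == 404
```

For example, `sitemap_retired_ok(raw_status("https://example.com/old-sitemap.xml"))` should come back True once the old sitemap is genuinely gone.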
If you delete the 404s from GWT, they will just show up again the next time Google spiders the old pages, so that choice is up to you.
Option B: if you don't care about the old pages, just let them 404 as mentioned above, but be aware of the issue with old sitemaps. You can check the Google index for the old URLs in the SERPs, or look at the Search Traffic data in GWT. Make sure the old URLs are not showing up under your Search Queries.