2.3 million 404s in GWT - learn to live with 'em?
-
So I’m working on optimizing a directory site. Total size: 12.5 million pages in the XML sitemap. This is orders of magnitude larger than any site I’ve ever worked on – heck, every other site I’ve ever worked on combined would be a rounding error compared to this.
Before I was hired, the company brought in an outside consultant to iron out some of the technical issues on the site. To his credit, he was worth the money: indexation and organic Google traffic have steadily increased over the last six months. However, some issues remain. The company has access to a quality (i.e. paid) source of data for its directory listing pages, but the last time the data was refreshed, some months back, it threw 1.8 million 404s in GWT. That number has climbed steadily since; we’re now at 2.3 million 404s in GWT.
From what I’ve been able to determine, links on this site that are tied to the data feed generally break for one of two reasons: either the page no longer exists (i.e. it wasn’t found in the data refresh, so it was simply deleted), or the URL had to change due to some technical issue (the page still exists, just now under a different link). On other sites I’ve worked on, 404s aren’t that big a deal: set up a 301 redirect in htaccess and problem solved (see the sketch below). In this instance, setting up that many 301 redirects, even if it could somehow be automated, just isn’t an option due to the bloat it would add to the htaccess file.
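For context, on a smaller site that htaccess fix is a one-liner per moved URL via mod_alias, or a single pattern rule when a whole class of URLs shifts in a predictable way. A minimal sketch, with all paths hypothetical, just to illustrate the idea:

# one-off redirect for a single moved listing
Redirect 301 /listing/old-page /listing/new-page

# pattern redirect for a systematic URL change, e.g. if everything
# under /listing/ had moved to /listings/
RedirectMatch 301 ^/listing/(.*)$ /listings/$1

Multiply that by a couple million entries, though, and the file becomes unmanageable, since Apache re-reads htaccess on every request.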
Based on what I’ve read here and here, 404s in and of themselves don’t really hurt a site’s indexation or ranking. And the more I consider it, the really big sites – the Amazons and eBays of the world – have to contend with broken links all the time as product pages come and go. Bottom line: if we really want to refresh the data on the site on a regular basis – and I believe that is priority one if we want the bot to come back more frequently – we’ll just have to put up with broken links on the site on a more regular basis.
So here’s where my thought process is leading:
- Go ahead and refresh the data. Make sure the XML sitemaps are refreshed as well – hopefully this will help the site stay current in the index.
- Keep an eye on broken links in GWT. Implement 301s for the really important pages (i.e. content-rich, mission-critical stuff). Otherwise, learn to live with a certain number of 404s being reported in GWT on a more or less ongoing basis.
- Watch the overall trend of 404s in GWT and at least make sure it doesn’t increase. Hopefully, if the sitemap is updated whenever we refresh the data, the reported 404s will decrease over time.
We do have an issue with the site creating some weird pages out of content that lives within tabs on specific pages. Once we can clamp down on those and a few other technical issues, I think keeping the data refreshed should help with our indexation and crawl rates.
Thoughts? If you think I’m off base, please set me straight.
-
I was actually thinking about some type of wildcard rule in htaccess. This might do the trick! Thanks for the response!
-
Hi,
Sounds like you’ve taken on a massive job with 12.5 million pages, but I think you can implement a simple fix to get things started.
You’re right to think about that sitemap: make sure it’s being dynamically updated as the data refreshes; otherwise, a stale sitemap will be responsible for a lot of your 404s.
I understand you don’t want to add 2.3 million separate redirects to your htaccess, so what about a simple rule: if the request starts with /listing/ (one of your directory page paths) and it is not an existing file or directory, redirect back to the homepage. Something like this:

# enable the rewrite engine if it isn’t already on elsewhere in the file
RewriteEngine On
# does the request start with /listing/ or whatever structure you are using?
RewriteCond %{REQUEST_URI} ^/listing/ [NC]
# is it NOT a file and NOT a dir?
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# all true? Redirect
RewriteRule .* / [L,R=301]

This way you can target the specific URL structure of the pages which tend to turn into 404s. Any 404s outside of this rule will still serve a 404 code and show your 404 page, so you can fix those manually, but the pages which tend to disappear can all be redirected back to the homepage if they’re not found.
You could still implement your 301s for important pages, or simply recreate a page if it’s worth doing so (see the RewriteMap sketch below), but you will have dealt with a large chunk of your non-existent pages.
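One option for those important one-to-one redirects, if you have access to the server config, is Apache’s RewriteMap, which keeps a large lookup table out of the htaccess file entirely. Note that RewriteMap can only be declared at the server or virtual host level, not in .htaccess, and the map file name and paths below are hypothetical. A minimal sketch, assuming a whitespace-separated file of old-path/new-path pairs generated from your data feed:

# In the server or vhost config (RewriteMap cannot be declared in .htaccess).
# listing-redirects.txt holds one "old-path new-path" pair per line, e.g.:
#   /listing/old-page   /listing/new-page
RewriteMap listingredirects txt:/etc/apache2/listing-redirects.txt

# Then, in the vhost or in htaccess (once the map is declared above),
# look up the requested path and redirect only when the map returns a match
RewriteCond ${listingredirects:%{REQUEST_URI}} ^(.+)$
RewriteRule .* %1 [L,R=301]

With millions of entries you’d want to convert the text file to a DBM map (dbm: instead of txt:) using Apache’s bundled httxt2dbm tool so lookups stay fast.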
I think it’s a big job and those missing pages are only part of it, but this should help you sift through all of the data to get to the important bits – you can mark a lot of URLs as fixed and start giving your attention to the important pages which need some work.
Hope that helps,
Tom