Intermittent 404s - What causes them and how to fix them?
-
Hi!
I'm working on a client site at the moment and I've discovered a couple of pages that display a 404 error but return a 200 OK response.
However, I have since checked these URLs again and some are now returning a genuine 404 response. No changes have been made (that I'm aware of), so the URLs appear to be returning 200 OK and 404 responses intermittently.
Any ideas what could cause this and the best solution?
Thanks!
-
Hi Daniel,
That does seem very odd!
In my experience, there can be a few different things at play here:
- Publishing and unpublishing in the CMS - Sometimes clients and CMS users will toggle a page's publish status, which can make it difficult to keep track of whether the page exists at any given moment
- Removal of redirects - Much the same as above: there's the possibility that someone is adding or removing redirects. It could also be something to do with changes to the server-level redirects file, whether that's .htaccess (Apache) or the IIS equivalent.
- Crawling tool issues - If you're using a crawling tool (especially with JS rendering active), it can put a lot of pressure on the server, which may then return the occasional error where there actually isn't one
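If you want to rule the crawler out, you can poll the URLs yourself and tally the responses over several passes. A rough sketch in Python (standard library only; the URL is a placeholder):

```python
import time
import urllib.error
import urllib.request
from collections import Counter

def fetch_status(url, timeout=10):
    """Return the HTTP status code for a single GET request."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # 4xx/5xx raise HTTPError but still carry a status

def is_intermittent(statuses):
    """True if the same URL returned more than one distinct status code."""
    return len(set(statuses)) > 1

def poll(url, attempts=10, delay=2.0):
    """Request the URL several times, pausing between attempts."""
    statuses = []
    for _ in range(attempts):
        statuses.append(fetch_status(url))
        time.sleep(delay)
    return statuses

if __name__ == "__main__":
    codes = poll("https://example.com/some-page")  # placeholder URL
    print(Counter(codes), "intermittent:", is_intermittent(codes))
```

If the statuses are stable when you poll gently but flip during a heavy crawl, that points at server load rather than the site itself.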
It could also be a problem with how you're handling hard 404 errors vs soft 404s - i.e. genuine not-found responses vs pages that don't work but which the server reports as fine.
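A quick way to separate those two cases is to check the response body as well as the status code. A minimal sketch, assuming some hypothetical error-page phrases to look for:

```python
# Hypothetical phrases that an error page on this site might contain.
ERROR_MARKERS = ("page not found", "404", "does not exist")

def classify_response(status_code, body):
    """Distinguish hard 404s from soft 404s (200 OK with error content)."""
    looks_like_error = any(m in body.lower() for m in ERROR_MARKERS)
    if status_code == 404:
        return "hard 404"
    if status_code == 200 and looks_like_error:
        return "soft 404"  # server claims success but serves an error page
    return "ok"
```

Running something like this over the suspect URLs would tell you whether the "200 OK" responses are actually serving real content or just an error template with the wrong status code.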
Best of luck!
Sean
Related Questions
-
Thousands of 404 pages, duplicate content pages, temporary redirects
Hi, I took over the SEO of a quite large e-commerce site. After checking crawl issues, there seem to be 3,000+ 4xx client errors, 3,000+ duplicate content issues and 35,000+ temporary redirects. I'm quite desperate about these results. What would be the most effective way to handle this? It's a Magento shop. I'm grateful for any kind of help! Thanks,
-
Best strategy to handle over 100,000 404 errors.
I was recently given a site that has over one hundred thousand 404 error codes listed in Google Webmaster Tools. It is really odd because, according to Google Webmaster Tools, the pages linking to these 404 pages are also pages that no longer exist (they are 404 pages themselves). These errors were the result of a site migration. I'd appreciate any input on how one might go about auditing and repairing large numbers of 404 errors. Thank you.
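As a starting point for an audit like this, grouping the broken URLs by their first path segment often reveals that a handful of migration patterns account for most of the errors, which a few bulk redirect rules can then cover. A rough Python sketch (the sample URLs are made up):

```python
from collections import Counter
from urllib.parse import urlparse

def group_by_first_segment(urls):
    """Count 404 URLs by their first path segment to spot migration patterns."""
    counts = Counter()
    for url in urls:
        path = urlparse(url).path.strip("/")
        first = path.split("/")[0] if path else "(root)"
        counts[first] += 1
    return counts

if __name__ == "__main__":
    broken = [  # hypothetical sample exported from Webmaster Tools
        "https://example.com/old-blog/post-1",
        "https://example.com/old-blog/post-2",
        "https://example.com/old-shop/widget",
    ]
    print(group_by_first_segment(broken).most_common())
```

The idea is simply that a pattern like `old-blog: 60,000` suggests one wildcard redirect, rather than 60,000 individual fixes.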
-
Product Documentation Causing 23-40K issues
One of my biggest hurdles at my company is our Product Documentation library, which houses thousands of pages of publicly accessible and indexed content on old and new versions of our product. Every time a product name changes, the URL changes, causing a 404, so I typically get hundreds of 404s every few months from this site. It's hosted off our main domain. We have 23,000+ duplicate pages, 40,000 missing meta descriptions, and 38,000 issues due to this library. It is not built the same way as our main content, with page titles and meta descriptions, so everything is defaulted and duplicated. I'm trying to make the case that this is an issue, especially as we migrate our site next year to a new CMS. Does anyone have any suggestions for dealing with this in the short term and long term? Is it worth asking the owners of that section of content to write page titles and meta descriptions for 40,000 pieces of content? They do not see the value of SEO or the issues this can cause. It needs to be publicly accessible, but it's not highly ranked content; it's really for customers who want to know more about the product. But I worry it is hurting other parts of our site, given the absurd number of duplicate content, meta description, and page title issues.
-
20,000 duplicates in Moz crawl due to Joomla URL parameters. How to fix?
We have a problem of massive duplicate content in Joomla. Here is an example of the "base" URL: http://www.binary-options.biz/index.php/Web-Pages/binary-options-platforms.html For some reason Joomla creates many versions of this URL, for example: http://www.binary-options.biz/index.php/Web-Pages/binary-options-platforms.html?q=/index.php/Web-Pages/binary-options-platforms.html?q=/index.php/Web-Pages/binary-options-platforms.html?q=/index.php/Web-Pages/binary-options-platforms.html?q=/index.php/Web-Pages/binary-options-platforms.html?q=/index.php/Web-Pages/binary-options-platforms.html?q=/index.php/Web-Pages/binary-options-platforms.html?q=/index.php/Web-Pages/binary-options-platforms.html or http://www.binary-options.biz/index.php/Web-Pages/binary-options-platforms.html?q=/index.php/Web-Pages/binary-options-platforms.html?q=/index.php/Web-Pages/binary-options-platforms.html?q=/index.php/Web-Pages/binary-options-platforms.html So it appends the URL parameter ?q= and then repeats part of the preceding URL. This leads to tens of thousands of duplicate pages on our content-heavy site. Any ideas how to fix this? Thanks so much!
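Whatever the underlying Joomla fix turns out to be, the canonical URL can be recovered by stripping the recursive `q` parameter; a rel=canonical tag or a server-side redirect could then point every variant at it. A minimal Python sketch of the canonicalisation step:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def strip_param(url, param="q"):
    """Drop one query parameter, returning the canonical form of the URL."""
    parts = urlsplit(url)
    # Keep every query pair except the offending parameter.
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k != param]
    return urlunsplit(parts._replace(query=urlencode(kept)))
```

Note that `parse_qsl` treats everything after the first `?` as the query string, so even the deeply nested `?q=...?q=...` variants collapse back to the base URL in one pass.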
-
Best way to fix a whole bunch of 500 server errors that Google has indexed?
I got a notification from Google Webmaster Tools saying that they've found a whole bunch of server errors. It looks like it is because an earlier version of the site I'm doing some work for had those URLs, but the new site does not. In any case, there are now thousands of these pages in their index that error out. If I wanted to simply remove them all from the index, which is my best option?
- Disallow all 1,000 or so pages in robots.txt?
- Put a meta noindex in the headers of each of those pages?
- Rel canonical to a relevant page?
- Redirect to a relevant page?
- Wait for Google to just figure it out and remove them naturally?
- Submit each URL to the GWT removal tool?
- Something else?
Thanks a lot for the help...
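One common pattern for dead URLs like these is to 301 each one where a close replacement exists on the new site and serve 410 Gone otherwise, so crawlers learn the page is permanently removed. A minimal sketch with a hypothetical redirect mapping:

```python
# Hypothetical mapping from old URLs to their nearest new equivalents.
REDIRECT_MAP = {
    "/old/about.html": "/about",
    "/old/contact.html": "/contact",
}

def handle_dead_url(path):
    """Return (status, target): 301 where a replacement exists, else 410 Gone."""
    if path in REDIRECT_MAP:
        return 301, REDIRECT_MAP[path]
    return 410, None  # signals the page is permanently gone, no target needed
```

A table like this could be generated once from the old sitemap and then translated into whatever redirect syntax the server uses.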
-
Why is it that in the exported CSV there are no referring pages shown for 404 errors?
Within some of my campaigns I can see issues regarding 404 pages. But when I export the data to a CSV, sometimes the referring pages that lead to the 404 are not shown. Am I missing something here?
-
Could Having Blog Posts as Home Page Cause Keyword Dilution?
Something I've never been a fan of is having a blog as the home page of a site. I've always thought that it's a bit like walking into someone's house through the kitchen out back.
If it's a visitor's first time, it can be a little disconcerting or awkward if they are not familiar with the writer's style. But something just dawned on me, and I'd love a second opinion on this. For websites that focus on multiple keywords (in most of my clients' cases it's usually a mix of Wedding Photography, Engagement Photography, Portrait Photography, Family Photography, etc.): a lot of these clients will include the photos in a blog post along with a snippet of text that may talk about the people they're photographing and maybe a bit about where they photographed them. But they're usually optimizing for the overarching keyword (Wedding..., Portrait..., etc., as per above). Now I'm wondering if having three or five posts on the home page, where most of them focus on a specific keyword like New York Wedding Photographer, is actually diluting the keyword they are trying to rank for. My theory is that if I have them move their blog to domain.com/blog and solely focus on the desired keyword on the home page, they would do substantially better in the SERPs. Can anyone substantiate this? Thanks!
-
How to proceed on this 404 situation
Hi folks, I have a website (www.mysite.com) where I can't host a 404 page inside the www subdomain because of a CMS issue. To provide a 404 page, I can create a subdomain like "404.mysite.com" that returns a 404 and then, if I find that a page does not exist, redirect it to 404.mysite.com. My question is: should I redirect with a 301 or a 302? Does it make any difference?