Issue with 'Crawl Errors' in Webmaster Tools
-
Have an issue with a large number of 'Not Found' pages being listed in Webmaster Tools. In the 'Detected' column, the dates are recent (May 1st - 15th). However, clicking into the 'Linked From' column, all of the link sources are old, many from 2009-10.
Furthermore, I have checked a large number of the source pages to double-check that the links don't still exist, and, as I expected, they don't.
Firstly, I am concerned that Google thinks there is a vast number of broken links on this site when in fact there is not.
Secondly, if the errors do not actually exist (and never actually have), why do they remain listed in Webmaster Tools, which claims they were found again this month?!
Thirdly, what's the best and quickest way of getting rid of these errors? Google advises that using the 'URL Removal Tool' will only remove the pages from the Google index, NOT from the crawl errors. The guidance is that if the URLs keep returning 404s, they will eventually be removed automatically. Well, I don't know how many times they need to get that 404 in order to get rid of a URL and link that haven't existed for 18-24 months?!
Thanks.
-
Thanks both for your responses. It's a strange one, and I can only assume that these pages remain in Google's index. I have checked many link sources and found that the links do not exist, and haven't since the pages were deleted. It seems ridiculous that you should have to 301 every page you delete; there are literally 500+ of these phantom links to non-existent URLs, and the site is changing all the time.
I have opted to add a 'noindex' meta tag to the 404 pages, and also to encourage Google to drop them from the index by adding the pages to the robots.txt file.
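For anyone following the same route, a minimal sketch of that setup (the path pattern here is hypothetical) might look like:

```html
<!-- in the <head> of each deleted page, served alongside the 404 status -->
<meta name="robots" content="noindex">
```

```text
# robots.txt (hypothetical path pattern)
User-agent: *
Disallow: /deleted-pages/
```

One wrinkle worth noting: if robots.txt blocks crawling of those URLs, crawlers can't fetch the page to see the noindex tag, so the two measures can work against each other.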
Let's see if it works - I'll post on here when I know for sure so other people with the same question can see the outcome.
Thanks again, Damien and Steven.
-
Completely agree with Damien. If they don't exist but Webmaster Tools is showing them, 301 them; there has to be a link somewhere on the internet that is causing Google to think they exist. I would also go through the server logs to see if there is any additional information, like a referring page, for the non-existent URLs.
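If you go the 301 route, a minimal sketch on Apache (assuming mod_alias is available; the paths are hypothetical) would be:

```apache
# .htaccess - permanently redirect a removed URL to a relevant live page
Redirect 301 /old-page.html /new-relevant-page.html

# or a pattern for a whole retired section
RedirectMatch 301 ^/discontinued/.*$ /products/
```

The key point is the 301 (permanent) status rather than a 302, so search engines consolidate the old URL's signals onto the target page.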
-
Hey,
I guess if you've exhausted all other possibilities, you can either let them return a 404 and leave them be, which will most likely do you no harm, or 301 that particular URL to another relevant page on your site.
Make sure they are actually returning a 404 first though, via a header response check.
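A quick way to do that header check, sketched in Python with only the standard library (the example URL is a placeholder):

```python
import urllib.request
import urllib.error

def status_code(url: str) -> int:
    """Fetch only the headers of `url` and return the HTTP status code."""
    request = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(request) as response:
            return response.status
    except urllib.error.HTTPError as err:
        # urllib raises on 4xx/5xx; the error object still carries the code
        return err.code

# e.g. status_code("https://example.com/deleted-page") should come back
# as 404 for a properly removed page, not 200 (a "soft 404").
```

A deleted page that still returns 200 with an error message in the body is a soft 404, and search engines will keep treating it as a live page.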
DD