404s in WMT point to old pages, and the referrer links no longer link to them
-
Within the last 6 days, Google Webmaster Tools has shown a jump in 404s - around 7,000. The 404 pages come from an old browse structure on our previous platform; we no longer use them or link to them.
I don't know how Google is finding these pages. When I check the referring pages, they are either 404s themselves, or the page exists but the link to the 404 in question is nowhere on the page or in its source code. The sitemap is also often listed as a referrer, but these links are definitely not in our sitemap and haven't been for some time. So it looks to me like the referrer data is outdated. Is that possible?
But somehow these pages are still being found. Any ideas on how I can diagnose the problem and find out how Google is discovering them?
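One way to confirm the referrer data is stale is to fetch each referrer reported in WMT and check whether its HTML still contains the dead URL. Here is a minimal sketch using only the Python standard library; the URLs in the comment are placeholders, not your actual pages:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def html_contains_link(html, dead_url):
    """True if the dead URL still appears anywhere in the referrer's HTML."""
    return dead_url in html

def check_referrer(referrer_url, dead_url, timeout=10):
    """Fetch a referrer reported by WMT and return (status, still_links)."""
    req = Request(referrer_url, headers={"User-Agent": "link-audit/1.0"})
    try:
        with urlopen(req, timeout=timeout) as resp:
            html = resp.read().decode("utf-8", errors="replace")
            return resp.status, html_contains_link(html, dead_url)
    except HTTPError as e:
        # The referrer itself returns 404/410/etc. -- stale referrer data.
        return e.code, False

# e.g. status, still_links = check_referrer(
#     "https://example.com/some-referrer/", "https://example.com/dead-page/")
# A 404 status, or still_links == False, suggests the referrer data is outdated.
```

Running this over the exported crawl-errors CSV would tell you which of the 7,000 reports point at links that genuinely no longer exist.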
-
How long ago did you switch platforms? It can take months for Google to come back around to a page that linked to your site, and pages on your site will stay in its cache until it has made a few passes.
When you switched, did you set up any 301 redirects? Examine the backlinks to your domain - any that come from good pages should be redirected to the new URLs. If not, they will be scooped up by active SEOs (finding 404 links is a popular link-building technique).
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93633
If you know the URLs will be dead forever, try returning a 410 response instead; it is supposed to make search engines drop the pages faster.
http://www.seroundtable.com/404-410-google-15225.html (bottom)
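To illustrate the 410 approach, here is a minimal WSGI sketch that returns 410 for paths known to be permanently gone and 404 for everything else. The paths are hypothetical; on Apache or nginx you would achieve the same thing with the server's own "gone" directives:

```python
# Hypothetical paths from the retired browse structure, known to be gone for good.
GONE_PATHS = {"/old-browse/widgets", "/old-browse/gadgets"}

def gone_app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path in GONE_PATHS:
        # 410 tells crawlers the page was removed deliberately and permanently,
        # not that it is merely missing.
        start_response("410 Gone", [("Content-Type", "text/plain")])
        return [b"This page has been permanently removed."]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"Not found."]
```

The only difference from a plain 404 handler is the status line, but that difference is what the search engines key on.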
Have you requested that Google remove the old directories/pages? If the content is gone and has no backlinks, try a removal request.
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663427
-
I'm having a similar problem with a new site that was created by copying an old site in its entirety. I went through the trouble of cleaning everything up: having pages that were no longer relevant removed, fixing the sitemaps, etc. Now, months later, WMT has shown me a spike of 404s for the old pages with the referrers listed as the XML sitemap and sitemap page... but they are definitely not linked from there. I'm assuming there was some sort of hiccup with Google using an older, cached version of the sitemap to find these links.
I wound up just clearing the errors out of WMT and waiting to see whether it recrawls the error pages again. If Google continues to crawl them even though they aren't being linked to, our next course of action will be to 301 them all, just in case.
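If we do end up 301ing them, the bulk redirect could be sketched as a simple lookup table in a WSGI app; the old-to-new path pairs below are hypothetical placeholders:

```python
# Hypothetical mapping of retired URLs to their closest current equivalents.
REDIRECTS = {
    "/old-browse/widgets": "/widgets",
    "/old-browse/gadgets": "/gadgets",
}

def redirect_app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    target = REDIRECTS.get(path)
    if target is not None:
        # 301 signals a permanent move, so link equity passes to the new URL.
        start_response("301 Moved Permanently", [("Location", target)])
        return [b""]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"Not found."]
```

In practice the same mapping would more likely live in the web server's rewrite rules, but the logic is identical: exact-match the dead path, emit a `Location` header, and fall through to 404 otherwise.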
Related Questions
-
Spam URLs in search results
We built a new website for a client. When I do 'site:clientswebsite.com' in Google, it shows some of the real, recently submitted pages. But it also shows many pages of spam URL results, like 'clientswebsite.com/gockumamaso/22753.htm' - all of which then go to the site's 404 page. They have page titles and meta descriptions in Chinese or Japanese too. Some of the URLs are of real pages, and link to the correct page, despite having the same Chinese page titles and descriptions in the SERPs. When I went to remove all the spammy URLs in Search Console (it only allowed me to temporarily hide them), a whole load of new ones popped up in the SERPs after a day or two. The site's files are all fine, with no errors in the server logs. All the usual stuff... robots.txt, sitemap, etc. seems OK, and the proper pages have all been requested for indexing and are slowly appearing. The spammy ones continue, though. What is going on and how can I fix it?
Technical SEO | Digital-Murph
-
Responsive design blowing out on-page links?
Hi guys, We have a site about to launch with a responsive design to suit desktop, tablet and mobile. Each design carries its own navigation, as one primary set of navigational links won't suit all versions of the design. So as desktop resizes to mobile (or tablet), the desktop navigation is hidden (CSS) and replaced by the nav more suited to the mobile experience (hope I've explained that OK!). Problem is that if the primary navigation carries 50 links, then all 3 designs together carry 150 links, which is too many on-page links. Is that going to be a problem? Is there a tag that can be applied to the mobile/tablet nav links? Like a canonical tag but for links?? Cheers in advance of your ideas
James
Technical SEO | lovealbatross
-
New Website, New URL, New Content - What do we do with the old site? Are 301s the only option?
We've just built a new site for a client. They were adamant about changing the URL. The new site is entirely new content; however, the subject matter is the same. Some pages are even titled very similarly. Is it advisable to keep the old site running and link it to the new site? Permanently, or temporarily? Do we simply place redirects from the old site to the new? The old site was 30 pages; the new site is 80 pages, so redirects won't be available for all the new pages. It seems a shame to trash the old site: it is getting some good traffic, and the content - although outdated - is unique and of high quality. The old URL is 4+ years old; the new URL is brand new. Some enlightened opinions would be greatly welcomed. Thanks
Technical SEO | MarketsOnline
-
Creating in-text links with 'target=_blank' - helping/hurting SEO!?!
Good Morning Mozzers, I have a question regarding a new linking strategy I'm trying to implement at my organization. We publish 'digital news magazines' that oftentimes have in-text links pointing to external sites. More recently, the editorial department and I (SEO) conferred on some ways to reduce our bounce rate and increase time on page. One of the suggestions I offered is to add the 'target=_blank' attribute to all the links so that site visitors don't necessarily have to leave the site in order to view the link. It has, however, come to my attention that this can have some very negative effects on my SEO program, most notably (fake or inaccurate) time(s) on page. Is this an advisable way to create in-text links? Are there any other negative effects I can expect from implementing such a strategy?
Technical SEO | NiallSmith
-
How to find and fix 404 and broken links?
Hi, My campaign is showing me many 404 problems, and other tools are also showing me broken links, but the links they show me do work, and I can't seem to find the broken links or the cause of the 404s. Can you help?
Technical SEO | Joseph-Green-SEO
-
Too Many On Page Links Error On Wordpress Blog
I have a WordPress blog. I am getting an error message from SEOmoz, "too many on-page links". However, SEOmoz is counting a full month of posts as one page. For example, 3 on-page internal links in each post times 30 different posts in a month is recorded as 90 on-page links. Is there any mechanism to fix this in WordPress?
Technical SEO | wianno168
-
ECommerce Site, URLs, Canonical and Tracking Referral Traffic
I'm very, very new to eCommerce websites that employ many different URLs to track referral traffic. I have a client that has 18 different URLs that land on the home page in order to track traffic from different referral sources. For example: http://erasedisease.com/?ref=abot - tracks traffic from an affiliate source; http://erasedisease.com/?ref=FB01 - tracks traffic from a FB ad; http://erasedisease.com/?ref=sas&SSAID=289169 - tracks more affiliate traffic... and the list goes on and on. My first question is: do you think this could hinder our Google rankings? The SEOmoz crawl doesn't show any duplicate content errors, so I guess that's good. I've just been reading a lot about canonical URLs and eCommerce sites, but I'm not sure if this is a situation where I'd want to use some kind of canonical plugin for this WordPress website or not. Any advice would be greatly appreciated. Thanks so much!!
Technical SEO | Linwright
-
Optimum Number of Links on Any Given Page
One of the guidelines you provide stipulates: "You should avoid having too many (roughly defined as more than 100) hyperlinks on any given page. When search engine spiders crawl the Internet they are limited by technology resources and are only able to crawl a certain number of links per webpage. In addition, search engine algorithms divide the value of some popularity metrics by the amount of links on a given page. This means that each of the pages being linked to from a given page are also affected by the number of links on the linking page. For these reasons, we recommend you include less than 100 links per page to ensure that they are all crawled, though if your pages have a high page authority, search engines will usually follow more links." As far as these 100 links are concerned, is this in reference to ALL links including outbound, internal, etc? Or is this referring to only outbound links to other sites?
Technical SEO | johncmmc