404 page not found after site migration
-
Hi,
A question from our developer.
We have an issue in Google Webmaster Tools.
A few months ago we killed off one of our e-commerce sites and set up another to replace it. The new site uses different software on a different domain. I set up a mass 301 redirect that would redirect any URLs to the new domain, so domain-one.com/product would redirect to domain-two.com/product. As it turns out, the new site doesn’t use the same URLs for products as the old one did, so I deleted the mass 301 redirect.
We’re getting a lot of URLs showing up as 404 not found in Webmaster Tools. These URLs used to exist on the old site and were linked from the old sitemap. Even URLs that have only recently started showing up as 404s are reported as being linked from the old sitemap, although the old sitemap no longer exists and has itself been returning a 404 for some time now. Normally I would set up a 301 redirect for each one and mark it as fixed, but almost a quarter of a million URLs are returning 404 errors, and the number is rising.
I’m sure there are some genuine problems that need sorting out in that list, but I just can’t see them under the mass of errors for pages left over from the old site. Because of this, I’m reluctant to set up a robots.txt file that disallows all of the 404 URLs.
The old site is no longer in the index; searching Google for site:domain-one.com returns no results.
Ideally, I’d like anything that was linked from the old sitemap to be removed from Webmaster Tools, and for Google to stop attempting to crawl those pages.
Thanks in advance.
-
I agree that the 301 redirect would be your best option, as it passes not only users but also the bots along to the right page. You may need to get a developer in to write some regular expressions that parse the incoming request and automatically find the correct new URL. I have worked on sites with a large number of pages, and some sort of automation is the only way to go.
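Here is a minimal sketch of that kind of automation, written as a small Flask app sitting on the old domain. To be clear, the URL patterns and the domain-two.com targets below are made up for illustration; the real rules would have to come from mapping your old catalogue to the new one.

```python
import re
from flask import Flask, abort, redirect

app = Flask(__name__)

# Each rule pairs a regex for an old-site path with a template for the
# corresponding new-site URL. These patterns are hypothetical examples.
REDIRECT_RULES = [
    (re.compile(r"^/products/(\d+)-(.+)$"), r"https://domain-two.com/shop/\2"),
    (re.compile(r"^/category/(.+)$"), r"https://domain-two.com/c/\1"),
]

@app.route("/<path:path>")
def legacy_redirect(path):
    old_path = "/" + path
    for pattern, template in REDIRECT_RULES:
        match = pattern.match(old_path)
        if match:
            # A 301 passes users, bots, and most link equity to the new URL.
            return redirect(match.expand(template), code=301)
    # No mapping found: serve an honest 404 rather than a blind redirect.
    abort(404)
```

The same mapping logic could live in .htaccess or nginx rewrite rules instead; the point is that a handful of patterns can cover thousands of URLs.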
That said, if you simply want to kill the old URLs, you can serve 404s or 410s. As you mention, though, you then end up with a bunch of 404 errors in GWT. I have been there too; it's damned if you do, damned if you don't. We had some tracking URLs from an old site, and here we are a year later (they have been returning 410s for over a year) and they still show up in GWT as errors.
We are trying a new solution for removing these URLs from the index without generating 404 errors: we return a 200 and serve a minimal HTML page with the meta robots noindex tag.
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93710
"When we see the noindex meta tag on a page, Google will completely drop the page from our search results, even if other pages link to it. "
So we allow Google to find the page and get a 200 (hence no 404 errors), but then use the meta noindex tag to tell Google to remove it from the index and stop crawling the page.
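As an illustration only, a minimal version of this setup might look like the following, again using Flask as a stand-in for whatever serves the retired URLs (the /tracking/ route is a hypothetical example):

```python
from flask import Flask

app = Flask(__name__)

# A bare-bones page whose only real job is to carry the noindex tag.
NOINDEX_PAGE = """<!DOCTYPE html>
<html>
  <head>
    <meta name="robots" content="noindex">
    <title>Page removed</title>
  </head>
  <body><p>This page has been retired.</p></body>
</html>"""

@app.route("/tracking/<path:path>")
def retired_url(path):
    # Return 200 so GWT logs no crawl error; the meta tag tells Google
    # to drop the page from the index.
    return NOINDEX_PAGE, 200
```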
Remember, this is the "nuclear" option; you only want to do this to remove the pages from the Google index. Someone mentioned using GWT to remove URLs, but if I remember correctly, you can only remove a limited number of pages at a time that way.
If you list the files in robots.txt, Google will not spider them, but if you later remove the entries from the robots.txt file, Google will start trying to spider them again. I have seen Google come back a year later on URLs once I took them out of robots.txt. That is what happened to us, so we tried just serving the 410/404, but Google still kept crawling. We recently moved to this 200-plus-noindex-meta option and it seems to be working.
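For reference, a blanket disallow in robots.txt looks like this (the paths here are hypothetical); keep in mind it only stops crawling and does not remove pages that are already indexed:

```
User-agent: *
Disallow: /old-products/
Disallow: /tracking/
```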
Good luck!
-
You can, but the 404s should stop being crawled on their own. There's also a Webmaster Tools URL removal feature you can use to make that happen faster:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=64033
-
Yeah, it's a 404: http://www.tester.co.uk/17th-edition-equipment/multifunction-testers/fluke-1651b-multifunction-installation-tester
With over 200,000 404s, it's a lot to go through and 301. For some reason, when the site got migrated, they just pointed the old URLs to new ones by swapping the root domain name, without creating matching URLs. Doh.
I was thinking about blocking them all with robots.txt?
-
A 404 should cause Google to de-index the content. Go to one of the bad URLs and view the headers to make sure that your webserver is returning a 404 status and not just a page that says 404 while actually returning a 200 (a soft 404).
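If you don't want to dig through browser dev tools, a quick check with Python's requests library (assuming it is installed) against the URL you posted would be:

```python
import requests

url = ("http://www.tester.co.uk/17th-edition-equipment/"
       "multifunction-testers/fluke-1651b-multifunction-installation-tester")

# HEAD fetches only the status line and headers, not the page body.
response = requests.head(url, allow_redirects=False)
print(response.status_code)  # should print 404 (or 410), not 200
```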
As hard and time-consuming as it might be, I would still pursue the 301 option. It's the cleanest way to resolve the issue. Just start nibbling at it and you can make a dent; doing nothing just lets the problem grow.