301s - Do we keep the old sitemap to assist Google with this?
-
Hello Mozzers,
We have restructured our site and have set up many 301 redirects to our new URL structure.
I have seen that one of my competitors has done something similar, but they have kept their old sitemap live as well - I guess to assist Google with their 301s.
At present we only have our new sitemap active, but am I missing a trick by not having the old one there as well to assist Google with the 301s?
Thanks,
Pete
-
Awesome, many thanks all!
Much appreciated,
Pete
-
Thanks, Donna!
-
To reinforce what Dirk has stated, I'm sharing a similar question that was asked a month ago - "Sitemaps during a migration - which is the best way of dealing with them?" The question was answered by CleverPhD. I like how he talks about how Google interprets the change, the impacts you might see in Google's Search Console, and what to do about them.
It's worth the read.
-
You shouldn't keep the old sitemap. If the pages are in the index, Google will figure it out the next time the bot visits the site. Make sure that you update all the internal links so they point straight to the new URLs (avoid internal redirects) - Screaming Frog will work miracles here.
If you keep the old one, you will get warnings like this:
"When we tested a sample of URLs from your Sitemap, we found that some URLs redirect to other locations. We recommend that your Sitemap contain URLs that point to the final destination (the redirect target) instead of redirecting to another URL."
rgds,
Dirk
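-
Following on from the sitemap warning Dirk quotes: if you want to double-check the new sitemap before resubmitting it, here is a rough sketch of that kind of check - assuming Python 3 with the requests library installed, and with the sitemap URL as a placeholder - which lists any entries that still answer with a redirect instead of a 200:

# Rough sketch: flag sitemap entries that redirect instead of resolving directly.
# Assumes Python 3 with the "requests" package; the sitemap URL below is a placeholder.
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder - swap in your own sitemap
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_url):
    # Pull every <loc> entry out of a standard XML sitemap.
    response = requests.get(sitemap_url, timeout=10)
    response.raise_for_status()
    root = ET.fromstring(response.content)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

def report_redirects(urls):
    # allow_redirects=False so we see the first status code, not the final destination's.
    for url in urls:
        response = requests.head(url, allow_redirects=False, timeout=10)
        if 300 <= response.status_code < 400:
            print(url, "->", response.status_code, response.headers.get("Location"))

if __name__ == "__main__":
    report_redirects(sitemap_urls(SITEMAP_URL))

Anything it prints should be swapped for its final destination URL before the sitemap goes back into Search Console.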
Related Questions
-
Avoiding duplicate content in a manufacturer's [single product] website
Hello, so I have read a lot of articles about duplicate content, keyword cannibalism, competing with yourself, and so on. But none of these articles really fit a manufacturer's website that produces one product. For example, let's say I make ceramic tiles. That means: Homepage: "Our tiles are the best tiles, we have numerous designs of tiles. We make them only from natural ceramic." Product list: "Here is a list of our tiles: Poesia tile, white tile, textured tile, etc." Page for each tile. Gallery: a bunch of images trying to prove that these tiles look best 🙂 Where to buy page: a map. From what I understand this site is already doomed - it will not do well against larger retailers who don't focus only on tiles but sell everything, and it is set up to have a lot of duplicate content. But I hope I am wrong - can someone please make some suggestions on how to do SEO on such a website where all the pages are about the same thing? Any help would be much appreciated! Juris
Intermediate & Advanced SEO | JurisBBB
-
URL Injection Hack - What to do with spammy URLs that keep appearing in Google's index?
A website was hacked (URL injection), but the malicious code has been cleaned up and removed from all pages. However, whenever we run a site:domain.com search in Google, we keep finding more spammy URLs from the hack. They all lead to a 404 error page since the hack was cleaned up in the code. We have been using the Google WMT Remove URLs tool to have these spammy URLs removed from Google's index, but new URLs keep appearing every day. We looked at the cache dates on these URLs and they vary, but none are recent and most are from a month ago when the initial hack occurred. My question is: should we continue to check the index every day and keep submitting these URLs to be removed manually? Or, since they all lead to a 404 page, will Google eventually remove these spammy URLs from the index automatically? Thanks in advance, Moz community, for your feedback.
Intermediate & Advanced SEO | peteboyd
-
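One way to sanity-check that kind of cleanup is to bulk-check what the leftover URLs actually return. Here is a rough sketch - assuming Python 3 with the requests library, and a hypothetical spammy_urls.txt file containing one URL per line collected from the site: results - that flags anything not answering with a 404 or 410:

# Rough sketch: confirm the leftover spammy URLs really return 404/410 after the cleanup.
# Assumes Python 3 with the "requests" package; spammy_urls.txt is a hypothetical file
# with one URL per line.
import requests

with open("spammy_urls.txt") as handle:
    urls = [line.strip() for line in handle if line.strip()]

for url in urls:
    status = requests.get(url, allow_redirects=True, timeout=10).status_code
    flag = "ok" if status in (404, 410) else "CHECK"
    print(flag, status, url)
-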
Blocking Certain Site Parameters from Google's Index - Please Help
Hello, so we recently used Google Webmaster Tools in an attempt to block certain parameters on our site from showing up in Google's index. One of our site parameters is essentially for user location and accounts for over 500,000 URLs. This parameter does not change page content in any way, and there is no need for Google to index it. We edited the parameter in GWT to tell Google that it does not change site content and not to index it. However, after two weeks, all of these URLs are still definitely getting indexed. Why? Maybe there's something we're missing here. Perhaps there is another way to do this more effectively. Has anyone else run into this problem? The path we used to implement this action: Google Webmaster Tools > Crawl > URL Parameters. Thank you in advance for your help!
Intermediate & Advanced SEO | Jbake
-
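Not something raised in that question, but one thing often audited alongside the URL Parameters setting is whether the parameterized URLs declare a rel=canonical back to the clean page, since the parameter setting is only a hint. Here is a rough sketch - Python 3 with the requests library, and purely hypothetical example URLs - that checks a small sample:

# Rough sketch: check whether parameterized URLs canonicalise to the clean URL.
# Assumes Python 3 with the "requests" package; the URLs below are placeholders.
from html.parser import HTMLParser
import requests

SAMPLE_URLS = [
    "https://www.example.com/widgets?location=chicago",  # placeholder
    "https://www.example.com/widgets?location=denver",   # placeholder
]

class CanonicalFinder(HTMLParser):
    # Records the href of the first <link rel="canonical"> tag encountered.
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical" and self.canonical is None:
            self.canonical = attrs.get("href")

for url in SAMPLE_URLS:
    finder = CanonicalFinder()
    finder.feed(requests.get(url, timeout=10).text)
    clean_url = url.split("?")[0]
    if finder.canonical and finder.canonical.rstrip("/") == clean_url.rstrip("/"):
        print("ok", url, "->", finder.canonical)
    else:
        print("CHECK", url, "canonical:", finder.canonical)
-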
Site Structure: How do I deal with a great user experience that's not the best for Google's spiders?
We have ~3,000 photos that have all been tagged. We have a wonderful AJAXy interface for users where they can toggle all of these tags to find the exact set of photos they're looking for very quickly. We've also optimized a site structure for Google's benefit that gives each category a page. Each category page links to applicable album pages. Each album page links to individual photo pages. All pages have a good chunk of unique text. Now, for Google, the domain.com/photos index page should be a directory of sorts that links to each category page. Alternatively, the user would probably prefer the AJAXy interface. What is the best way to execute this?
Intermediate & Advanced SEO | tatermarketing
-
301 redirect an old site that has been "dead" for a while?
Hi guys, a quick question. I have a client who has an old business website that had some great links (Forbes.com, CocaCola.com, etc.). The problem is that he knew nothing about SEO and let the hosting expire. He still owns the domain, but the site is no longer listed in Google. He did no SEO, so I am not worried about being hit by any artificial anchor text penalties, since the links are as natural as it gets. So my question is: would there be any benefit from 301 redirecting that site to his new business? The new business is in almost exactly the same niche as the old site. I am thinking of 301'ing to a sub-page which will refer to his past venture with the old business, not to the homepage of the new site. Thanks in advance for your help.
Intermediate & Advanced SEO | rayvensoft
-
Should I continue linking to the old page after a 301 redirect?
Hi everybody, I'm trying to understand why my page http://poltronafraubrescia.zenucchi.it isn't ranking for the keyword "poltrona frau brescia" even though it is indexed by Google. The page gets authority and traffic from the page www.zenucchi.it/ITA/poltrona-frau-brescia.html, which already ranks in fourth position for that keyword. Should I keep linking to the previous page or to the new one to increase its page authority? I don't know whether it is a good idea to link to a page that isn't ranking rather than to one that is. Please help me - thanks a lot.
Intermediate & Advanced SEO | guidoboem
-
Most painless way of getting duff pages out of the SEs' index
Hi, I've had a few issues that have been caused by our developers on our website. Basically we have a pretty complex method of automatically generating URLs and web pages on our website, and they have stuffed up the URLs at some point and managed to get tens of thousands of duff URLs and pages indexed by the search engines. I've now got to get these pages out of the SEs' indexes as painlessly as possible, as I think they are causing a Panda penalty. All these URLs have an additional directory level in them called "home" which should not be there, so I have www.mysite.com/home/page123 instead of the correct URL www.mysite.com/page123. All of these are totally duff URLs with no links going to them, so I'm gaining nothing by 301 redirects, so I was wondering if there was a more painless, less risky way of getting them all out of the indexes (i.e. after the stuff-up by our developers in the first place, I'm wary of letting them loose on 301 redirects in case they cause another issue!). Thanks
Intermediate & Advanced SEO | James77
-
301 Redirect All URLs - WWW -> HTTP
Hi guys, this is part 2 of a question I asked before, which got partially answered; I clicked "question answered" before I realized it only fixed part of the problem, so I think I have to post a new question now. I have an Apache server, I believe, on HostGator. What I want to do is redirect every URL to its corresponding alternative (www redirects to non-www). So, for example, if someone typed in www.mysite.com/page1 it would take them to http://mysite.com/page1. Here is the code that has made all of my site's links go from WWW to HTTP, which is great, but the problem is that if you try to access the WWW version by typing it in, it still works, and I need it to redirect. It's important because Google has been indexing SOME of the URLs as http and some as WWW, and my site was just HTTP for a long time until I made the mistake of switching it; now I'm having a problem with duplicate content and such. I updated it in Webmaster Tools, but I need to do this regardless for the other SEs. Thanks a ton!
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} ^www.yourdomain.com [NC]
RewriteRule ^(.*)$ http://yourdomain.com/$1 [L,R=301]
Intermediate & Advanced SEO | DustinX