Will rel canonical tags remove previously indexed URLs?
-
Hello,
7 days ago, we implemented canonical tags to resolve duplicate content issues that had been caused by URL parameters. These duplicate URLs had already been indexed.
Now that the URLs have rel canonical tags in place, will Google automatically remove from its index the other URLs with the URL parameters?
I ask because we have been tracking the approximate number of URLs indexed by doing a site: search in Google, and we have barely noticed a decrease in URLs indexed.
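For context, here is a quick way to sanity-check that the tag is actually in place on the parameterized pages (a minimal Python sketch using a made-up page; the markup and URL are hypothetical, not taken from our site):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical" and self.canonical is None:
            self.canonical = a.get("href")

# A parameterized page that should point back to the clean URL.
page = """
<html><head>
<link rel="canonical" href="https://www.example.com/widget" />
</head><body>Blue widget</body></html>
"""

finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)  # https://www.example.com/widget
```

If every parameter variant reports the same clean URL, the tags are implemented correctly and it's just a matter of waiting for a recrawl.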
Thanks.
-
Thanks.
I think I will monitor for the next 2-3 weeks, and if there are still a lot of unwanted URLs with parameters in the index, I will start requesting removals.
-
You have two options here:
1. Let Google sort it out (which they will -- but it may take time)
2. Remove the unnecessary URLs yourself via Webmaster Tools' URL removal tool.
-
Hi Andrea,
Yep - we did that.
7 days ago, we implemented the canonical tags because URLs such as
www.example.com/widget?color=blue
www.example.com/widget?size=large
were being indexed along with the 'real' URL.
We resubmitted the sitemap (which has all the 'real' URLs) as well.
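To illustrate why those parameter variants count as duplicates, here is a small sketch (Python, using the example.com URLs above) of how the variants all collapse to a single canonical URL once the query string is dropped:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url):
    """Drop the query string so parameter variants collapse to one URL."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

variants = [
    "http://www.example.com/widget?color=blue",
    "http://www.example.com/widget?size=large",
]
print({canonical_url(u) for u in variants})
# {'http://www.example.com/widget'}
```

The rel=canonical tag communicates exactly this mapping to Google: every variant declares the clean URL as the one that should be indexed.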
At this time, many URLs with parameters are still indexed. I guess after reading this article:
http://www.seomoz.org/blog/catastrophic-canonicalization
I was expecting the change to happen a little quicker...
I just want to confirm no other action is needed on our part.
I understand canonical tags tell crawlers which page to index when they find the pages for the first time, but I also wanted to confirm that, if all the URLs are already indexed (because no canonical tags were present at the time), implementing the tags is enough to have the unwanted URLs removed automatically from the index.
-
A week isn't very long. It can take Google months to recrawl and drop URLs from its index. Google will figure it out; you just need to give it time. If you haven't done so, update your sitemap to include the tagged pages and resubmit it to Google. That will signal them to recrawl your site and could speed up the process.