Using canonical for duplicate content outside of my domain
-
I have two domains for the same company, example.com and example.sg.
Sometimes we have to post the same content or event on both websites, so to protect my websites from a duplicate content penalty I use the canonical tag to point to either the .com or the .sg version, depending on the page.
Any idea if this is the right decision?
Thanks
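For example, the .sg copy of a page carries a tag like this in its head (the URLs here are just illustrative, not our real pages):

```html
<!-- In the <head> of the duplicate page on example.sg, -->
<!-- pointing at the .com version as the canonical one: -->
<link rel="canonical" href="http://www.example.com/events/annual-launch" />
```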
-
Unfortunately, that's a lot more tricky. If you're trying to rank both the .com and .sg version for, let's say, US residents, and those sites have duplicate content, then you do run the risk of Google filtering one of them out. If you use canonical tags or something like that, then one site will be taken out of contention for ranking - in that case, you won't rank for both sites on the same term. The only way to have your cake and eat it too is to make the sites as unique as possible.
Even then, you're potentially going to duplicate effort and cannibalize your own rankings, so it's a risky proposition. In some cases, it may be better to try to promote your social profiles and other pages outside of your site that have some authority. It doesn't have to be your own site ranking, just a site that's generally positive or neutral.
-
Thanks Peter, your answer has enriched the discussion.
I think your suggestion is the proper way for different local domain versions of the same company or blog.
My case is a little different: lately I have actually been trying to rank both of them for the sake of reputation management.
It wasn't intended to be like that in the beginning, but now we are trying to take advantage of our other local domains like .sg, .ch and .ae.
-
Do you want the .sg site to only rank regionally in Singapore? You could use rel=alternate hreflang to designate the language/region for the two sites, and help Google more accurately decide when to display which site. This also acts as a soft canonicalization signal and tells Google that the pages are known duplicates.
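As a rough sketch, both versions of a page would carry annotations along these lines in their head (the URLs and region codes here are hypothetical examples, not a prescription for your exact setup):

```html
<!-- Placed in the <head> of BOTH the .com and .sg versions of the page -->
<!-- ("en-us" and "en-sg" are illustrative language-region codes): -->
<link rel="alternate" hreflang="en-us" href="http://www.example.com/some-page" />
<link rel="alternate" hreflang="en-sg" href="http://www.example.sg/some-page" />
```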
-
Here's an article about rel=canonical where Dr. Pete answers some rel=canonical questions. With regards to rel=canonical passing PageRank he says:
"This is very difficult to measure, but if you use rel=canonical appropriately, and if Google honors it, then it appears to act similarly to a 301-redirect. We suspect it passes authority/PageRank for links to the non-canonical URL, with some small amount of loss (similar to a 301)."
http://moz.com/blog/rel-confused-answers-to-your-rel-canonical-questions
At the end of the following Matt Cutts video (2:10), he says that there isn't much difference between the PageRank passed via rel=canonical and the PageRank passed via a 301 redirect.
http://www.youtube.com/watch?v=zW5UL3lzBOA
When it comes to the content of the page, yes, the two versions of the page should be pretty close to identical. I've seen Google refer to it as "highly similar". Here's what Google says:
"A large portion of the duplicate page’s content should be present on the canonical version. One test is to imagine you don’t understand the language of the content—if you placed the duplicate side-by-side with the canonical, does a very large percentage of the words of the duplicate page appear on the canonical page? If you need to speak the language to understand that the pages are similar; for example, if they’re only topically similar but not extremely close in exact words, the canonical designation might be disregarded by search engines."
See: http://googlewebmastercentral.blogspot.co.uk/2013/04/5-common-mistakes-with-relcanonical.html
So, if your pages are too dissimilar, Google may ignore the rel=canonical "suggestion", and the "wrong" page, or both pages, may appear in Google's index.
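Google's "side-by-side" test can be roughed out in code. This is only a toy sketch of a word-overlap check to give a feel for the idea, not Google's actual algorithm:

```python
from collections import Counter

def word_overlap(duplicate_text: str, canonical_text: str) -> float:
    """Fraction of the duplicate page's words that also appear on the
    canonical page. A crude stand-in for the side-by-side comparison
    Google describes; their real test is not public."""
    dup = Counter(duplicate_text.lower().split())
    canon = Counter(canonical_text.lower().split())
    total = sum(dup.values())
    if total == 0:
        return 0.0
    # Count each duplicate word only as often as it also occurs on the canonical page.
    shared = sum(min(count, canon[word]) for word, count in dup.items())
    return shared / total

# Identical pages score 1.0; pages that are only topically similar
# but reworded score much lower and risk having the canonical ignored.
print(word_overlap("singapore launch event details", "singapore launch event details"))  # 1.0
```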
-
I think this is a useful resource that answers a lot of questions around rel=canonical.
-
Thanks Doug for your useful response.
I just need to clarify this sentence of yours:
"Be aware that the value of any inbound links to that article will be allocated to the canonical version. "
Do you mean the canonical link passes PageRank similarly to a 301 redirect?
What if the two pages weren't 100% identical?
-
Check this video out: http://moz.com/blog/handling-duplicate-content-across-large-numbers-of-urls
-
Yes, this sounds absolutely correct.
You can check that it's working by searching for some unique content from your article, or by using a query with the article's title:
site:{domain} "title"
If everything is working correctly you should only see the canonical version of the article in Google's index. (You can also use the inurl: operator to check.)
Be aware that the value of any inbound links to that article will be allocated to the canonical version. (This doesn't apply to social follows/likes though.) So think carefully about the audience for the article before deciding which version is canonical.
It may not apply in your case, but it can be a good idea to think about your readers too. By adding a link in the article to the other site, you can help to cross-promote them. You may find that if some of your visitors find your cross-posted article relevant and useful, they may be more interested in other articles on the source site.