Duplicate content pages on different domains, best practice?
-
Hi,
We run directory sites on several country-specific domains (each domain name includes the country name), and the same static pages appear on every site. There are actually several such pages, but for simplicity I'll use one static page as an example.
So we have http://firstcountry.com/faq.html, http://secondcountry.com/faq.html, and so on across 6-7 sites; the faq.html pages show 94% similarity when checked against each other for duplicate content. We'd like an alternative to the canonical tag, because the content doesn't belong to just one of these sites; it belongs to all of them. A second option would be to deindex every country but one. It's syndicated content, but we can't link back to the source because there is none.
Thanks for taking the time to read this.
-
Using canonical is NOT the solution, because if you use it, the FAQ pages on the canonicalized websites are going to be deindexed.
So only do that if you really don't care about the traffic those FAQ pages can generate for your sites (as you can imagine, this is an ironic suggestion...).
Just use hreflang. In recent months Google has become quite smart at understanding that hreflang means you consider those pages relevant enough for their geo-targeted audiences, and it will not filter them out even when they are substantially identical between country versions.
That said, try to differentiate the FAQ pages: localize the language better (e.g. UK English is slightly different from American English), or offer a local phone number and a localized email address for inquiries.
In general, using the cross-domain canonical is not a good idea in international SEO; it should be reserved for exceptional cases.
-
To make things easier, you can implement hreflang via your sitemap.xml using this tool by Mediaflow: http://www.themediaflow.com/tool_hreflang.php
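For reference, a minimal hreflang sitemap entry might look like the sketch below. The domains are the example ones from the question, and the en-au/en-nz codes are my own assumptions; each country's sitemap lists its own URL in <loc> plus the full set of alternates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://firstcountry.com/faq.html</loc>
    <!-- Full reciprocal set: include every country version, including this one -->
    <xhtml:link rel="alternate" hreflang="en-au" href="http://firstcountry.com/faq.html"/>
    <xhtml:link rel="alternate" hreflang="en-nz" href="http://secondcountry.com/faq.html"/>
  </url>
  <!-- Repeat a <url> block like this for every page, on every country domain's sitemap -->
</urlset>
```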
-
If your site is based on templates, so you can easily add code to a shared header (WordPress, Joomla, most CMSs; anything but page-by-page static HTML, usually), you can insert the tags directly by having each page reference itself, like this:
" hreflang="x-default" />
" hreflang="en-au" />
" hreflang="en-us" />
" hreflang="en-nz" />This works on Apache servers - this starts with the domain and then request_URI pulls in the page you're on so /about, or /faq and adds the appropriate hreflang tag to that.
Also, when you're done implementing hreflang, test it using Flang.
-
As the other users have pointed out, the rel="alternate" hreflang tag would be the ideal approach. I'm in a pickle myself with a very similar issue.
Note that the alternate tag is applied at the page level, so every page should point to the corresponding URL of its copy on all the other country domains.
So your homepage (.com) could have the following alternate tags:
<link rel="alternate" hreflang="en-us" href="http://example.com/" />
<link rel="alternate" hreflang="en-au" href="http://example.com.au/" />
But on your FAQ page, the alternates would be:
<link rel="alternate" hreflang="en-us" href="http://example.com/faq.html" />
<link rel="alternate" hreflang="en-au" href="http://example.com.au/faq.html" />
You'll have to rinse and repeat on all 3 sites and for every single page.
Tedious if you ask me! Does anyone know an easier way to add alternate tags across 3 or 4 sites without doing it manually?
The advantage of implementing them, however, is that you are not canonicalising to one domain, which means all your domains stand a chance of performing well in their regions (e.g. a search on Google Australia will show the .com.au website).
Again, has anyone got, seen, or heard of a better approach? Apart from canonical, of course.
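One way to avoid doing it by hand: for a given path, the tag set is identical on every domain, so a small script can generate it once and you paste the output into each site's template. This is a hypothetical sketch (the `hreflang_tags` helper and the domain mapping are my own invention, not from any tool mentioned here):

```python
# Generate the full reciprocal hreflang tag set for one path across
# several country domains. The same output goes on every site.
def hreflang_tags(domains, path, x_default=None):
    """domains: dict mapping hreflang code -> domain,
    e.g. {"en-au": "firstcountry.com"}. Returns a list of <link> tags."""
    tags = []
    if x_default:
        tags.append('<link rel="alternate" href="http://%s%s" hreflang="x-default" />'
                    % (x_default, path))
    for code, domain in sorted(domains.items()):
        tags.append('<link rel="alternate" href="http://%s%s" hreflang="%s" />'
                    % (domain, path, code))
    return tags

if __name__ == "__main__":
    sites = {"en-au": "firstcountry.com", "en-nz": "secondcountry.com"}
    for tag in hreflang_tags(sites, "/faq.html", x_default="firstcountry.com"):
        print(tag)
```

You would still need to run it per page (or hook it into your CMS template so `path` comes from the current request), but it removes the copy-paste errors.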
-
Hreflang tags are great, and I would highly suggest implementing them. Something that confused me when I first started using them is that the full set of tags should appear on every domain, including a page's own self-referencing tag.
For example, firstcountry.com/faq.html should have tags for:
firstcountry.com/faq.html (itself, the self-referencing tag)
secondcountry.com/faq.html
and so on, one for each country domain.
You can check that these have been implemented correctly in Google Webmaster Tools under "Search Traffic" -> "International Targeting".
-
I would start by implementing hreflang tags:
https://support.google.com/webmasters/answer/189077?hl=en
Hreflang should take care of this type of issue, as Google will associate the right country domain with the content. You may see some overlap for a while; we've seen hreflang take a bit longer than we'd like to fully take effect, but once it does, it usually works well.
Short of that, you have three options: 1) change the content on all sites to be (somewhat) unique; 2) deindex all but one, as you said; 3) canonical, as you said.
All three have problems, which is why I would start with hreflang.