Duplicate Content Penalties, International Sites
-
We're in the process of rolling out a new domestic (US) website design. If we copy the same theme/content to our international subsidiaries, would the duplicate content penalty still apply? All international sites would carry the country-specific domain: .co.uk, .eu, etc. This question is for English-only content; I'm assuming translated content would not carry a penalty.
-
The consensus is that even though the content is the same, it will rank locally using country-specific domains. Can anyone provide examples where this is currently working?
-
I use Rackspace Cloud Sites. Is there a way I can request to have a domain pushed to a pool in the UK or Canada, for example?
-
This video from Matt Cutts will help too: http://www.youtube.com/watch?v=Ets7nHOV1Yo
-
I asked this exact question of Greg Grothaus from Google at a conference back in 2009, and his answer was that duplicated content across different TLDs shouldn't be something to be too concerned about. Realistically, search engines will decide which version of the site is more relevant for a particular geographic audience.
-
When it comes to English, my advice is that there are ways to make the content different. Just think of how differently Brits and Americans spell many words. Then apply all the classic international SEO tactics (links from the country your site has to rank in, local IP, local address...).
Apart from that, if your international sites sit on their corresponding TLDs (.co.uk, .au, .in...) and you specify that the .com is for the USA, Google is actually quite good at noticing which site should rank for each country.
-
Yes. Translated content will not be treated as a penalty. As long as you launch the site on a domain with the proper local TLD and add locally targeted content, you should be OK. Additionally, you may want to consider hosting the website with a local hosting provider.
This should also apply to English-language content modified for a UK audience, since UK English is technically considered different from US English. We have multiple English-language international websites hosted on local TLDs that rank locally for their respective keywords.
Google has become much smarter at detecting geo-local signals, and it should serve the appropriate site on the SERP without causing duplicate content issues.
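One way to make this explicit, rather than leaving it to Google's geo-detection, is hreflang annotation. A minimal sketch, assuming a US .com and a UK .co.uk carrying the same English page (hypothetical URLs):

```html
<!-- In the <head> of BOTH versions of the page (hypothetical URLs): -->
<link rel="alternate" hreflang="en-us" href="http://www.example.com/page.html" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/page.html" />
```

Each page lists every language/region variant, including itself, so the engine can swap in the right URL per country instead of filtering one version out as a duplicate.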
-
I think this sort of duplicate content is something that Google sees often. If you are copying everything exactly between domains, I'd question whether you need multiple sites. Presuming your content has country-specific differences, you'll be OK.
Don't forget to register your target market for each URL in Google Webmaster Tools. Maybe build some new links in each locale at the time of launch (press mentions, Twitter shout-outs, etc.).
Also, you may want to consider the approach taken by Microsoft: one domain with country-specific folders, e.g. microsoft.com/en-gb/.
Related Questions
-
Do quizzes hurt your site? Thin content?
We did a 10-question quiz a while back relating to something we were sponsoring, and it had a decent response. However, considering quizzes just aren't that long, does that contribute to making the site's content thin? Obviously, it's not a major problem at the moment, but if we did more of them, would this be an issue? If there's no real issue, I'd prefer not to no-index them, but I'd love some feedback to help make the decision. Thanks, Ruben
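For reference, if the quizzes ever did start to look thin, no-indexing is a one-line change per page rather than anything structural. A sketch:

```html
<!-- Hypothetical: placed in a quiz page's <head>, this keeps the page
     out of the index while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow" />
```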
Technical SEO | KempRugeLawGroup
-
A problem with duplicate content
I'm kind of new at this. My crawl analysis says that I have a problem with duplicate content. I set the site up so that web sections appear in a folder with an index page as a landing page for that section. The URL would look like: www.myweb.com/section/index.php. The crawl analysis says that both that URL and its root, www.myweb.com/section/, have been indexed. So I appear to have a situation where the page has been indexed twice and is a duplicate of itself. What can I do to remedy this? And what steps should I take to get the pages re-indexed so that this type of duplication is avoided? I hope this makes sense! Any help gratefully received. Iain
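One common fix, sketched here on the assumption that you control the page template: declare one of the two URLs canonical from within index.php's output, using the URLs above.

```html
<!-- Emitted in the <head> by /section/index.php: tells search engines
     that /section/ is the one true URL for this page -->
<link rel="canonical" href="http://www.myweb.com/section/" />
```

Both URLs keep working for visitors; only the indexing signals consolidate onto the folder URL.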
Technical SEO | iain
-
WordPress Tags Duplicate Content
I just fixed a tags duplicate content issue. I have noindexed the tags. Was wondering if anyone has ever fixed this issue and how long it took to recover from it? Just kind of want to know for peace of mind.
Technical SEO | deaddogdesign
-
How to resolve this Duplicate content?
Hi, there is a page I reach through normal menu navigation, Caratlane.com > Jewellery > Rings > Casual Rings: http://www.caratlane.com/jewellery/rings/casual-rings/leaves-dew-diamond-0-03-ct-peridot-1-ct-ring-18k-yellow-gold.html. When I do a site search in my search box by my product code number "JR00219", the same page appears with a different URL: http://www.caratlane.com/leaves-dew-diamond-0-03-ct-peridot-1-ct-ring-18k-yellow-gold.html. So there is duplicate content. How can we resolve it? Regards, Kathir, caratlane.com
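One option, assuming the site runs on Apache: 301 the flat search-result URL to the full category path in .htaccess, so only one version stays indexed. A sketch using the two URLs above:

```apache
# Permanently redirect the flat product URL to its canonical category path
RedirectPermanent /leaves-dew-diamond-0-03-ct-peridot-1-ct-ring-18k-yellow-gold.html http://www.caratlane.com/jewellery/rings/casual-rings/leaves-dew-diamond-0-03-ct-peridot-1-ct-ring-18k-yellow-gold.html
```

If redirects aren't practical for search-generated URLs, a rel="canonical" tag on the flat URL pointing at the category URL achieves much the same consolidation.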
Technical SEO | kathiravan
-
Tags causing Duplicate page content?
I was looking through the 'Duplicate Page Content' and 'Too Many On-Page Links' errors, and they all seem to be linked to the 'Tags' on my blog pages. Is this really a problem, and if so, how should I be using tags properly to get the best SEO rewards?
Technical SEO | zapprabbit
-
Query string in URL - duplicate content?
Hi everyone, I would appreciate some advice on the following. I have a page which has some nice content on it, but it also has search functionality. When a search is run, a query string is appended, so I will get something like mypage.php?id=20, etc. With many different URL possibilities, will each query string be seen as a different page? If so, I don't want duplicate content. Am I best putting canonical tags in the head tags on mypage.php to avoid Google seeing potential duplicate content? Many thanks for all your advice.
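Yes, a canonical tag on mypage.php is the usual fix here. The tag's href should be the query-string-free URL, so every ?id= variant points at the same page. A small Python sketch (hypothetical example.com domain) of the normalization the page would apply before emitting the tag:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url):
    """Strip the query string so every search variant of the page
    (mypage.php?id=20, mypage.php?id=21, ...) maps to one canonical URL."""
    scheme, netloc, path, _query, _fragment = urlsplit(url)
    return urlunsplit((scheme, netloc, path, "", ""))

def canonical_link_tag(url):
    """Build the <link> tag to emit in the page's <head>."""
    return '<link rel="canonical" href="%s" />' % canonical_url(url)

print(canonical_link_tag("http://www.example.com/mypage.php?id=20"))
# → <link rel="canonical" href="http://www.example.com/mypage.php" />
```

Whatever id the visitor searched for, Google sees the same canonical target and the variants stop counting as separate duplicate pages.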
Technical SEO | pauledwards
-
Forget Duplicate Content, What to do With Very Similar Content?
All, I operate a Wordpress blog site that focuses on one specific area of the law. Our contributors are attorneys from across the country who write about our niche topic. I've done away with syndicated posts, but we still have numerous articles addressing many of the same issues/topics. In some cases 15 posts might address the same issue. The content isn't duplicate but it is very similar, outlining the same rules of law etc. I've had an SEO I trust tell me I should 301 some of the similar posts to one authoritative post on the subject. Is this a good idea? Would I be better served implementing canonical tags pointing to the "best of breed" on each subject? Or would I be better off being grateful that I receive original content on my niche topic and not doing anything? Would really appreciate some feedback. John
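For what it's worth, if you choose the canonical-tag route, the mechanics are light: each similar post carries one tag in its head pointing at the best-of-breed article. A sketch with hypothetical URLs:

```html
<!-- In the <head> of each of the similar posts (hypothetical URLs) -->
<link rel="canonical" href="http://example.com/authoritative-post-on-topic/" />
```

Unlike a 301, readers can still reach the individual posts; only the ranking signals consolidate onto the authoritative one.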
Technical SEO | JSOC
-
Duplicate content
This is just a quickie: on one of my campaigns in SEOmoz I have 151 duplicate page content issues! Ouch! On analysis, the site in question has duplicated every URL with "en", e.g. http://www.domainname.com/en/Fashion/Mulberry/SpringSummer-2010/ http://www.domainname.com/Fashion/Mulberry/SpringSummer-2010/ Personally, my thoughts are that a rel=canonical will sort this issue, but before I ask our dev team to add this (and get various excuses why they can't) I wanted to double-check I am correct in my thinking? Thanks in advance for your time
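rel=canonical on the /en/ pages should indeed sort it. If the dev team push back on template changes, an alternative sketch (assuming Apache, and that the /en/ tree is an exact mirror) is a one-line 301 in .htaccess:

```apache
# Permanently redirect every /en/... URL to its bare equivalent
RewriteEngine On
RewriteRule ^en/(.*)$ /$1 [R=301,L]
```

Either approach collapses the 151 duplicates down to one indexed URL per page; the redirect has the bonus of also fixing any external links pointing at the /en/ versions.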
Technical SEO | Yozzer