Worldwide Stores - Duplicate Content Question
-
Hello,
We recently added new store views for our primary domain for different countries.
Our primary URL: www.store.com
Country-specific URLs:
And so forth and so on. This resulted in an almost immediate rankings drop for several keywords, which we believe is the result of duplicate content. Our primary site has thousands of pages.
We've added "nofollow" tags to all store views for now and are trying to roll back the changes we made.
However, we've seen some stores launch in different countries with the same content, but under country-specific extensions like .co.uk, .co.nz, and .com.au.
At this point, it appears we have three choices:
1. Remove or change the duplicate content in the country-specific URLs/store views.
2. Launch on .co.uk, .com.au, etc. with duplicate content for now.
3. Launch on .co.uk, .com.au, etc. with fresh content for all stores.
Please keep in mind that options 1 and 3 can get very expensive when maintaining hundreds of products in untested territories. Ideally, we would like to test first and then scale.
However, we'd like to avoid any duplicate content penalties on our main domain. Thanks for your help.
-
Hi!
From what I understand, the duplicate content problem is probably affecting your site because the main shop is in English (www.store.com) and the /au/, /uk/ and /nz/ store views are also in English, with exactly the same content as the main shop.
When you are targeting different countries that speak the same language, as in this case, it is strongly recommended to follow these practices:
- try to localize each country's shop as much as you can, using local currencies, addresses, and timezones;
- try to use the local variation of the language; for instance, the English spoken in the UK is quite different from the English spoken in the USA. I know this is not really an option right now because of the cost and time needed, but I urge you to plan for it in the long term;
- use the rel="alternate" hreflang annotation. This tag was created especially for these situations, to tell Google that a given page - e.g. www.store.com/uk/ - should be shown to users from the UK. More detailed and easy-to-follow instructions on how to implement it can be found here
Remember that you can also apply this annotation at the page level, so you can keep the subfolder organization of your store.
This is especially useful because you only have one site to think about when it comes to link building, and the link building campaigns you run for each targeted country will benefit the site as a whole.
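To make the hreflang advice above concrete, here is a minimal sketch of how the annotations could be generated for the subfolder setup described in the question. The locale codes, subfolder paths, and example product path are assumptions for illustration, not taken from the actual store:

```python
# Sketch: generating rel="alternate" hreflang <link> elements for
# country-specific store views kept in subfolders of one domain.
# BASE, the /uk/ /au/ /nz/ paths, and the locale codes are assumed
# values for illustration only.

BASE = "https://www.store.com"

# Map each hreflang value to the store view that should serve it.
LOCALES = {
    "en-us": "/",      # main shop
    "en-gb": "/uk/",
    "en-au": "/au/",
    "en-nz": "/nz/",
}

def hreflang_tags(page_path: str) -> list[str]:
    """Return the <link> elements to place in the <head> of every
    locale variant of page_path. Note that each variant must list
    ALL alternates, including itself."""
    tags = []
    for lang, prefix in LOCALES.items():
        url = BASE + prefix.rstrip("/") + page_path
        tags.append(f'<link rel="alternate" hreflang="{lang}" href="{url}" />')
    # x-default tells Google which version to show to users whose
    # locale matches none of the listed alternates.
    tags.append(f'<link rel="alternate" hreflang="x-default" href="{BASE + page_path}" />')
    return tags

for tag in hreflang_tags("/products/example-product"):
    print(tag)
```

The same set of tags (or the equivalent entries in an XML sitemap) would go on every locale's copy of the page, which is what lets Google treat the copies as alternates rather than duplicates.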