International Sites and Duplicate Content
-
Hello,
I am working on a project where I have some doubts regarding the structure of international, multi-language sites. The website is in the fashion industry, so I think this is a common problem for the sector. The site is translated into 5 languages and sells in 21 countries.
As you can imagine, this creates a huge number of URLs, so many that I can't even complete a crawl with Screaming Frog.
For example, the UK site is available in all of these versions:
http://www.MyDomain.com/en/GB/
http://www.MyDomain.com/it/GB/
http://www.MyDomain.com/fr/GB/
http://www.MyDomain.com/de/GB/
http://www.MyDomain.com/es/GB/
Obviously, for SEO only the first version matters.
Another example: the French site is also available in 5 languages:
http://www.MyDomain.com/fr/FR/
http://www.MyDomain.com/en/FR/
http://www.MyDomain.com/it/FR/
http://www.MyDomain.com/de/FR/
http://www.MyDomain.com/es/FR/
And so on. This is creating 3 main issues:
- Endless crawling, with crawlers not focusing on the most important pages
- Duplication of content
- Wrong geo-targeted URLs ranking in Google
I have already implemented hreflang but haven't noticed any improvements. Therefore my question is:
Should I use robots.txt and noindex to exclude the inappropriately targeted versions?
For example, for the UK leave only the English version crawlable, i.e. http://www.MyDomain.com/en/GB/, for France only the French version, http://www.MyDomain.com/fr/FR/, and so on.
What I would like to achieve by doing this is to get the crawlers focused on the SEO-important pages, avoid content duplication, and stop the wrong URLs from ranking in local Google results.
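For reference, a complete hreflang cluster for the UK pages would look something like this. This is a sketch using the example URLs above; every page in the cluster needs the same full set of alternates, including a self-reference, and the `x-default` choice here is an assumption:

```html
<!-- Placed in the <head> of every page in the UK cluster.
     Each page lists itself and all of its siblings. -->
<link rel="alternate" hreflang="en-GB" href="http://www.MyDomain.com/en/GB/" />
<link rel="alternate" hreflang="it-GB" href="http://www.MyDomain.com/it/GB/" />
<link rel="alternate" hreflang="fr-GB" href="http://www.MyDomain.com/fr/GB/" />
<link rel="alternate" hreflang="de-GB" href="http://www.MyDomain.com/de/GB/" />
<link rel="alternate" hreflang="es-GB" href="http://www.MyDomain.com/es/GB/" />
<!-- Fallback for users who match none of the above (hypothetical choice). -->
<link rel="alternate" hreflang="x-default" href="http://www.MyDomain.com/en/GB/" />
```

A common reason hreflang shows no effect is that the annotations are not reciprocal: if /en/GB/ lists /fr/GB/ but /fr/GB/ does not list /en/GB/ back, Google ignores the pair.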
Please comment
-
Hey Guido, I don't know if it's the best solution, but it could be a temporary fix until the best solution is in place. I suggest moving forward with proper hreflang tagging, or else deleting those irrelevant languages entirely. Try what I said before: validate each country/language folder in Google Search Console and submit a sitemap.xml reflecting that folder, so you can see crawl and index stats per country/language. Add a sitemap index and, obviously, validate your entire domain. Also block unnecessary folders in robots.txt (images, JS libraries, etc.) to save crawl budget for your domain.
Let me know if you have any other questions.
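To illustrate the sitemap-index suggestion: a single index file at the domain root can reference one sitemap per language-country folder, which is what lets GSC report indexing stats per folder. A minimal sketch, with hypothetical file names based on the example domain:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.MyDomain.com/sitemap-en-GB.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.MyDomain.com/sitemap-fr-FR.xml</loc>
  </sitemap>
  <!-- ...one entry per language-country combination you want tracked... -->
</sitemapindex>
```

Each child sitemap then lists only the HTML URLs under its own folder, so the Submitted vs. Indexed counts in Search Console map cleanly onto one language-country version.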
-
Thank you Antonio, insightful and clear.
There is really no need for EN versions of the localized sites; I think it was done that way because it was easier to implement (the original site is EN-US).
Don't you think robots.txt and noindex on the EN versions of the localized sites could be the best solution? It's certainly the easiest one to implement without affecting UX.
-
I don't know why you have a UK-oriented site in German and Italian; those languages don't seem important in a mainly English-speaking country (unlike the US, where you should have a Spanish version, or Canada, with English and French). But the owner must have their reasons.
Besides this, about your questions:
- If those non-relevant languages must live there, it's correct to implement hreflang (it may take some time to show results). Also, if the domain is a gTLD, you can validate all the subfolders in Google Search Console and set the proper international targeting for each. With that number of languages and countries, I imagine this might be a pain in the ***.
- About the crawling: for large sites I recommend crawling per language, or if necessary per language-country. In this instance I recommend creating an XML sitemap per language (or language-country) containing just the HTML pages (ideally updated dynamically by the e-commerce platform), creating a sitemap index in the root of the domain, and submitting everything in Google Search Console (better if you have validated the language or language-country folders). With this you can tell whether a language or country is not being indexed from the Submitted/Indexed statistics in GSC.
- robots.txt might save your crawl budget, but I'm not a fan of de-indexing folders unless they are truly irrelevant (after all, there could be an Italian living in the UK). If you can't delete the irrelevant languages for some countries, this can be an option.
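If you do go the blocking route, a robots.txt along these lines would keep crawlers out of the irrelevant combinations (a sketch covering only the UK and France folders from the examples above). One caveat: robots.txt stops crawling, not indexing, and a page blocked in robots.txt can never be recrawled to see a noindex tag, so pick one mechanism or apply noindex first and block only after the pages drop out of the index.

```text
# Keep only the locally relevant language per country crawlable.
# Hypothetical example for the /GB/ and /FR/ folders.
User-agent: *
Disallow: /it/GB/
Disallow: /fr/GB/
Disallow: /de/GB/
Disallow: /es/GB/
Disallow: /en/FR/
Disallow: /it/FR/
Disallow: /de/FR/
Disallow: /es/FR/
```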
Related Questions
-
Switch to Separate URL Configurations for International SEO?
We run an ecommerce website and sell to customers in the US and Canada. We recently realized that the way we serve content to our users isn't Google's recommended way. We use locale-adaptive pages, in that some content changes slightly depending on where we think the user is located based on their IP address, but the URL doesn't change. Google's stance on locale-adaptive pages: https://support.google.com/webmasters/answer/6144055?hl=en&ref_topic=2370587 That being said, the changes are quite minor: it is usually only pricing and currency that change depending on whether we determine the user to be based in Canada or the US. However, I understand that there can be problems with this configuration because Googlebot primarily crawls from US-based IPs. We decided that the best course of action for us is to serve US content as our default content on all of our existing URLs, with Canadian content served on new URL paths such as example.com/en-ca/product1. All of this would also be configured with hreflang tags. The problem we have run into is that reconfiguring how the site serves content is a pretty huge development challenge after using locale-adaptive pages for over a year, so developer resources would be taken away from other tasks for a relatively long time. Based on this information, and the fact that we would like both to rank better in Canada and to follow Google's recommendations, how important would you say this change is? I realize this isn't a black-and-white question and it depends a lot on business goals and preferences. However, I need to gauge as best as I can how necessary it is in order to decide whether to do it. Any input is greatly appreciated!
International SEO | | westcoastcraig1 -
International Confusion between .com and .com/us
A question regarding international SEO: we are seeing cases for many sites that meet these criteria: international sites where www.example.com/ redirects to a country site based on the visitor's IP (e.g. www.example.com/ 301s to www.example.com/us), and there is a desktop + mobile site (www.example.com + m.example.com). The issue we see is that Google shows www.example.com/ in US search results instead of www.example.com/us. Since the .com/ redirects and has no mobile version, www.example.com/ also shows up in mobile SERPs instead of m.example.com/us. My questions are: 1. If www.example.com/ is redirecting users and Googlebot, why is Googlebot caching it with the content of www.example.com/us? 2. Why is www.example.com/ showing up in SERPs instead of www.example.com/us? 3. How can we help Google display www.example.com/us and m.example.com/us in SERPs instead of www.example.com/? Thanks!!
International SEO | | FranFerrara0 -
Webmaster tools International Targeting
Hello there, If we have country-specific websites such as: usa.domain.com (for the US), uk.domain.com (for the UK), fr.domain.com (for France), should each of these subdomains be set up separately in Webmaster Tools, with "target users in" set to the corresponding country? Thanks
International SEO | | roberthseo0 -
3 month old site lost almost complete traffic overnight
Hi All, I started an Indian coupon and deal site, http://www.couponspy.in/, around 3 months ago, and traffic increased almost daily. But yesterday my site lost almost all of its traffic. Keywords which ranked 1-5 lost around 4-15 places, and keywords which ranked 6-20 lost ca. 20-50 places. The Moz Crawl Diagnostics don't indicate any major issues. Has there been a Google Panda update in India? Why might my site have been affected? Please help!!!! 😉 I have seen the same traffic decrease on other coupon start-ups, e.g. https://www.cuponation.in/. Did we all make the same mistake? Any guesses?
International SEO | | ParvatiSingh0 -
How to make Google consider my international subdomain relevant?
We have recently started to look deeper into international SEO. We have search-engine-optimized our international landing pages, title tags and meta descriptions with keywords etc., so each of the languages we support is SEO'ed for the local market. We support 12 languages, and each of them is located on a subdomain. That means if our site is helloworld.com, a person from Germany who lands on it can switch to German and will be redirected to de.helloworld.com, where all content is in German. Our problem is that we develop cloud-based software and have a significant amount of traffic, but whenever we get media coverage or people link to us from anywhere in the world, they always link to the root domain, helloworld.com. That means if I go to google.de and type in the exact meta description or title tag we use in German, Google can't even find us, because (I assume) Google doesn't consider de.helloworld.com relevant, since nobody has ever linked to it. I would appreciate it very much if anyone can give me some advice on how to address this issue. Thanks a lot! Allan
International SEO | | Todoist0 -
Best domain for spanish language site targeting ALL spanish territories?
hi, we have a strong .com domain and are looking to launch a site for Spanish speakers (i.e. Latin America + Spain). We already have various subdirectories and domains for some foreign-language sites (e.g. ourdomain.co.uk, us.ourdomain.com, ca.ourdomain.com, ourdomainchina.com, ourdomainindia.com etc.). We already have a B2B site, ourdomain.com-es, which will remain the same. I'm thinking best practice would be to launch translated copy for the following: ourdomain.com/es, ourdomain.com/cl, ourdomain.com/mx, ourdomain.com/pt, etc. Firstly, is this the best option? Secondly, I'm really interested to hear whether there is a less time/resource-intensive route that would give us visibility in ALL Spanish-speaking territories. Also, if we go with just one of the above (e.g. ourdomain.com/cl), how likely are we to get traction in the other Spanish-speaking territories? Any help much appreciated!
International SEO | | KevinDunne0 -
Ranking local content against English content
Hi SEOMoz, If I have a site with multiple language versions, how can I make the localized content rank higher than the default English content? For example, on Google.co.th my site's English content (http://www.xyz.com/en-us/) ranks higher than the localized content (http://www.xyz.com/th-th/). How can I turn things around and ensure that for local markets like Indonesia, Thailand, Vietnam, China etc. the localized content organically ranks higher than the default English page? Thanks! Steve
International SEO | | sjcbayona-412180 -
What's the best strategy for checking international rankings?
Hi There- I am looking to optimize sites serving the UK and Australia markets. I feel like I have a good handle on how to go about doing that, but what I am fuzzy on is the best way to monitor the SERPs for the keywords I am targeting. I know from experience that if I just search google.com.au from here in the States, my results will be 'Americanized' and probably won't accurately reflect what someone searching from Australia would see. Are there any good tools or tactics for seeing what searchers in the countries I am focusing on would see? Thanks! Jason
International SEO | | phantom0