International Website Targeting
-
Hello fellow Mozzers, had a quick question.
So we have a new eCommerce client that is interested in launching websites in multiple countries. According to their vision, they want a US site, a UK site, a Japan site, and so on.
I have a few concerns about doing it this way.
First, there is the issue of the sites being identical. The only difference will be the domain, such as domain.co.jp for the Japan-based site, domain.co.uk for the UK, etc.
Even if we target different countries in Webmaster Tools, won't the sites still compete with one another and potentially get flagged as duplicates?
I'm thinking there has to be a better way to target the site at the world without having to clone, duplicate, and relaunch it. Does anyone have experience with this?
-
Thank you for your response. It appears the best way to go about this is to make the main site amazing and optimize it around what they sell, correct? This is what I had in mind, but just wanted to check with the community to be sure.
-
Hi David.
You are right... creating three identical sites on three different geo-targeted domain names can be a problem.
However, implementing hreflang tags to tell Google which URL to show depending on the geography of the user (see more here: https://support.google.com/webmasters/answer/189077?hl=en) will avoid the risk of duplicate content, and of the stronger domain outranking the one meant, for instance, for Japan in Japan.
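For reference, the annotations on each of the three homepages could look something like this sketch (the UK and Japan domains mirror the ones mentioned above; the .com US domain is an assumption for illustration):

```html
<!-- Place in the <head> of EVERY version of the page (US, UK, and
     Japan alike), listing ALL alternates, including the page itself.
     Annotations must be reciprocal or Google ignores them. -->
<link rel="alternate" hreflang="en-us" href="https://www.domain.com/" />
<link rel="alternate" hreflang="en-gb" href="https://www.domain.co.uk/" />
<link rel="alternate" hreflang="ja-jp" href="https://www.domain.co.jp/" />
<!-- Optional fallback for users who match none of the above -->
<link rel="alternate" hreflang="x-default" href="https://www.domain.com/" />
```

Note that hreflang values pair a language with an optional region (en-us, en-gb, ja-jp), so the two English sites can be distinguished even though the content is nearly identical.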
That said, this "full duplication" strategy can only be a very temporary one, because it makes little sense to target users living in different countries with exactly the same content:
- American English is different from British English;
- In Japan, people simply don't search in English, and Google may not even be their main search engine (Yahoo! is dominant there);
- The culture, and hence how people use the internet and search the web, differs across the three countries (slightly between the USA and UK, enormously in the case of Japan).
So... yes, the proposed strategy is not the most effective one, despite the advantages of implementing the hreflang markup.
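One practical aside: if editing the templates of all three sites to add the link tags is impractical, the same annotations can live in each site's XML sitemap instead. A minimal sketch of one URL entry, using the same assumed domains as above:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.domain.com/</loc>
    <!-- Same reciprocity rule applies: each site's sitemap must list
         all alternates, including the URL itself. -->
    <xhtml:link rel="alternate" hreflang="en-us" href="https://www.domain.com/" />
    <xhtml:link rel="alternate" hreflang="en-gb" href="https://www.domain.co.uk/" />
    <xhtml:link rel="alternate" hreflang="ja-jp" href="https://www.domain.co.jp/" />
  </url>
</urlset>
```

Either method works; just pick one and apply it consistently, since mixing on-page and sitemap annotations across the group makes mistakes harder to spot.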