Geo-targeted Organic Search Traffic to a sub-domain
-
For a client of ours, we are likely to create a sub-domain targeted at a specific country.
Most of the content on this sub-domain will come from the main site, although with some differentiation to suit that geographic market.
We intend to tell Google through Webmaster Tools that the sub-domain is targeted at a specific country. Some questions:
a) Any idea how long it could take before Google gives precedence to the content on this sub-domain for queries originating from that particular country?
b) What is the likely impact of content duplication? What extent of differentiation is necessary from a search engine perspective?
Thanks.
-
If it's not too competitive, then it shouldn't take more than 30-60 days for a geo-targeted domain.
There is no case study to look at because each situation is so different.
-
Thank you, Gianluca. Your detailed response is much appreciated.
Would you be able to give any indication of how long it could take for the sub-domain to receive all the search traffic for queries originating in that country?
Any case studies or references you could point me to would be great.
-
Thank you for your response; it's helpful.
By any chance, are you able to point me to any case study that shows how long it took a geo-targeted sub-domain to start getting all its traffic directly from the search engines?
Our concern with using a new TLD is the time it will take the domain to acquire authority and attract traffic of its own from the targeted geography.
-
Hi Manoj, in your case I suggest using the rel="alternate" hreflang="x" geotargeting annotation, in addition to targeting the subdomain at the desired country (with the main site set as "global").
The use of rel="alternate" hreflang="x" is strongly recommended when a website has an "incomplete" international version, for a variety of reasons:
- The template is translated, but the main content is in a single language;
- Broadly similar content within a single language, but targeting different countries (e.g. US, UK, Australia).
But remember that Google suggests using it also when the site content is fully translated (i.e. the entire Spanish version has content in Spanish, and so on).
This annotation, then, seems very appropriate for the Sitecore site.
How to implement it
Two options:
- HTML link element, placed in the <head> section of each page.
In this case, for instance, in the <head> section of www.domain.com we should add as many rel="alternate" hreflang="x" link elements as there are country versions of the site.
E.g.: <link rel="alternate" hreflang="es" href="http://es.domain.com" />
Please note that if multiple language versions exist (a "set" in Google's jargon), every version must include rel="alternate" hreflang="x" elements pointing to every other language version.
For example, if we have Global, UK, and FR versions of the site apart from the Spanish one, the Spanish version will have to include something like the markup below.
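A sketch of those link elements, assuming the global, UK, and French versions live at www.domain.com, uk.domain.com, and fr.domain.com (hypothetical subdomains), with hreflang="en" assumed for the global English version:
<link rel="alternate" hreflang="en" href="http://www.domain.com/" />
<link rel="alternate" hreflang="en-gb" href="http://uk.domain.com/" />
<link rel="alternate" hreflang="fr" href="http://fr.domain.com/" />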
Obviously, every single URL must have rel="alternate" hreflang="x" tags pointing to the corresponding URLs of all the other language versions.
- HTTP header, in the case of non-HTML files (such as PDFs).
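For example, a geotargeted PDF could be served with an HTTP Link header such as the following, using Google's documented Link header syntax (the URL is hypothetical):
Link: <http://es.domain.com/whitepaper.pdf>; rel="alternate"; hreflang="es"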
As implied above, this annotation works at the page level, not the domain level. That means every single page must be correctly marked up.
Same content and same language on different pages and language versions
If, as happens in your case, some pages show almost the same content on both the domain and the subdomain, then it is highly advisable to also use rel="canonical" in order to tell Google which version of the URL is preferred.
As Google itself says here, Google will “use that signal to focus on that version in search, while showing the local URLs to users where appropriate. For example, you could use this if you have the same product page in German, but want to target it separately to users searching on the Google properties for Germany, Austria, and Switzerland.”
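A minimal sketch of that combination, following the guidance quoted above and assuming hypothetical URLs where de.domain.com hosts the preferred German page also targeted at Austria and Switzerland:
<!-- in the <head> of each copy of the German product page -->
<link rel="canonical" href="http://de.domain.com/produkt" />
<link rel="alternate" hreflang="de-DE" href="http://de.domain.com/produkt" />
<link rel="alternate" hreflang="de-AT" href="http://at.domain.com/produkt" />
<link rel="alternate" hreflang="de-CH" href="http://ch.domain.com/produkt" />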
Don't forget
Don't forget that your main site is set to target the whole web, which includes the country targeted by your sub-domain.
That means you should run an active link building campaign for the sub-domain, in order to give it equal, if not greater, strength than the main site.
-
As soon as Google indexes it, it will take precedence in that country for geotargeted queries. You can increase differentiation and reduce the risk of duplicate content by using top-level domains and by adding geotargeted keywords to your subdomain content. See the specific examples below:
Use top-level domains: To help us serve the most appropriate version of a document, use top-level domains whenever possible to handle country-specific content. We're more likely to know that http://www.example.de contains Germany-focused content, for instance, than http://www.example.com/de or http://de.example.com.
Minimize similar content: If you have many pages that are similar, consider expanding each page or consolidating the pages into one. For instance, if you have a travel site with separate pages for two cities, but the same information on both pages, you could either merge the pages into one page about both cities or you could expand each page to contain unique content about each city.
The source for the above is Google's guidance on duplicate content relating to different countries.
Hope this helps.