Geo-targeted Organic Search Traffic to a sub-domain
-
For a client of ours, we are likely to create a sub-domain that is to be targeted at a specific country.
Most of the content on this sub-domain will be from the main site, although with some specific differentiation to suit that geographic market.
We intend to tell Google through Webmaster Centre that the sub-domain is targeted at a specific country. Some questions:
a) Any idea how long it could take before Google gives precedence to the content on this sub-domain for queries originating from that particular country?
b) What is the likely impact of content duplication? What extent of differentiation is necessary from a search engine's perspective?
Thanks.
-
If it's not too competitive, then it shouldn't take more than 30-60 days for a geo-targeted domain to take effect.
There is no case study to look at because each situation is so different.
-
Thank you, Gianluca. Your detailed response is much appreciated.
Would you be able to give any indication of how long it could take for the sub-domain to receive all the search traffic for queries originating in that country?
Any case studies or references you could point me to would be great.
-
Thank you for your response; it's helpful.
By any chance, are you able to point me to a case study showing how long it took a geo-targeted sub-domain to start receiving all of its traffic directly from the search engines?
Our concern with using a new TLD is the time it would take the domain to acquire authority and attract traffic of its own from the targeted geography.
-
Hi Manoj, in your case I suggest you use the rel="alternate" hreflang="x" geotargeting annotation, in addition to targeting the sub-domain to the desired country (with the main site set as "global").
The use of rel="alternate" hreflang="x" is strongly suggested when a website has an "incomplete" international version, for very different reasons:
- Template translated, but main content in a single language;
- Broadly similar content within a single language, but targeting different countries (e.g. US, UK, Australia…)
But remember that Google suggests using it also when the site content is fully translated (e.g. the entire Spanish version has content in Spanish, and so on).
This annotation therefore seems very appropriate for the Sitecore site.
How to implement it
Two options:
- HTML link element, placed in the <head> section of every page.
In this case, for instance, in the <head> section of www.domain.com we should add as many rel="alternate" hreflang="x" link elements as there are country versions of the site.
E.g.: <link rel="alternate" hreflang="es" href="http://es.domain.com/" />
Please note that if multiple language versions exist (a "set" in Google's terminology), every version in the set must include rel="alternate" hreflang="x" links to every other language version.
E.g.: if we have Global, UK and FR versions of the site in addition to the Spanish one, the Spanish version will have to include links to all of them.
Obviously, every single URL must have a rel="alternate" hreflang="x" tag pointing to the corresponding URL of every other language version.
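Putting the pieces above together, a minimal sketch of what the <head> of the Spanish version's home page could look like (the hostnames are the hypothetical ones from the examples above; the exact hreflang codes would depend on the languages actually used):

```html
<head>
  <!-- every version in the set, including the page's own version, is listed -->
  <link rel="alternate" hreflang="x-default" href="http://www.domain.com/" />
  <link rel="alternate" hreflang="en-gb" href="http://uk.domain.com/" />
  <link rel="alternate" hreflang="fr" href="http://fr.domain.com/" />
  <link rel="alternate" hreflang="es" href="http://es.domain.com/" />
</head>
```

The x-default value is what Google uses for the "global" version, to be shown when no other language/country version matches the searcher.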
- HTTP header, in the case of non-HTML files (such as PDFs).
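For a PDF, a hedged sketch of what that HTTP header could look like, again using the hypothetical hostnames from above:

```http
Link: <http://www.domain.com/doc.pdf>; rel="alternate"; hreflang="en",
      <http://es.domain.com/doc.pdf>; rel="alternate"; hreflang="es"
```

Each URL in the set returns the same Link header, listing every language version of the file.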
As implied above, this annotation is applied at the page level, not the domain level. That means every single page must be correctly marked up.
Same content and same language on different pages and language versions
If, as happens in this case, some pages show almost the same content on both the domain and the sub-domain, then it is highly advisable to also use rel="canonical" in order to tell Google which version of the URL is preferred.
As Google itself says, it will "use that signal to focus on that version in search, while showing the local URLs to users where appropriate. For example, you could use this if you have the same product page in German, but want to target it separately to users searching on the Google properties for Germany, Austria, and Switzerland."
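For example, if a page on the sub-domain duplicates one on the main site, the sub-domain copy could declare the main-site URL as canonical (hypothetical URLs; a sketch, not a prescription for every page):

```html
<head>
  <!-- points Google at the preferred version of this duplicated page -->
  <link rel="canonical" href="http://www.domain.com/page.html" />
</head>
```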
Don't forget
Don't forget that your main site is set as targeting the whole web, including the country targeted by your sub-domain.
That means you should run an active link building campaign for the sub-domain, in order to give it equal, if not greater, strength compared with the main site.
-
As soon as they index it, it will take precedence in that country for geotargeting. You can increase differentiation and reduce duplicate content by using top-level domains and by adding geotargeted keywords to your sub-domain content. See the specific examples below:
Use top-level domains: To help us serve the most appropriate version of a document, use top-level domains whenever possible to handle country-specific content. We're more likely to know that http://www.example.de contains Germany-focused content, for instance, than http://www.example.com/de or http://de.example.com.
Minimize similar content: If you have many pages that are similar, consider expanding each page or consolidating the pages into one. For instance, if you have a travel site with separate pages for two cities, but the same information on both pages, you could either merge the pages into one page about both cities or you could expand each page to contain unique content about each city.
The quote above comes from Google's documentation on duplicate content across different countries.
Hope this helps.