Using both .co.uk and .com
-
Hello, a client has launched a website on both the .com and the .co.uk.
The content is identical. I understand that you should add rel="alternate" hreflang="x" annotations to the code. However, will the identical content be a problem? It would be hard to localise the content to one country. I understand why the client has both domains, particularly the UK one, but the actual content is not specific to one country; it is really written for English-speaking customers in general.
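For reference, a minimal sketch of the rel="alternate" hreflang markup mentioned above, with hypothetical URLs standing in for the client's real domains:

```html
<!-- Placed in the <head> of BOTH homepages; each version lists every
     alternate, including itself. example.com/.co.uk are placeholders. -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/" />
<link rel="alternate" hreflang="en" href="https://www.example.com/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```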
Also what about links? In this case do you need to build two sets of links to make them both rank?
Thanks for any help.
-
They are two different sites and entities, so links to the .com will have no influence on the .co.uk domain.
-
Thanks again. In terms of work and time with links, it looks like two sites (.co.uk and .com) mean twice the amount of work? Or if links are built for the .co.uk site, will they have an influence on the .com site too? Is this the case, or are they treated as separate entities? Sorry for the questions.
-
Links can help with geotargeting. Getting UK links sends a strong localization signal. Just remember to keep the link building organic.
-
I would appreciate any help with the question about link building. If there are links pointing only to the .co.uk, is that a problem for the .com version? What has anyone else done in these circumstances?
Thanks very much.
-
Thanks Highland. The video makes sense.
What about links? Do we need to build links to both domains?
-
This is actually the preferred way to do it. Duplicate content is normally a problem, but between domain.com and domain.cctld (.co.uk in your case) it isn't: Google recognizes that ccTLDs exist to send localization signals, and it should rank your ccTLD better on google.co.uk than your .com.
Matt Cutts talked about this a while back.
You do not need a language declaration unless the languages actually differ (and I don't think UK English vs. a standard English site qualifies).
Related Questions
-
Using 302s to redirect pages returning in 6 months
We are doing a 2-phase site redesign (in order to meet a deadline). An entire section of the site will not be available in the first phase, but will come back in 6 months. The question is, do we use 301s or 302s for those pages that will be coming back in 6 months? Is there a time limit on what is considered "temporary"? thanks in advance!
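A 302 is the conventional choice for content that is genuinely coming back, since it keeps the original URLs in the index rather than transferring them to the redirect target. A sketch of the idea in Apache terms (Apache itself and the paths are assumptions, not from the question):

```apache
# Phase 1: the section is temporarily offline. A 302 (temporary) redirect
# keeps the original URLs indexed while they are unavailable.
# "/catalog/" and the target URL are hypothetical placeholders.
RedirectMatch 302 ^/catalog/ https://www.example.com/coming-soon/

# Phase 2, roughly 6 months later: delete the rule above so the original
# URLs resolve again. No 301 is involved at any point.
```

There is no documented hard time limit on "temporary", but the longer a 302 stays in place, the more likely Google is to start treating it as permanent.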
Technical SEO | Max_B0
-
Has anyone used a company to help promote their site
Hi, I receive around ten emails a day claiming they can get your site into the top ten in Google. I know most are a load of rubbish, but I am just wondering if anyone has used any of these companies for a new or an old site. I am about to launch a new site after Xmas and am wondering if any of these companies are worth looking at to help promote the new site, instead of doing all the groundwork myself. Would love to know your thoughts.
Technical SEO | ClaireH-1848860
-
Timely use of robots.txt and meta noindex
Hi, I have been checking every possible resource on content removal, but I am still unsure how to remove already-indexed content.

When I use robots.txt alone, the URLs remain in the index; no crawl budget is wasted on them, but having 100,000+ completely identical login pages within the omitted results can't mean anything good. When I use meta noindex alone, I keep my index clean but also keep Googlebot busy crawling these no-value pages. When I use robots.txt and meta noindex together on existing content, I ask Google to ignore my content but at the same time prevent it from ever crawling the noindex tag. Robots.txt plus URL removal is still not a good solution either, as I have failed to remove directories this way; it seems only exact URLs can be removed like that.

I need a clear solution that addresses both issues (indexing and crawling). What I am trying now is the following: I remove these directories (one at a time, to test the theory) from the robots.txt file, and at the same time add the meta noindex tag to all pages within the directory. The number of indexed pages should start decreasing (while useless crawling increases), and once it is low or zero, I put the directory back into robots.txt and keep the noindex on all pages within it. Can this work the way I imagine, or do you have a better way of doing so? Thank you in advance for all your help.
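The sequencing described above can be sketched as two states of the site (the /login/ directory is a hypothetical stand-in):

```text
Phase 1 (let Googlebot crawl the pages so it can see the noindex):
  robots.txt:    no Disallow rule for /login/
  each page:     <meta name="robots" content="noindex">

Phase 2 (once the pages have dropped out of the index):
  robots.txt:    User-agent: *
                 Disallow: /login/
  The noindex tags can stay; Googlebot will no longer fetch the pages.
```

The ordering matters because a page blocked by robots.txt cannot have its noindex tag seen, so the Disallow should be added only after the pages have been deindexed.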
Technical SEO | Dilbak0
-
Block or remove pages using a robots.txt
I want to use robots.txt to prevent Googlebot from accessing a specific folder on the server. Please tell me if the syntax below is correct:

User-agent: Googlebot
Disallow: /folder/

I also want to use robots.txt to prevent Google from indexing the images on my website. Please tell me if the syntax below is correct:

User-agent: Googlebot-Image
Disallow: /
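Both snippets are valid robots.txt syntax. One way to sanity-check rules like these offline is Python's standard-library robots.txt parser (the example.com URLs are hypothetical placeholders):

```python
from urllib.robotparser import RobotFileParser

# Rules from the question, fed to Python's stdlib robots.txt parser.
folder_rules = RobotFileParser()
folder_rules.parse([
    "User-agent: Googlebot",
    "Disallow: /folder/",
])

image_rules = RobotFileParser()
image_rules.parse([
    "User-agent: Googlebot-Image",
    "Disallow: /",
])

# Googlebot may not fetch anything under /folder/, but the rest is allowed.
print(folder_rules.can_fetch("Googlebot", "https://example.com/folder/page.html"))  # False
print(folder_rules.can_fetch("Googlebot", "https://example.com/about.html"))        # True

# Googlebot-Image may not fetch anything at all.
print(image_rules.can_fetch("Googlebot-Image", "https://example.com/images/a.jpg")) # False
```

Note that robots.txt blocks crawling, not indexing: images that are already indexed may need the URL removal tool as well.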
Technical SEO | semer0
-
Does anyone know how to set up and use Google Plus for business
Hi, I am trying to work out how to use Google Plus to increase brand awareness and drive traffic to my site, but I am not sure how to do this. Can anyone please give me step-by-step instructions on setting it up and using it to generate traffic?
Technical SEO | ClaireH-1848860
-
.com domain is an iframe copy of a .net domain?
Hey folks, this one is over my head. I'm helping out a friend's dental office website (www.capitolperiodontal.com), and its home page apparently pulls all of its content from the .net TLD via a frameset:

<title>http://www.capitolperiodontal.com/</title>
<meta http-equiv="content-type" content="text/html" />
<frameset rows="100%" id="dd_frameset_0001">
<frame src="http://www.capitolperiodontal.net/" name="dd_content_0001" framespacing="0" frameborder="0" noresize="noresize" title="capitolperiodontal.com" />
<noframes></noframes>
</frameset>

My idea was to load all the content from the .net onto the .com, then redirect the .net to the .com, as it has better domain authority and is, well, a .com. Any insights into what this iframe business is all about and whether my strategy above is OK? Many thanks folks! John
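If the consolidation goes ahead, the redirect leg could be sketched like this (Apache is an assumption; the site's actual server and URL structure may differ):

```apache
# After copying the content onto the .com, 301 every .net URL to its
# .com equivalent so the .net's equity consolidates onto the .com.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?capitolperiodontal\.net$ [NC]
RewriteRule ^(.*)$ https://www.capitolperiodontal.com/$1 [R=301,L]
```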
Technical SEO | juanzo0070
-
Best way to use affiliate links
What is the best practice for using Amazon affiliate links in blog posts? I have read different opinions on this and want to be sure I'm following best practices. I sometimes link to Amazon with an affiliate link in my posts, and am working on a top-ten Christmas gift ideas post for children born with Down syndrome, with lots of affiliate links in it. I want to be sure I'm using best practices when adding links like this. Thanks!
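One widely recommended practice (not stated in the thread itself) is to mark affiliate links so they pass no link equity. A minimal sketch with a hypothetical product URL and tracking tag:

```html
<!-- rel="nofollow" tells search engines not to pass equity through
     the paid/affiliate link. The ASIN and tag below are hypothetical. -->
<a href="https://www.amazon.com/dp/B000000000/?tag=mytag-20" rel="nofollow">
  Gift idea on Amazon
</a>
```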
Technical SEO | NoahsDad0
-
Are these links on timeout.com passing any value?
http://www.timeout.com/competition/ALHDubrovnik?DCMP=EMC-Travel-2011-07-14 The links pass through an internal system e.g. www.timeout.com/extern_link/?http://www.alh.hr Or what is the benefit to timeout of doing this? Cheers S
Technical SEO | firstconversion0