SEO issues with IP-based content delivery
-
Hi,
I have two websites, say website A and website B. Website A is set up for the UK audience and website B is set up for the US audience. Both websites sell the same products, with some products and offers not available in one country or the other. Website A can't be accessed if you are in the US; similarly, website B can't be accessed if you are in the UK. This was a decision made by the client a long time ago, as they don't want to offer promotions etc. in the US and therefore don't want the US audience to be able to purchase items from the UK site.
Now the problem is that both websites have the same descriptions for the common products they sell. Search engine spiders tend to enter a site from a variety of different IP addresses/locations, so while a UK visitor will not be able to access the US version of the site and vice versa, a crawler can. I have the following options:
1. Write different product descriptions for the US website to keep both the US and UK versions of the site in the Google index for the foreseeable future. But this is going to be a time-consuming and expensive option, as there are several hundred products common to both sites.
2. Use a single website to target both the US and UK audiences and make the promotions available only to the UK audience. There is one issue here: website A's address ends with '.co.uk' while website B has a different name ending in '.com', so website A can't be used for the US audience. Also, website A is older and more authoritative than the newer website B, and it is popular among the UK audience under its .co.uk address, so website B can't be used to target the UK audience either.
3. You tell me
-
Just a thought to add to what everyone else has said: make sure you go into Google Webmaster Tools and tell Google which country you want each site to rank for. I have had odd instances where a site with a .co.uk extension would still rank in the US for terms even though I didn't want it to. So I advise you to set them.
Have a nice day.
-
Normally I would have said keep only one site, but given what you've described, you need to differentiate the sites substantially, not just in wording. Brand them differently. The whole reason the client wants them separate is that the audiences are not the same. I'm from Germany, and I understand the difference between being pitched something in Germany versus in the United States, where I am now. I notice it in my own behaviour: in Germany I'm far more likely to buy from a .de site, as I know it's German and there will be no issues; in the United States I'm far more likely to purchase from a .com TLD, as I know I will (hopefully) not have problems. Differentiate the sites as much as you can; it sounds like the US site should be rewritten.
I hope I've helped you,
Thomas Von Zickell
-
You have the option to show different content depending on the user's location. If you use PHP on your site, you can use the PHP GeoIP functions.
You can get your site personalised by country here: http://www.maxmind.com/en/geolocation_landing
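The idea behind that GeoIP approach can be sketched language-agnostically. Below is a minimal Python sketch: the lookup table is a stand-in for a real GeoIP database (such as MaxMind's), and every domain and IP here is purely illustrative.

```python
# Stand-in for a real GeoIP lookup (e.g. MaxMind's database).
# All IPs and hostnames below are hypothetical examples.
COUNTRY_OF_IP = {
    "81.2.69.160": "GB",  # example UK address
    "8.8.8.8": "US",      # example US address
}

SITE_FOR_COUNTRY = {
    "GB": "https://www.example.co.uk",
    "US": "https://www.example.com",
}

def pick_site(ip: str, default: str = "https://www.example.com") -> str:
    """Resolve an IP to a country and return the site that visitor should see."""
    country = COUNTRY_OF_IP.get(ip)
    return SITE_FOR_COUNTRY.get(country, default)
```

In production the two dictionaries would be replaced by an actual GeoIP database query; the fallback default matters, because plenty of IPs won't resolve to either country.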
-
Great response, and I agree with Big. Large e-commerce sites with thousands of SKUs that often have only tiny differences from one product to the next are bound to have many similar product descriptions. Yes, Amazon is a perfect example. Google is smart enough to know whether what you are doing with your two sites is "an attempt to get 2 bites of the cherry." It's pretty clear that you are trying to serve the most appropriate content to the most appropriate audience. Content management would be easier with everything on one site, but with the history of these sites, it's probably best to keep them as they are. Now, if you had two domains in either the US or the UK with identical product pages, that would be an issue.
-
Well, what I would do is very simple: have it all on one site and block IPs from US users reaching the UK content, and the other way around.
So you have two flags on your site, US and UK, and if a UK user tries to go to the US section, you show a message such as "This option is not available from your location" (I am not a copywriter, so use your own words), or just hide the prices for IPs from a different country.
Hope that helps.
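The per-country message above can be reduced to a tiny rendering decision. A sketch, assuming the product names, prices, and wording are placeholders:

```python
def render_product(name: str, price: str, visitor_country: str, sold_in: set) -> str:
    """Show the price only to visitors in a country the product is sold in;
    everyone else sees an availability notice instead of a buy option."""
    if visitor_country in sold_in:
        return f"{name} - {price}"
    return f"{name} - this option is not available from your location"
```

For example, `render_product("Widget", "£9.99", "GB", {"GB"})` returns the priced listing, while a US visitor gets the availability notice for the same page.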
-
Oh yeah!
Thanks Keri, I really didn't check the date.
-
Hi Khem,
This question is over a year old. Your best bet is to start a new thread with just this question. Thanks!
-
I would suggest running only one website and using the visitor's IP address to insert the relevant content, so the site's content changes according to where the visitor comes from.
Creating multiple domains in the same language with identical content might attract a penalty.
Or else, do whatever you're thinking, but make sure to keep the content unique, even if you're using IP delivery.
-
So you mean to say that, being in the same industry, I can copy the whole content of any UK website and then restrict UK visitors from accessing my website, since my target audience is in the US?
Please advise.
-
If you keep the two sites separate, will you be penalized by Google for having duplicate content? If so, how should you deal with it?
-
Personally, I would simply redirect visitors to the proper website based on their IP address. There are a few server-side tools, or plugins if you're using a blog, that can change the entire site's title and body content to reflect the differences between the sites at the click of a button.
affportal.com has many such tools and insights to help you with this.
Hope this helps.
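A server-side redirect along those lines boils down to one decision function. A sketch (the hostnames are hypothetical; a 302 is used rather than a 301 so the per-visitor geo decision isn't cached as a permanent move):

```python
# Hypothetical home hosts per country; unknown countries fall through untouched.
HOME_HOST = {"GB": "www.example.co.uk", "US": "www.example.com"}

def geo_redirect(visitor_country: str, request_host: str):
    """Return a (status, Location) pair if the visitor landed on the wrong
    site for their country, or None to serve the requested page as-is."""
    target = HOME_HOST.get(visitor_country)
    if target and target != request_host:
        return (302, f"https://{target}/")
    return None  # right site already, or country unknown
```

Returning None for unknown countries is deliberate: redirecting everyone you can't classify is a common way to lock crawlers and proxied users out of a site.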
-
I say keep them separate. The ccTLD (.co.uk) clearly shows geographic relevance, and the .com, being a global TLD, can be targeted to the USA using the geographic target setting in Google Webmaster Tools (the .co.uk is already set to the UK and cannot be pointed at another country, only 'unlisted').
Furthermore, I wouldn't want to lose any benefits of the aged and established .co.uk by merging it with the .com. You will never get 100% of the link juice back with 301 redirects - maybe 90% at best, and only after some weeks have passed. When you consider that you are already well established with your .co.uk site, you would be mad to mess with it without VERY GOOD REASON!
Couldn't you just restrict shipping rather than blocking access - i.e. simply not ship to the US from the UK site? I have 2 e-commerce sites set up this way (one .co.uk and one .com, which operates on a dropship basis only, as we are UK based).
With regards to the duplicate content issue - Amazon.co.uk and Amazon.com have hundreds of thousands of product pages with the same/VERY similar content (descriptions etc.), and last time I checked they were ranking pretty well ;o) - without the need to block users from certain locations. They do, of course, pick up your IP address and SUGGEST (with a big flag and arrows) that you visit the UK site when you open the Amazon.com site from the UK - and they still restrict shipping of certain items should you persist and try to order anyway.
Your prices will also differ between £ and $ - as will the converted price - another clear indication that this isn't an attempt to get 2 bites of the cherry.
I would also move the .com site to a US-based server, as this helps with ranking anyway (server/website speed and location are factors).
Maybe bung a flag in your header graphics to further denote your geo-targeting?
Changing the spelling of UK/US variants is sensible anyway - though difficult to research initially - I spent some time battling between ize and ise!
Keep the .co.uk and .com separate - state that you do not ship to the US from the UK site and restrict purchases accordingly (by shipping address). That should make it clear enough - hope that helps!
-
Hi Devaki,
Are you still deciding what to do here, or have you gone ahead and made a decision? Let us know if we can help you out anymore, or tell us what your decision was -- we'd be interested to hear what choice was made and how it's worked out.
Thanks!
-
Duplicate content shouldn't be an issue with regard to maintaining a US and a UK site; the search engines will decide which version to show. Admittedly I'm not 100% convinced they are perfect at doing this at the moment, but I'm confident enough that I would do it myself in this case, so don't worry about rewriting all your content (though remember the UK and US are two nations divided by a common language).
As a precaution you could geo-route customers by IP to one site or the other, but remember Googlebot will probably crawl from a US address, so you might want to let theirs pass.
To be honest, unless you have a reason to do it, I wouldn't change the set-up you currently have. Bringing them under one domain will make it awkward (though possible) to show geo-specific content, and as I mentioned, I don't believe you even need to rewrite the content.
Just build UK links to the .co.uk site and you should be alright for the most part.
Is there a particular reason you feel you need to change the set-up?
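That crawler caveat can be made explicit: exempt known bots from the geo redirect so both sites stay crawlable regardless of where the crawler's IP resolves. A sketch - note that user-agent sniffing is a simplification, and verifying crawlers via reverse DNS is more robust:

```python
# Simplistic user-agent check; substrings here are illustrative, and a
# production setup should verify crawlers by reverse DNS, not UA alone.
BOT_TOKENS = ("googlebot", "bingbot")

def should_georedirect(user_agent: str, visitor_country: str, site_country: str) -> bool:
    """Redirect mismatched human visitors, but always let crawlers through."""
    if any(tok in user_agent.lower() for tok in BOT_TOKENS):
        return False
    return visitor_country != site_country
```

With this in place, a US crawler hitting the .co.uk site is served the page normally, while a US human visitor is routed to the .com.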
-
Sorry, I must not be awake. What is the problem? You have two sites with common products and you only offer certain promotions in the UK on the UK ccTLD (country code top level domain). Are these promos showing up on the US site?
-
You have a few options.
You could build out one website and, whenever a visitor comes from a specific country, show that visitor different pieces of the site, different products, etc., but this is not easily done and is a nightmare to manage.
You could keep the two separate websites and focus on rewriting the content. This would be my first option if it were me. You have two websites in separate countries selling the same products but with different offers, discounts, currencies, etc., so it makes the most sense to have a clear line of separation. It shouldn't be too difficult to hire a freelance writer to go through one of the websites and rewrite the content: make it more relevant to that country's users, add videos, helpful information, etc. You only have to rewrite the content for one of the sites to make sure they are not full of duplicate content. Then, down the road, you could hire the same writer to optimize the content for the other website, approaching it with different content that is just as relevant, and you should have a win-win-win situation.
Does that make sense?