International Site Migration
-
Hi guys,
We're in the process of launching an international ecommerce site (Magento CMS) for two countries (Australia and the US), then later expanding to other countries like the UK, Canada, etc.
The plan is for each country to have its own sub-folder, e.g. www.domain.com/us, www.domain.com.au/au, www.domain.com.au/uk
A lot of the content between these English-based country versions is the same, e.g. the same product descriptions.
So, to prevent duplication, from what I've read we will need to add hreflang tags to every single page on the site? So for:
Australian pages:
United States pages:
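(The code snippets appear to have been stripped here; presumably the annotations for each page pair would look something like the following sketch, with hypothetical URLs.)

```html
<!-- Same block in the <head> of both the AU and US version of a page -->
<link rel="alternate" hreflang="en-au" href="http://www.domain.com.au/au/some-product/" />
<link rel="alternate" hreflang="en-us" href="http://www.domain.com/us/some-product/" />
```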
Just wanted to make sure this is the correct strategy (will hreflang prevent duplicate content issues?) and is there anything else I should be considering?
Thank you, Chris
-
Oops - just noticed that I had an error in the first reply and didn't check the correct code. Mea culpa.
-
No Dirk, the language code must be en-GB; UK is not a valid ISO country code (and that's a classic mistake people make with hreflang).
-
One sitemap is OK from a technical perspective (as long as it doesn't grow too long; the limit is 50,000 URLs per sitemap file). For reporting it is useful to split per country - this way it's easier in WMT to check whether you have indexing problems on a specific country version.
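(A per-country split could be done with a sitemap index file pointing at one child sitemap per country subfolder; the filenames below are hypothetical.)

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One child sitemap per country subfolder on the same host.
       The .com.au site would need its own sitemap, since listing
       sitemaps for another host requires extra verification. -->
  <sitemap><loc>http://www.domain.com/sitemap-us.xml</loc></sitemap>
  <sitemap><loc>http://www.domain.com/sitemap-uk.xml</loc></sitemap>
</sitemapindex>
```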
Dirk
PS There is a typo in my initial answer for UK - the language has to be en-uk & not en-us
-
Thanks guys,
Yeah Gianluca, that was a mistake; it should be .com/uk. Thanks for picking it up though!
I was also wondering for the sitemap itself, is it fine to have one single sitemap across all the different countries?
It seems this is what Apple does: http://www.apple.com/sitemap.xml
-
Apart from everything Dirk said in his answer, I suggest you put the UK version under the .com domain name.
In fact, if you put your UK version under a /uk/ subfolder on .com.au, you won't be able to geo-target that subfolder via Google Search Console, because Google will automatically geo-target it toward Australia.
So your UK version will have serious problems being properly visible in google.co.uk.
If we want to use subfolders for targeting different countries (or languages), it is always better - and practically a requirement - to use a generic domain name.
P.S.: My comment is prompted by what you wrote here:
The plan is for each country to have its own sub-folder, e.g. www.domain.com/us, www.domain.com.au/au, www.domain.com.au/uk
I don't know whether that was a mistake, a mix-up, or intentional.
-
Hreflang will certainly help - however, it's a bit confusing how you put it in your question. Hreflang is not set at the domain level but at the page level:
Australian pages:
United States pages:
UK pages:
=> hreflang needs to be added to every page (you can test some sample pages here: http://flang.dejanseo.com.au/)
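(The stripped code examples would presumably look roughly like the block below, repeated identically on every version of a given page; the URLs are hypothetical, and the UK code is shown as en-gb per the correction upthread.)

```html
<!-- Identical block on the AU, US, and UK versions of the same page -->
<link rel="alternate" hreflang="en-au" href="http://www.domain.com.au/au/some-page/" />
<link rel="alternate" hreflang="en-us" href="http://www.domain.com/us/some-page/" />
<link rel="alternate" hreflang="en-gb" href="http://www.domain.com/uk/some-page/" />
```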
Apart from hreflang - register each folder in WMT & target it to the specific country.
So set domain.com/uk/ to specifically target the UK. Apart from that, make sure you adapt the text to the "local" English (so UK English for /uk/, etc.), use proper currencies, and provide local contact details. Build local links for each subfolder.
If you do this, you should be fine.
Dirk