Global/international SEO campaign strategy with a single TLD
-
Hi All,
I have three separate questions, all relating to global/international SEO from a domain strategy point of view, so I'll try to keep them short and to the point.
The current URL is www.example.com. The site's content strategy and all marketing activity have always been focused on the UK. We're now launching in the US, with long-term plans to launch in other countries. Each country will have its own webmaster, content strategy and marketing team.
1st question
Which is better for each country's section of the site, and why: a subfolder (www.example.com/us) or a subdomain (us.example.com)?
The US team are leaning towards the folder approach (and rightly so), as it will help the US section of the site benefit from the existing domain authority, link profile and off-page SEO work already carried out at root domain level. It will also not be regarded as a new site, since it lives at www.example.com/us.
On the flip side, although the subdomain option has no short-term SEO benefit, it allows a more sustainable SEO campaign for each country, as the subdomains can be treated as individual sites/SEO campaigns. It also reduces some risk, as each geo-specific team will only be concerned with their own subdomain and will not have root-domain-level control. I'm also aware that subdomains are treated as individual sites, so certain updates (such as Panda) will evaluate each subdomain individually. A possible negative impact on uk.example.com would therefore not necessarily affect us.example.com, unless the content strategy was the same.
2nd question
Assuming we decide to go for www.example.com/us (the folder option): the site's geo target is currently set to the UK in Google Webmaster Tools at root domain level. If www.example.com were set to the UK and www.example.com/us were set to the US in GWT, would there be a conflict? We want to ensure that the root-domain-level setting does NOT override any folder-level settings within the same domain. According to an answer from a top contributor on Google Webmaster Central, setting www.example.com/us to the US would not conflict with the root-domain-level settings, but I would love to hear from somebody who has actually gone through the process.
3rd question
We're considering implementing geo DNS, so a US visitor accessing www.example.com would be redirected to www.example.com/us (or www.us.example.com) based on the location derived from their IP address. The reason is that we're trying to avoid a splash page with a choice of countries (UK or US) at root level (the homepage), which is commonly used by sites with multiple geo-specific target markets. We would assume that somebody from North America is looking for the US site and redirect that visitor automatically to www.example.com/us. The SEO implication, however, is that a 302 redirect would be used, and redirects based on the visitor's location will not pass link value from the homepage to the landing pages. The homepage currently has very strong link equity, and the site's general navigational structure is good at allowing that link equity to flow through from the homepage.
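To make the mechanism concrete, here is a minimal sketch of the redirect logic being described (hypothetical helper and section names; a real deployment would resolve the visitor's country from their IP address using a GeoIP database such as MaxMind's, which is not shown here):

```python
# Sketch of the geo-IP redirect logic described above.
# GEO_SECTIONS and redirect_for_country are illustrative names,
# not part of any real framework.

GEO_SECTIONS = {
    "US": "/us/",   # North American visitors -> US section
    "GB": "/",      # UK visitors stay on the root (current UK site)
}

def redirect_for_country(country_code):
    """Return a (status, location) pair for the visitor's country.

    A 302 (temporary) redirect is used because the target depends on
    the visitor, not on the URL itself -- which is exactly why it
    passes no link equity to the landing page.
    """
    target = GEO_SECTIONS.get(country_code, "/")  # default to the UK site
    if target == "/":
        return (200, "/")  # no redirect needed
    return (302, target)

print(redirect_for_country("US"))  # (302, '/us/')
print(redirect_for_country("GB"))  # (200, '/')
```

This is the crux of the SEO concern: every external link pointing at the homepage hits that 302 before reaching a landing page.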
-
Well... what about the links in the footer of the homepage? I don't like them, but that would help your "homepage" issue.
Or, without giving up the IP detection, you could still offer the opportunity to choose the territorial version using a selector?
-
Thanks Gianluca for your response.
Regarding question/answer 3, it still doesn't resolve the issue of passing link value from the homepage, as nothing will be served from root level. A 302 redirect would point to the relevant geo-specific section of the site, so all of the external links pointing to the homepage would reach a dead end due to the 302 redirect. The 'About' page is a good call for usability reasons, but it is nowhere near as strong as the homepage in terms of its backlink profile, and therefore would not resolve the issue.
-
Pardon me if I answer question 2 first, then 3, and question 1 last.
Question 2.
I would not worry about conflicts. If you assign a territory as the target for a subfolder, Google is quite good at following the indication you give. Therefore, if the main .com domain is targeted at the UK and the /us subfolder at the USA, Google will have no problem targeting the two correctly. Just a suggestion: also specify in the page header that the language used on .com is en-GB and on /us is en-US. Google doesn't pay especially strong attention to those signals, but I would use every possible additional indication of the target.
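The standard way to express those language/region signals today is with hreflang annotations in each page's head. A small sketch of what generating them might look like (the URLs are assumptions based on the folder option discussed above; note the region code is "GB", not "UK", per ISO 3166-1):

```python
# Sketch: generating hreflang alternate-link tags for the two
# territorial versions. VERSIONS and hreflang_tags are illustrative
# names; the URLs are hypothetical.

VERSIONS = {
    "en-GB": "https://www.example.com/",     # UK version at the root
    "en-US": "https://www.example.com/us/",  # US version in the subfolder
}

def hreflang_tags(versions):
    """Return the <link rel="alternate"> tags for each page's <head>."""
    return [
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in sorted(versions.items())
    ]

for tag in hreflang_tags(VERSIONS):
    print(tag)
```

Each territorial version should list all alternates, including itself, so the annotations are reciprocal across the two sections.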
Question 3
You are right... but, if there is no other reason you don't want to show a US user the UK version of the site, why not put the links to the other language/territory versions of the site on an "About us" page? On that page you could editorially list the markets the site is active in, and those links would be direct.
I say this also for usability reasons. I don't know how the site is going to be developed, but it might be interesting for a US user to read the British blog on the website...
Question 1
I agree with the US guys: a subfolder is better, because that way it starts its online life on a domain with strong authority, so any links the subfolder obtains will carry more weight than they would if obtained as a subdomain.
About your Panda fear: you are right to fear it, but the reaction, in my opinion, is not the correct one. If you fear Panda, you shouldn't react defensively (going with subdomains) but proactively and positively: create worthwhile and useful content, avoid thin content, pay great attention to any duplicate-content issues, and check every other Panda-related factor.
Ciao