Domain structure for US Local Sites
-
We are planning to open localized versions of our website throughout the world and in the US.
For countries these websites will be:
etc....
For the US, would it be better to add the state to part of the domain name or use a sub-folder? What are the advantages and disadvantages of each?
Meaning, should it be:
or
-
Maybe you can use cookies to set the language, instead of hard coding it in the URL?
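For illustration, here's a minimal sketch of what cookie-based language selection might look like (the `lang` cookie name and the language codes are hypothetical; note that search crawlers generally don't send cookies, so a cookie-only approach can hide localized content from them):

```python
from http.cookies import SimpleCookie

SUPPORTED = {"en", "fr", "es"}  # hypothetical set of supported language codes
DEFAULT = "en"

def language_from_cookie(cookie_header: str) -> str:
    """Return the visitor's language from a 'lang' cookie, falling back to a default."""
    cookie = SimpleCookie()
    cookie.load(cookie_header or "")
    lang = cookie["lang"].value if "lang" in cookie else DEFAULT
    return lang if lang in SUPPORTED else DEFAULT

print(language_from_cookie("lang=fr; session=abc"))  # fr
print(language_from_cookie(""))                      # en (no cookie set)
```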
Subfolders are the optimal structure from an SEO perspective. With any other structure you will not benefit from the domain authority of having a single site, and will have to duplicate much of your SEO efforts.
-
Thanks for all the responses. I can't use subfolders, because that structure is already used for the site's languages:
www.site/fr
www.site/es
etc....
We are launching subsites for legal reasons, so we need to have them.
Any other suggestions?
-
When you do country-level sites, you might only be doing a handful. But when you get to US states, you are probably eventually going to do all 50. That said, that's too many separate websites. I would absolutely use a sub-folder, no question. For country-level sites, I'd use ccTLDs if that makes sense for your business.
I hope this helps and clarifies.
-
It may make sense to have separate sites for the different countries, since Google does favor country-specific TLDs (ccTLDs) for searches within a country.
-
I strongly agree with Takeshi on this. You should focus on sub-folders, which would look something like this: http://www.example.com/france, and for the UK something like http://www.example.com/uk.
Separate domains and subdomains are treated as separate domains by Google, so your SEO efforts might not transfer from one to another. With sub-folders, your SEO efforts reinforce one another, and less work will give you more results in terms of traffic, sales, and SERP rankings.
-
The best practice would be to use subfolders.
Subdomains (nj.site.com) are treated by Google in many respects as separate sites, so the domain authority from nj.site.com won't transfer to, say, ny.site.com. You would basically have to duplicate your SEO efforts across 50 subdomains and would not benefit from having one site with strong domain authority.
Different domains (site-nj.com) would be the same problem, except you would also have to pay for 50 different domain name registrations. Plus, Google has devalued exact match domains so that having keywords in the domain doesn't carry quite as much ranking benefit as it used to.
Bottom line, go with subfolders for the optimal SEO site structure.
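To make the trade-off concrete, here's a minimal sketch (hypothetical `example.com` domain, truncated state list) of the URL each structure would produce for the state pages:

```python
STATES = ["nj", "ny", "ca"]   # ...and so on, eventually all 50 states
DOMAIN = "www.example.com"    # hypothetical parent domain

def state_urls(style: str) -> list[str]:
    """Build each state's landing-page URL under a given site structure."""
    if style == "subfolder":   # one site; authority consolidates
        return [f"https://{DOMAIN}/{s}/" for s in STATES]
    if style == "subdomain":   # treated in many respects as 50 separate sites
        return [f"https://{s}.example.com/" for s in STATES]
    if style == "separate":    # 50 domains to register and build up individually
        return [f"https://www.example-{s}.com/" for s in STATES]
    raise ValueError(f"unknown style: {style}")

print(state_urls("subfolder")[0])  # https://www.example.com/nj/
```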
Related Questions
-
Best website structure / SEO strategy for an online travel agency?
Dear Experts! I need your help pointing me in the right direction. So far I have found scattered tips around the Internet, but it's hard to form a full picture from these bits and pieces of information without professional advice. My primary goal is to understand how I should build my online travel agency website's (https://qualistay.com) structure, so that I target my keywords on the correct pages and do not create duplicate content. In my particular case I have very similar properties in similar locations in Tenerife. Many of them are located in the same villa or apartment complex, so it is very hard to come up with a unique description for each of them, not to mention the amenities and pricing blocks, which are standard and almost identical (I don't know if Google sees this as duplicate content). From what I have read so far, it's better to target archive pages rather than every single property. At the moment my archive pages are: all properties (all property types and locations), and a page for each location (all property types).
My questions:
1. Does it make sense to add archive pages by property type, in addition to or instead of the location ones, if I target separate keywords like 'villas costa adeje' and 'apartments costa adeje'? At the moment the title of the respective archive page, "Properties to rent in costa adeje: villas, apartments", in principle targets both keywords.
2. Does using the same keyword in a single property listing cannibalize the ranking of the archive page it links back to? Or not, unless Google specifically identifies it as duplicate content (which one can see in Google Search Console under HTML Improvements), and/or the archive page has more incoming links than the single property?
3. If targeting only archive pages, how should I optimize them so that they stay user-friendly? I have created (though not yet fully optimized) descriptions for each archive page just below the main header, but I have them partially hidden (collapsible) using JS in order to keep visitors' focus on the properties. I know that Google does not rank hidden content highly, at least at the moment, but with the mobile-first algorithm coming in the near future, they promise not to punish mobile sites for collapsible content and will use the mobile version to rate the desktop one. Does this mean I should not worry about hidden content anymore, or should I move the description to the bottom of the page and make it fully visible?
Your feedback will be highly appreciated! Thank you! Dmitry
Technical SEO | qualistay1
-
Old domain still being crawled despite 301s to new domain
Hi there, We switched from the domain X.com to Y.com in late 2013 and, for the most part, the transition was successful. We were able to 301 most of our content over without too much trouble. But when I do a site:X.com in Google, I still see about 6240 URLs of X listed. But if you click on a link, you get 301d to Y. Maybe Google has not re-crawled those X pages to know of the 301 to Y, right? The home page of X.com is shown in the site:X.com results. But if I look at the cached version, the cached description will say "This is Google's cache of Y.com. It is a snapshot of the page as it appeared on July 31, 2014." So, Google has freshly crawled the page. It does know of the 301 to Y and is showing that page's content. But the X.com home page still shows up on site:X.com. How is the domain for X showing rather than Y when even Google's cache is showing the page content and URL for Y? There are some other similar examples. For instance, you would see a deep URL for X, but just looking at the title in the SERP, you can see it has crawled the Y equivalent. Clicking on the link gives you a 301 to the Y equivalent. The cached version of the deep URL to X also shows the content of Y. Any suggestions on how to fix this, or is it even a problem? I'm concerned that some SEO equity is still being sequestered in the old domain. Thanks, Stephen
Technical SEO | fernandoRiveraZ1
-
Unique URLs for each local office 301 to parent site
I have a dentist who has multiple locations and a unique domain for each location: www.DentalCareofLacey.com, www.DentalCareofSumner.com, www.DentalCareofVashon.com. The current plan is to set up 301s to redirect to the parent site (www.atlasdentistry.com/locations/lacey). Would there be any negative impact if we continue to purchase a unique domain for each branch office and just set it up as a 301 to the parent site? Does having too many 301 redirects look spammy to Google?
Technical SEO | marlattts0
-
Localized domains and duplicate content
Hey guys, In my company we are launching a new website, and there's an issue that's been bothering me for a while. I'm sure you guys can help me out. I already have a website, let's say ABC.com. I'm preparing a localized version of that website for the UK, so we'll launch ABC.co.uk. Basically the websites are going to be exactly the same except for the homepage; they have slightly different propositions. Using GeoIP, I will redirect the UK traffic to ABC.co.uk, and the rest of the traffic will still visit the .com website. Might Google penalize this? The sites will be almost the same apart from the homepage. This may count as duplicate content, even though I'm geo-targeting different regions so they will never overlap. Thanks in advance for your advice.
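A minimal sketch of the routing rule described (the country code would come from a GeoIP lookup; "GB" is the ISO 3166 code covering UK visitors):

```python
def target_site(country_code: str) -> str:
    """Pick which localized site a visitor lands on, per the GeoIP plan above."""
    return "https://ABC.co.uk" if country_code.upper() == "GB" else "https://ABC.com"

print(target_site("GB"))  # https://ABC.co.uk
print(target_site("US"))  # https://ABC.com
```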
Technical SEO | fabrizzio0
-
UK and US Targeting Simultaneously - Domain Setup and Duplicate Content?
I have a site that will be targeting the US and the UK. However, it will need to display slightly different content to the two. Should I use a .co.uk and a .com, or uk.themainsite.com for the UK, or themainsite.com/UK? This is of course setting up multiple country targeting within Google Webmaster Tools. Am I likely to run into duplicate content issues?
Technical SEO | james4060
-
Is it better for our Blog to be blog.domain.tld or domain.tld/blog ?
I'd dread the answer being the latter rather than the former, as we've spent two years building it at blog.domain... However, I noticed SEOmoz uses domain.tld/blog, and it got me thinking.... Cheers. R.
Technical SEO | RobertChapman0
-
Can URL rewrites fix the problem of critical content too deep in a site's structure?
Good morning from Wetherby UK 🙂 OK, imagine this scenario. You ask the developers to design a site where "offices to let" is on level two of the site's hierarchy, so the URL would look like this: http://www.sandersonweatherall.co.uk/office-to-let. But yikes, when it goes live it ends up like this: http://www.sandersonweatherall.co.uk...s/residential/office-to-let. Is a URL rewrite a fix for this? Or is the only fix relocating the office-to-let content further up the site structure? Any insights welcome 🙂
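If a rewrite is the chosen fix, the idea is simply to map the accidental deep path onto the intended top-level URL. A minimal sketch in Python (the deep path segments are hypothetical, since the live URL above is truncated; in practice this mapping would live in the web server's rewrite rules):

```python
import re

# Match "office-to-let" regardless of how many hypothetical folders precede it.
DEEP = re.compile(r"^/(?:[\w-]+/)*office-to-let/?$")

def rewrite(path: str) -> str:
    """Collapse any deep office-to-let path onto the intended shallow URL."""
    return "/office-to-let" if DEEP.match(path) else path

print(rewrite("/properties/residential/office-to-let"))  # /office-to-let
print(rewrite("/about"))                                 # /about
```

Note that a rewrite only changes which URL serves the content; if the deep URL is already indexed, a 301 from the deep URL to the short one is what consolidates it onto a single URL.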
Technical SEO | Nightwing0
-
301 an old site to a newer site...
Hi First, to be upfront: these are not my websites; I'm asking because they are trying to compete in my niche. Here are the details, then the questions...
There is a website that is a few months old with about 200 indexed pages and about 20 links; call this newsite.com. There is a website that is a few years old with over 10,000 indexed pages and over 20,000 links; call this oldsite.com. newsite.com acquired oldsite.com and set up a 301 redirect so every page of oldsite.com is redirected to the front page of newsite.com. newsite.com and oldsite.com are on the same topic, and the 301 occurred in the past week. Now, oldsite.com is out of the SERPs and newsite.com is pretty much ranking in the same spot (top 10) for the main term. Here are my questions:
1. The 10,000 pages on oldsite.com had plenty of internal links; they no longer exist, so I imagine when the dust settles it will be as if oldsite.com is a one-page site that redirects to newsite.com... How long will a ranking boost last?
2. With the redirect set up to completely disregard the structure and content of oldsite.com, it's clear to me that it was set up to pass the link juice from oldsite.com to newsite.com... Do the major SEs see this as a form of spam (manipulating the rankings), or do they see it as a good way to combine two or more websites?
3. Does this work? Is everybody doing it? Should I be doing it? ... Or are there better ways for me to combat this type of competition (e.g., we could make a lot of great content for the money spent buying oldsite.com, but we certainly wouldn't get such an immediate increase in traffic)?
Technical SEO | RR5000