Subdomain or subdirectory
-
We're a big social networking site with over 1 million indexed pages and over 4 million visits a month. Our PR is 7.
We're about to acquire and rebrand the content of a large reviews website, current PR 3. The new content will be treated as a 'site within a site' with different navigation and interface.
With these factors in mind, I think we need to create a new subdomain for the reviews site, but I need to factor in the SEO implications, bearing in mind that new advertisers are going to be looking closely at our stats.
My understanding is that migrating the content to a new subdomain will be easier than placing it in a new subfolder.
Any advice appreciated
-
So despite the different look and navigation you recommend migrating the whole site to a subfolder on our subdomain?
Now that you own this site, I would suggest you take some time to think through branding. A consistent look and navigation is desirable; on the other hand, you need to be careful when making changes to successful platforms.
Based on what you have shared, the likely recommendation would be to use a subfolder, but a proper recommendation on such an important topic cannot be offered in a Q&A. An SEO would need to learn about your company, your business, your niche, the site you are absorbing, and so on before making a decision.
-
It's an existing reviews website with pretty good quality content -- and a good fit for our demographic.
We will re-brand the content and our users will add their own reviews.
So despite the different look and navigation you recommend migrating the whole site to a subfolder on our subdomain?
-
My main concern is the size of the new site and limits on the number of pages Google will crawl.
That should not be a concern. Google can crawl sites regardless of size; there are sites with millions of web pages. The largest forum site, http://www.gaiaonline.com/forum/index.php, has over 2 billion posts. I am not sure of the page count, but it's a lot.
Google will make whatever adjustments are necessary to crawl quality content. The question is: do all of the pages on your site represent quality content? If your pages offer quality content, earn links, and get tweeted / liked / +1'd, then Google will crawl them.
-
Thanks Ryan, really helpful. I'm starting to realise there are no definitive answers.
My main concern is the size of the new site and limits on the number of pages Google will crawl. This is not the only site within a site we're looking at.
-
It really depends on how closely you wish to integrate the reviews with your core social networking function. There are pros and cons to each approach. A great article on this topic:
http://www.seomoz.org/blog/understanding-root-domains-subdomains-vs-subfolders-microsites
Related Questions
-
Subdomain cannibalization
Hi, I am doing the SEO for a webshop which has a lot of linking and related websites on the same root domain. The structure is, for example:
Root domain: example.com
Shop: shop.example.com
Linking websites to the shop: courses.example.com, software.example.com, ...
Do I have to check which keywords these linking websites are already ranking for and choose other keywords for my category and product pages on the webshop? The problem with this could be that the main keywords for the category pages on the webshop are mainly the same as for the other subdomains. The intention is that some people come straight to the webshop instead of going first to the linking websites and then on to the webshop. Thanks.
-
How can I get Bing to index my subdomain correctly?
Hi guys, My website exists on a subdomain (i.e. https://website.subdomain.com) and is being indexed correctly on all search engines except Bing and Duck Duck Go, which list 'https://www.website.subdomain.com'. Unfortunately my subdomain isn't configured for www (the domain is out of my control), so searchers are seeing a server error when clicking on my homepage in the SERPs. I have verified the site successfully in Bing Webmaster Tools, but it still shows up incorrectly. Does anyone have any advice on how I could fix this issue? Thank you!
-
Duplicate Content: Is a product feed/page rolled out across subdomains deemed duplicate content?
A company has a TLD (top-level domain) which lists every single product: company.com/product/name.html. The company also has subdomains (tailored to a range of products) which list a chosen selection of the products from the TLD, sort of like a feed: subdomain.company.com/product/name.html. The content on the TLD and subdomain product pages is exactly the same and cannot be changed; the CSS and HTML are slightly different, but the content (text and images) is identical! My concern (and rightly so) is that Google will deem this to be duplicate content, so I'm going to have to add a rel canonical tag to the header of all subdomain pages, pointing to the original product page on the TLD. Does this sound like the correct thing to do? Or is there a better solution? Moving on, not only are products fed onto the subdomain, there are a handful of other domains which list the products, and again the content (text and images) is exactly the same: other.com/product/name.html. Would I be best placed to add a rel canonical tag to the header of the product pages on the other domains, pointing to the original product page on the actual TLD? Does rel canonical work across domains? Would the product pages with a rel canonical tag in the header still rank? Let me know if there is a better solution all round!
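The cross-domain canonical asked about here is just a link element in the page head, and yes, rel=canonical is honored across domains. A minimal sketch of building that tag (the domain names are the placeholders from the question):

```python
def canonical_tag(tld_product_url: str) -> str:
    """Build a rel=canonical link element pointing a duplicate
    product page (on a subdomain or a separate domain) back to
    the original product page on the main TLD."""
    return f'<link rel="canonical" href="{tld_product_url}" />'

# Duplicate pages on subdomain.company.com and other.com would both
# carry the same tag, pointing at the one original URL:
print(canonical_tag("https://company.com/product/name.html"))
# <link rel="canonical" href="https://company.com/product/name.html" />
```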
-
We are switching our CMS local pages from a subdomain approach to a subfolder approach. What's the best way to handle this? Should we redirect every local subdomain page to its new subfolder page?
We are looking to create a new subfolder approach within our website versus our current subdomain approach. How should we go about handling this properly, so as not to lose everything we've worked on up to this point using the subdomain approach? Do we need to redirect every subdomain URL to the new subfolder page? Our current local-pages subdomain set-up: stores.websitename.com. How we plan on adding our new local subfolder set-up: websitename.com/stores/state/city/storelocation. Any and all help is appreciated.
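The usual approach is indeed a one-to-one 301 from every old subdomain URL to its subfolder equivalent. A minimal sketch of computing the new URL from the old one, assuming the placeholder hostnames and path pattern from the question:

```python
from urllib.parse import urlsplit

def subfolder_url(old_url: str) -> str:
    """Map a local-page URL on the stores. subdomain to its new
    subfolder equivalent, preserving the state/city/location path.
    Hostnames are the placeholder names from the question."""
    parts = urlsplit(old_url)
    # stores.websitename.com/washington/seattle/downtown
    #   -> websitename.com/stores/washington/seattle/downtown
    root = parts.hostname.split(".", 1)[1]  # drop the "stores" label
    return f"https://{root}/stores{parts.path}"

old = "https://stores.websitename.com/washington/seattle/downtown"
print(subfolder_url(old))
# https://websitename.com/stores/washington/seattle/downtown
```

Each old URL then gets a single 301 to its computed counterpart, avoiding redirect chains.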
-
E-Commerce Multilanguage - Better on Subdomains?
Hi, We have an e-commerce store in English and Spanish -- same products. The URLs differ like this:
ENGLISH: www.mydomain.com/en/manufacturer-sku-productnameinenglish.html
SPANISH: www.mydomain.com/es/manufacturer-sku-productnameinspanish.html
All content on the pages is translated, e.g. H1s, titles, keywords, descriptions, and the site content itself is in the language displayed. Is there a risk of similar or near-dupe content here in the eyes of the big G? Would it be worth implementing the different languages on subdomains or completely different domains? Thank you, B
-
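For fully translated page pairs like the ones in the question above, the usual signal is reciprocal hreflang annotations rather than a canonical, since the pages are translations, not duplicates. A minimal sketch (URL patterns taken from the question):

```python
def hreflang_tags(en_url: str, es_url: str) -> list[str]:
    """Build reciprocal hreflang link elements for an English/Spanish
    product-page pair, telling search engines the two URLs are
    translations of each other rather than duplicate content."""
    return [
        f'<link rel="alternate" hreflang="en" href="{en_url}" />',
        f'<link rel="alternate" hreflang="es" href="{es_url}" />',
    ]

# Both language versions should carry both tags in their head:
for tag in hreflang_tags(
    "https://www.mydomain.com/en/manufacturer-sku-productnameinenglish.html",
    "https://www.mydomain.com/es/manufacturer-sku-productnameinspanish.html",
):
    print(tag)
```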
SEOMOZ crawler is still crawling a subdomain despite disallow
This is for our client with a subdomain. We only want to analyze their main website, as this is the one we want to SEO. The subdomain is not optimized, so we know it's bound to have lots of errors. We added the disallow code when we started and it was working fine. We only saw the errors for the main domain and we were able to fix them. However, just a month ago, the errors and warnings spiked up and the errors we saw were for the subdomain. As far as our web guys are concerned, the disallow code is still there and was not touched:
User-agent: rogerbot
Disallow: /
We would like to know if there's anything we might have unintentionally changed or something we need to do so that the SEOMOZ crawler will stop going through the subdomain. Any help is greatly appreciated!
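The disallow rule quoted in the question can be sanity-checked with Python's standard-library robots.txt parser, which is a quick way to confirm the directive as written actually blocks that crawler (the subdomain URL below is a placeholder):

```python
from urllib.robotparser import RobotFileParser

# The robots.txt rules from the question, verbatim:
rules = """\
User-agent: rogerbot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# rogerbot is blocked from the whole subdomain...
print(parser.can_fetch("rogerbot", "https://sub.example.com/page"))   # False
# ...while other crawlers are unaffected by this rule:
print(parser.can_fetch("Googlebot", "https://sub.example.com/page"))  # True
```

If the parser says the rule is correct but the crawler still shows up, the next thing to check is whether the subdomain serves its own robots.txt, since robots.txt is per-host.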
-
Franchise sites on subdomains
I've been asked by a client to optimise a webpage for a location, i.e. London. It turns out that the location is actually a franchise of the main company. When the company launches a new franchise, so far they have simply added a new page to the main site, for example: mysite.co.uk/sub-folder/london. They have done this for 10 or so franchises and tasked someone with optimising each page for its main keyword + location. I think I know the answer to this, but would like to get back-up / additional info on it in terms of ranking / SEO benefits. I am going to suggest the idea of using a subdomain for each location, for example: london.mysite.co.uk. Would this be the correct approach? If you think yes, why? Many thanks,
-
Rebuilding a site with pre-existing high authority subdomains
I'm rebuilding a real estate website with 4 subdomains that have Page Authorities between 45 and 50. Since it's a real estate website, it has 20,000+ pages of unique (listing) content PER subdomain. The subdomains are structured like washington.xyzrealty.com and california.xyzrealty.com. The root domain has a ~50 Page Authority. The site is about 7 years old. My preference is to focus all of my efforts on the primary domain going forward, but I don't want to waste the power of the subdomains. I'm considering:
1. Putting blogs or community/city pages on the subdomains
2. 301 redirecting all of the existing pages to matching pages on the new root domain
3. Any other ideas??