Subdomain optimization - advice
-
Hi,
I need some specific advice on the best way to optimize a subdomain of a main domain, beyond the meta title, description, etc.
Br.
-
Hi Tormar!
I agree with Gaston—the subdomain will be seen as a separate domain, so you should really just optimize it as you would any other domain.
Is there anything more specific you're looking for, here?
-
Hi Tormar,
Remember that Google treats a subdomain as a URL separate from the main domain. What I mean is: acb.domain.com, domain.com, and www.domain.com are all different URLs to Google, and in effect different websites. So there isn't any special optimization that applies only to subdomains.
My advice: optimize it as if it were a separate site from the main domain.
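For instance, the per-page basics on a subdomain page might look like the lines below (a generic sketch with a placeholder domain and copy, nothing specific to your site):

<!-- Hypothetical page on blog.example.com; every subdomain page gets its own tags -->
<head>
  <title>Widget Maintenance Tips | Example Co Blog</title>
  <meta name="description" content="Practical widget maintenance tips from the Example Co team.">
  <link rel="canonical" href="https://blog.example.com/widget-maintenance-tips/">
</head>

The point is simply that titles, descriptions, canonicals, internal linking, and sitemaps are all handled on the subdomain itself, just as you would for any standalone site.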
Hope it's helpful.
GR
Related Questions
-
Orphan Duplicate is created as Subdomain in Google Search
We noticed that some of our Google results for the blog also come up under a subdomain that is not linked from anywhere on the website. For example: SUBDOMAIN1.website.com/blog/content.html redirects to website.com/blog/content.html, and SUBDOMAIN1 is not linked anywhere on the website. How did Google find it in the first place? Why does it still keep it in the search results? How do you get rid of it?
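For reference, a host-level 301 like the one described - assuming an Apache setup with mod_rewrite, placeholder hostnames only - would look roughly like this:

# Hypothetical .htaccess sketch: send any request on the stray host to the canonical one
RewriteEngine On
RewriteCond %{HTTP_HOST} ^subdomain1\.website\.com$ [NC]
RewriteRule ^(.*)$ https://website.com/$1 [R=301,L]

With such a redirect in place, Google typically drops the stray host over time; verifying the subdomain in Search Console and requesting removal can speed that up.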
-
Robots.txt advice
Hey Guys, have you ever seen coding like this in a robots.txt? I have never seen a noindex rule in a robots.txt file before - have you?

user-agent: AhrefsBot
User-agent: trovitBot
User-agent: Nutch
User-agent: Baiduspider
Disallow: /

User-agent: *
Disallow: /WebServices/
Disallow: /*?notfound=
Disallow: /?list=
Noindex: /?*list=
Noindex: /local/
Disallow: /local/
Noindex: /handle/
Disallow: /handle/
Noindex: /Handle/
Disallow: /Handle/
Noindex: /localsites/
Disallow: /localsites/
Noindex: /search/
Disallow: /search/
Noindex: /Search/
Disallow: /Search/
Disallow: ?
Any pointers?
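A note on the above: Google has never officially supported a Noindex: directive in robots.txt, and in 2019 it announced it would stop honoring such unsupported rules altogether, so those lines are most likely ignored. The supported way to keep whole sections out of the index is an X-Robots-Tag response header; a rough sketch for the paths quoted above, assuming Apache with mod_headers (vhost-level config, purely illustrative):

# Hypothetical vhost-level sketch: mark these sections noindex via a response header
<LocationMatch "(?i)^/(local|handle|localsites|search)/">
  Header set X-Robots-Tag "noindex"
</LocationMatch>

Note that a URL blocked by Disallow: can't be crawled, so Google may never see a noindex signal on it; pick one mechanism per URL.
-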
Subdomain Placeholder
So long story short - we are rolling out a new website earlier than expected. Unfortunately, we are being rushed, and in order to make the deadline we have decided to create a www2 subdomain and release our HTML-only version of the site there for the next 2 weeks. During that time, the HTML site will be ported over to a Drupal 8 instance and resume its www domain. My question is - will a temporary (302) redirect from www to www2 and then back to www screw the proverbial pooch? Is there a better way to implement a temporary site? Feel free to probe with some questions - I know I could be clearer here 😉 Thanks community!
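For what it's worth, the temporary hop described here is usually done with an explicit 302 at the host level so nothing gets cached as permanent, and the same rule is simply reversed once Drupal is live. A hypothetical .htaccess sketch with placeholder hostnames:

# Hypothetical sketch: temporarily send www traffic to the www2 placeholder (302, not 301)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www2.example.com/$1 [R=302,L]

Because a 302 signals a temporary move, it avoids the permanence a 301 would imply while the placeholder is live.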
-
Duplicate content issues from mirror subdomain: facebook.domainname.com
Hey Guys,
Need your suggestions.
I have got a website that has a duplicate content issue.
A subdomain called facebook.asherstrategies.com comes from nowhere and is getting indexed.
Website link: asherstrategies.com
Subdomain link: facebook.asherstrategies.com
This subdomain is actually a mirror of the website, and I have no idea how it was created. I'm trying to resolve the issue but could not find a clue.
-
I need to put static text on every page (600 words) - need advice
I need to put static text (a brief about our company, roughly 600 words) into the content section of every page of our website. I know this is bad for SEO as duplicate content, but I need to tell Google that this is my static content and not to crawl it, or something like that. A canonical applies to the whole page, but I need something like that for one section of every page. Is that possible?
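There is no supported tag that excludes a single block of an otherwise indexable page from Google's index, so a canonical or a page-level noindex won't do what's described. One workaround people use is to serve the repeated blurb from its own URL, keep that URL out of the index, and embed it on every page, e.g. via an iframe. A purely hypothetical sketch (file name invented):

<!-- On every page: embed the boilerplate from its own URL -->
<iframe src="/static/company-blurb.html" title="About our company" loading="lazy"></iframe>

<!-- In the head of /static/company-blurb.html (hypothetical path): keep that URL out of the index -->
<meta name="robots" content="noindex">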
-
SEOMOZ crawler is still crawling a subdomain despite disallow
This is for our client with a subdomain. We only want to analyze their main website, as this is the one we want to SEO. The subdomain is not optimized, so we know it's bound to have lots of errors. We added the disallow code when we started and it was working fine: we only saw the errors for the main domain, and we were able to fix them. However, just a month ago the errors and warnings spiked, and the errors we saw were for the subdomain. As far as our web guys are concerned, the disallow code is still there and was not touched:
User-agent: rogerbot
Disallow: /
We would like to know if there's anything we might have unintentionally changed, or something we need to do, so that the SEOMOZ crawler will stop going through the subdomain. Any help is greatly appreciated!
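One thing worth checking, since it's easy to miss: robots.txt is read per host, so a rule in domain.com/robots.txt does not apply to the subdomain at all. For rogerbot to stay off the subdomain, the same block has to be served from the subdomain's own robots.txt; a sketch with a placeholder hostname:

# Served at subdomain.example.com/robots.txt (hypothetical host) - the main domain's file does not cover this host
User-agent: rogerbot
Disallow: /

If that file was recently removed or overwritten on the subdomain (for example during a deploy), that alone would explain a sudden spike in crawled pages.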
-
Better to have one subdomain or multiple subdomains?
We have quite a bit of content we are considering subdomaining. There are about 13 topic centers that could deserve their own subdomain, but there are also about 2000 original articles that we want to subdomain. We are considering:
a) putting the 13 centers (e.g. babies.domain.com, acne.domain.com, etc.) and the articles (on a variety of topics) on one subdomain, or
b) putting the 13 centers on their own subdomains and the remaining articles on their own subdomain as well (14 subdomains total).
What do you think is the best solution and why?
-
Does anyone have any tips for optimizing your Google Product Feeds?
How often do you submit them? What have you seen work? Are there any tricks aside from filling out all of the data fields?