Subdomain for a blog
-
My client has a site hosted with a company that allows very little customization; among other limitations, I am unable to add a blog to the site. As he has a fair amount of time and money invested in the site, he is reluctant to start over. So my question is this: his blog is currently hosted off-site. Would it benefit him if I had them add a CNAME or A record to show his blog at blog.mydomain.com? Or does Google recognize that it is still a separate site and treat it as such? Finally, does it matter how they set it up: CNAME, A record, or redirect? This is definitely not my area of expertise (if that is not already obvious from the question!).
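For reference, the two DNS approaches the host could use look roughly like this (a zone-file sketch; blog.mydomain.com and the blog host's name and IP are placeholders, not real values):

```text
; Option 1: CNAME record - point the subdomain at the blog host by name
blog.mydomain.com.   IN  CNAME  myblog.bloghost.example.

; Option 2: A record - point the subdomain at the blog host's IP directly
blog.mydomain.com.   IN  A      203.0.113.10
```

Either record makes the blog load at blog.mydomain.com, and from a search-engine perspective the two are equivalent. A redirect, by contrast, just forwards visitors to the blog's existing URL and keeps the old address in play, so the subdomain-versus-redirect decision is the one that matters.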
Thanks for your help!
Matthew
-
Glad to help! One day hopefully you will return the favor to someone in need.
-
It does. Thank you. That was pretty much my understanding but I just wanted to get the opinions of the experts! Thanks again.
-
blog.example.com will not enjoy the link juice that example.com/blog would. By using the subdomain, you are essentially starting a new site.
Hopefully that answers your question.
-
That is exactly what I was asking, although I asked it poorly, so thank you for interpreting my question as well. What would the other reasons to do this include? Branding issues?
-
Assuming I understand your question correctly (and admittedly I may not), you won't get search value from moving the blog to a subdomain.
However, there may be other reasons to do this.
Related Questions
-
Redirecting the Homepage to a Subdomain: Bad or Good Idea?
I have a very old forum that still gets a lot of traffic. When migrating to other, cloud-based software, we cannot redirect using the same domain, so the only option would be to change the CNAME on a subdomain and then permanently redirect all the traffic from the root domain. Would this be a bad move, given that the root domain wouldn't be used anymore and would exist only as a redirect so we can use the software we need? The domain is 17 years old.
Technical SEO | | vbsk0 -
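A root-to-subdomain permanent redirect of the kind described above is usually implemented with a server rule along these lines (an Apache .htaccess sketch; example.com and forum.example.com stand in for the real hostnames):

```apache
# .htaccess on the root domain: send every request to the subdomain, preserving the path
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
RewriteRule ^(.*)$ https://forum.example.com/$1 [R=301,L]
```

A 301 passes most link equity, so the subdomain should gradually consolidate the old domain's signals, but the root domain must keep resolving (and redirecting) for as long as old links and bookmarks point at it.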
Product Subdomain Outranking "Marketing" Domains
Hello, Moz community! I have been puzzling over what to do for a client. Here is the challenge: the client's "product"/welcome page lives at www.client.com. This page lets the visitor select the country/informational site they want, or log in to their subdomain/install of the product. Google is choosing this www.client.com URL as the main result for client brand searches. In a perfect world, searchers in the US would be served the client's US version of the informational/marketing site, which lives at https://client.com/us, and likewise for other country-level content (each also living in a directory for that country). It's a brand-new client; we've done geo-targeting within Search Console, and I'm kind of scared to rock the boat by de-indexing this www.client.com welcome screen. Any thoughts, ideas, or potential solutions are much appreciated. Thanks!
Technical SEO | | SimpleSearch0 -
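One common way to steer Google toward the right country version without de-indexing the welcome page is hreflang annotation across all versions, roughly like this (the /us URL follows the pattern in the question; the other URLs are illustrative only):

```html
<link rel="alternate" hreflang="en-us" href="https://client.com/us/" />
<link rel="alternate" hreflang="en-gb" href="https://client.com/uk/" />
<link rel="alternate" hreflang="x-default" href="https://www.client.com/" />
```

The x-default entry marks the country-selector/welcome page as the fallback for searchers outside the annotated regions, which lets it stay indexed while country-specific searches resolve to the matching directory.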
Google rejected my reconsideration request for an unnatural-links manual action, and listed one blog article twice as an example?
Hi Moz Community, On April 22 my site received a manual action in Google Webmaster Tools for unnatural links. After a deep clean of all the sitewide links, which I think were the major problem with my external links, I submitted a reconsideration request on May 4. Google rejected it on May 29 and listed one blog article twice as an example, which is quite weird to me. Is it normal for Google to list one URL twice as an example in the feedback? I don't quite see the reason for that. Does anybody have any idea? This is really quite frustrating. And to be honest, I don't see many problems with the article Google listed, either. Yes, it's all about our product and it has three do-follow links to our site, but it contains no words such as sponsor, advertisement, or rewards... And the blog itself is quite healthy as well. The post also gets rather high engagement, with organic comments and shares. How did Google flag that? I don't think it's possible that Google goes through all our site links one by one... Hope you guys can help me with that. Thanks in advance! Ben
Technical SEO | | Ben_fotor0 -
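When a cleanup like the one described can't get the offending links removed at the source, the usual next step before resubmitting is a disavow file. Its format is one directive per line (the domains and URL below are placeholders, not links from the question):

```text
# Links we could not get removed despite outreach
domain:spammy-directory.example
domain:sitewide-footer-links.example
http://blog.example.org/sponsored-post-about-our-product/
```

Documenting the removal attempts alongside the disavow file in the next reconsideration request tends to help Google's reviewers see the effort made.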
Will using a query string in the URL and swapping H1s for a filtered view of the blog impact SEO negatively?
This is a blog revamp in which we are trying to personalize the experience for two separate audiences. The user starts on the blog that shows all stories (the first screen) and can then filter to a more specific blog (the ESG or News blog). The filtered version for ESG or the News blog is created with a query string in the URL. We also swap out the page's H1s accordingly in this process. Will this impact SEO negatively?
Technical SEO | | lina_digital0 -
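The standard safeguard for query-string views like the ones described is a rel=canonical on each filtered URL pointing back at the unfiltered blog, so the filtered variants don't compete with it in the index (the URL and parameter name below are illustrative, assuming a filter parameter as described):

```html
<!-- On https://www.example.com/blog?filter=esg -->
<link rel="canonical" href="https://www.example.com/blog" />
```

With the canonical in place, swapping the H1 per filter is generally safe, since the variants are declared as views of the same page rather than competing documents.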
Odd scenario: subdomain not indexed or cached. Reason?
Hi all, hopefully somebody can help me with this issue 🙂 Six months ago a number of pages hosted at the domain level were moved to a subdomain with 301 redirects, and some others were created from scratch (at the subdomain level too). What happens is that not only are the new URLs at the subdomain level not indexed or cached, but the old URLs are still indexed in Google, although clicking on them leads to the new URLs via the 301 redirect. The question is: with 301 redirects to the new URLs and no issues with robots.txt, meta robots, etc., why are the new URLs still de-indexed? I might remind you that a few (100 pages or so) were created from scratch, but they are also not indexed. The only issue found across the pages is this no-cache header set:

Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache

I am not familiar with cache-control headers. Can this interfere with correct indexing? Thanks in advance, Dario
Technical SEO | | Mrlocicero0 -
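Those headers control caching, not indexing (an X-Robots-Tag header would be the one that blocks indexing), so it is worth confirming exactly what the subdomain sends and, if the no-cache headers are unwanted, stripping them server-side. A sketch, with a placeholder hostname, assuming Apache with mod_headers:

```apache
# First inspect what the server actually returns, e.g. from a shell:
#   curl -I https://sub.example.com/some-page/

# Then, in the subdomain's Apache config or .htaccess, replace the no-cache headers:
Header unset Pragma
Header always set Cache-Control "public, max-age=3600"
```

If the headers check out and there is still no X-Robots-Tag present, the indexing problem most likely lies elsewhere (e.g. the subdomain being treated as a new, low-authority site).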
Temporary 302 Redirect to a Subdomain for a Couple of Weeks?
Hi, To prevent a DDoS attack during the Olympic Games, the admins will use a service called Site Shield by Akamai (http://www.akamai.com/html/solutions/site_shield.html). The thing is that they will have to redirect all the traffic to a subdomain instead of the main one (http://www.website.com instead of http://website.com), and this for a couple of weeks (no negotiating here; it's too late and they have no choice). Will a 302 do the job? Will I lose authority? Would adding a canonical URL on every page of the site, pointing at the version without the www, help? Should I do something in Webmaster Tools to help? Thanks.
Technical SEO | | TVFreak0 -
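For a genuinely temporary move like this, the 302-plus-canonical combination described would look roughly as follows (an Apache sketch using the hostnames from the question):

```apache
# 302 (temporary) redirect from the bare domain to www for the duration
RewriteEngine On
RewriteCond %{HTTP_HOST} ^website\.com$ [NC]
RewriteRule ^(.*)$ http://www.website.com/$1 [R=302,L]

# On each www page, keep the canonical pointing at the original URL, e.g.:
#   <link rel="canonical" href="http://website.com/this-page" />
```

The 302 signals that the move is temporary and the canonical reiterates which URL should stay in the index, so authority should remain with the original URLs once the redirect is lifted.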
Best Way To Clean Up Unruly Subdomains?
Hi, I have several subdomains that present no real SEO value but are being indexed. They don't earn any backlinks either. What's the best way of cleaning them up? I was thinking the following: 1. Verify them all in Webmaster Tools. 2. Remove all URLs from the index via the Removal Tool in WMT. 3. Add a site-wide noindex, follow directive. Also, to remove the URLs in WMT, you usually have to block them via /robots.txt. If I'd like to keep Google crawling the subdomains while removing their URLs, is there a way to do so?
Technical SEO | | RocketZando0 -
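On that last point, there is a way: serve the noindex in the HTTP response rather than blocking in robots.txt, because a robots.txt block stops Googlebot from ever fetching the pages and seeing the noindex. A minimal sketch for a subdomain's server config, assuming Apache with mod_headers:

```apache
# Emit "noindex, follow" on every response from this subdomain;
# leave robots.txt open so Googlebot can still crawl and read the directive
Header set X-Robots-Tag "noindex, follow"
```

Once the pages drop out of the index, the robots.txt can optionally be closed to save crawl budget.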
Very well-established blog: new posts now being indexed very late
I have an established blog. We update it on a daily basis. In the past, when I would publish a new post, it would get indexed within a minute or so. But for about a month now it has been taking hours, sometimes 10-12 hours, for new posts to get indexed. The only thing I have changed is robots.txt. This is the current robots file:

User-agent: *
Disallow: /cgi-bin
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: /wp-login.php
Disallow: /*wp-login.php*
Disallow: /trackback
Disallow: /feed
Disallow: /comments
Disallow: /author
Disallow: /category
Disallow: */trackback
Disallow: */feed
Disallow: */comments
Disallow: /login/
Disallow: /wget/
Disallow: /httpd/
Disallow: /*.php$
Disallow: /*?*
Disallow: /*.js$
Disallow: /*.inc$
Disallow: /*.css$
Disallow: /*.gz$
Disallow: /*.wmv$
Disallow: /*.cgi$
Disallow: /*.xhtml$
Disallow: /*?*
Disallow: /*?
Allow: /wp-content/uploads

User-agent: TechnoratiBot/8.1
Disallow:

# ia_archiver
User-agent: ia_archiver
Disallow: /

# disable duggmirror
User-agent: duggmirror
Disallow: /

# allow google image bot to search all images
User-agent: Googlebot-Image
Disallow: /wp-includes/
Allow: /*

# allow adsense bot on entire site
User-agent: Mediapartners-Google*
Disallow:
Allow: /*

Sitemap: http://www.domainname.com/sitemap.xml.gz

The site has tons of backlinks. I'm just wondering if something is wrong with the robots file, or if it could be something else.
Technical SEO | | rookie1230
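One thing worth checking in a file like the one above: broad rules such as Disallow: /*?* and Disallow: /feed can block the feed and query-string URLs that crawlers often use to discover new posts quickly, which would fit the symptom of delayed indexing. A leaner WordPress-era robots.txt that avoids those wildcards might look like this (a sketch, not a drop-in replacement; the sitemap URL is the placeholder from the question):

```text
User-agent: *
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-login.php
Allow: /wp-content/uploads

Sitemap: http://www.domainname.com/sitemap.xml.gz
```

Re-testing indexing speed after trimming the file would show whether the robots rules were actually the cause.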