Reverse proxy a successful blog from subdomain to subfolder?
-
I have an ecommerce site that we'll call confusedseo.com. I created a WordPress blog and CNAME'd it to blog.confusedseo.com. Since then, the blog has earned a PageRank of 3 and a decent amount of organic traffic.
I am considering a reverse proxy to forward blog.confusedseo.com to confusedseo.com/blog/. As I understand it, this will greatly help the "link juice" of the root domain. However, I'm concerned about any potential harm done to the existing SEO value of the blog. What, if anything, should I be doing to ensure that the reverse proxy doesn't hurt my "juice" rather than help it?
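For context, the kind of reverse proxy being described can be sketched roughly like this, assuming an Apache front end with mod_proxy enabled (the domain names come from the question; everything else is illustrative, not a drop-in config):

```apache
# Hypothetical vhost for the root domain. Requests for
# confusedseo.com/blog/... are fetched server-side from the WordPress
# install at blog.confusedseo.com and served under the /blog/ URL.
<VirtualHost *:80>
    ServerName confusedseo.com

    ProxyPass        /blog/ http://blog.confusedseo.com/
    ProxyPassReverse /blog/ http://blog.confusedseo.com/
</VirtualHost>
```

The visitor (and Googlebot) only ever sees the /blog/ URL; the sub-domain is fetched behind the scenes.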
-
Hey, I have a question on this:
We have set up a separate Google Analytics ID and Google Search Console property for the sub-domain, and now we are using a reverse proxy to serve it under a sub-directory.
So what happens to the GA tracking and Google Search Console data in this case?
You can read my full question here:
-
Hi there,
I'm investigating the same reverse proxy solution for my eCommerce blog. Was your implementation successful?
-
Canonical will pass link juice almost exactly like 301s will, so there's no harm in going that route. Matt Cutts explains that in this video: http://www.youtube.com/watch?v=zW5UL3lzBOA
You sound like you're good to go. You've got duplicate content worked out, and you've got a plan to retain link juice (canonical).
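For anyone wondering what that looks like in practice, a rel=canonical on the sub-domain copy of a post would be something like the following (the URLs are illustrative, based on the domains in the original question):

```html
<!-- In the <head> of the sub-domain version of a post: point the
     canonical at the sub-folder URL so consolidated link signals
     flow to the root domain. -->
<link rel="canonical" href="http://confusedseo.com/blog/some-post/" />
```

With Yoast, this is set per-site rather than hand-edited into templates, but the emitted tag is the same idea.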
-
Since the subdomain does still exist live, someone doing a reverse proxy would need to take some steps to mitigate duplicate content issues. The first would be to set up the new permalinks and rel canonical tags via WordPress and Yoast's SEO plugin (which rocks, btw). Then you would need to do the robots.txt/GWT steps that you quoted. If there's anything else that needs doing, I am definitely all ears before I attempt this.
-
Ah! I misunderstood the bit about reverse proxying. In that case... to be perfectly honest, I'm not sure.
When you set up a reverse proxy, what happens to the sub-domain? Does it go away, or does it still exist live? If it remains live, you'd end up with a duplicate content issue.
EDIT >> I found this at the source you linked to (which answers my question) -->
"The next thing you can do is add a robots.txt file to the sub-domain that stops robots from indexing it. As Reverse Proxying keeps the requested URL the /blog/ URLs will use the robots.txt from the main domain rather than the sub-domain.
The final (and most extreme) thing you can do is to register Google Webmaster Tools for the sub-domain and remove it from the index. If you are doing this, you need to do it in conjunction with robots.txt."
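The robots.txt described in that quote would be as simple as this, served only at the sub-domain (a sketch; as the quote notes, the proxied /blog/ pages pick up the main domain's robots.txt instead, so they stay indexable):

```
# Served at http://blog.confusedseo.com/robots.txt only
User-agent: *
Disallow: /
```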
-
Thanks for your response, Philip. My research indicates that a 301 redirect on a location that is being reverse proxied would result in an infinite loop. (source) I haven't tested it to confirm, though. Is that true?
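One untested sketch of a way around that loop (an assumption on my part, not something confirmed in this thread): have the proxy mark its own requests with a custom header, and only 301 requests that arrive without it. The header name and domains here are illustrative.

```apache
# On the main-domain reverse proxy (needs mod_headers):
#   RequestHeader set X-Via-Proxy "1"

# In the blog sub-domain's .htaccess: 301 real visitors to the
# sub-folder, but serve proxied requests (which carry the header)
# normally, so the proxy never receives a redirect back.
RewriteEngine On
RewriteCond %{HTTP:X-Via-Proxy} ^$
RewriteRule ^(.*)$ http://confusedseo.com/blog/$1 [R=301,L]
```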
-
You need to set up 301 redirects for ALL of the pages and posts on the blog sub-domain to their new locations in the sub-folder. This is very important. Without the proper redirects in place, you will lose all value from links pointing to the blog sub-domain, plus all the history, authority, and rankings that the pages have earned.
As for your reasoning to move it from a sub-domain to a sub-folder, I'm not sure you'll receive any sort of link juice boost on your root domain from doing this. Maybe someone else can prove me wrong/correct me...
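A minimal sketch of such a blanket redirect in an Apache .htaccess on the sub-domain (domains are from the question; bear in mind the redirect-loop concern raised elsewhere in this thread if a reverse proxy is also pulling content from the same host):

```apache
# 301 every URL on blog.confusedseo.com to the same path under
# /blog/ on the root domain, preserving the request path.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^blog\.confusedseo\.com$ [NC]
RewriteRule ^(.*)$ http://confusedseo.com/blog/$1 [R=301,L]
```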