Should I use the Change of Address in Search Console when moving subdomains to subfolders?
-
We have several subdomains for various markets for our business. We are in the process of moving those subdomains to subfolders on the main site.
Example: boston.example.com will become example.com/boston
And seattle.example.com will become example.com/seattle and so on.
It's not truly a change of address, but should I use the Change of Address tool in GSC for each of these subdomain moves?
-
Hey there,
The Change of Address tool isn't the right fit here: it's designed for moving a site to a different domain, not for consolidating subdomains into subfolders on the same domain. And since you are effectively retiring the subdomains, GSC won't collect any new data for them anyway.
I'd recommend setting up separate URL-prefix properties in GSC for each folder, mirroring what you had for the subdomains. That way you can keep tracking each market separately, just as before.
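For the moves themselves, the key step is a permanent 301 redirect from every subdomain URL to its subfolder equivalent, so signals consolidate onto the main site. A minimal sketch of what that could look like in nginx (server names taken from your examples; adapt to your actual setup and TLS config):

```nginx
# Sketch: 301 each market subdomain to its subfolder on the main site.
server {
    listen 80;
    server_name boston.example.com;
    return 301 https://example.com/boston$request_uri;
}

server {
    listen 80;
    server_name seattle.example.com;
    return 301 https://example.com/seattle$request_uri;
}
```

Using $request_uri preserves the path and query string, so deep URLs redirect to their exact counterparts rather than everything landing on the folder root.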
Cheers, Martin
Related Questions
-
URL Change Best Practice
I'm changing the URL of some old pages to see if I can't get a little more organic traffic out of them. After changing the URL, and maybe the title/description tags as well, I plan to have Google fetch them. How does Google know that the old URL is 301'd to the new URL, and that the new URL is not just a page of duplicate content? Thanks... Darcy
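For what it's worth, Google discovers the relationship by recrawling: fetching the old URL returns a 301 status plus a Location header pointing at the new URL, which tells it this is a move rather than a duplicate. In Apache, per-page 301s can be sketched like this (the paths are hypothetical):

```apache
# .htaccess sketch: permanent per-page redirects (paths are hypothetical)
Redirect 301 /old-widgets-page.html /widgets
Redirect 301 /services/old-consulting.html /consulting
```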
Which Google Advanced Search Query To Use?
Hi, basically I want to find sites that mention a specific exact keyword on the page, e.g. "BMW", but where the same keyword "BMW" is not contained in the title tag of the page. Is there an advanced search query to do this? I did try “BMW” Intitle:"-bmw" with no luck. I also have Scrapebox if there is a way to do this through that. Cheers, Mark
An improved search box within the search results - Results?
Hello~ Does anyone have any positive traffic results to share since implementing this? Thanks! MS
Do Local Search Efforts (Citations, NAP, Reviews) have an impact on traditional organic search listings (the listings without the A, B, C map icons)?
Do citations, NAP consistency, reviews, and other local search efforts affect traditional organic listings? Can anyone elaborate?
Subdomain blog vs. subfolder blog in 2013
So I've read the posts here: http://moz.com/community/q/subdomain-blog-vs-subfolder-blog-in-2013 and many others, the Matt Cutts video, etc. Does anyone have direct experience that it's still best practice to use the subfolder? (Hopefully a Moz employee can chime in?) I have a client looking to use HubSpot. They are preaching with the Matt Cutts video. I'm in charge of SEO / marketing and am at odds with them now. I'd like to present the client with more than "in my experience, I've seen subdirectories work in the past." Any help? Articles? etc.?
Sitemaps and subdomains
At the beginning of our life-cycle, we were just a WordPress blog. However, we just launched a product built in Ruby. Because we did not have time to put together an open-source Ruby CMS platform, we left the blog in WordPress and the app in Rails. Thus our web app is at http://www.thesquarefoot.com and our blog is at http://blog.thesquarefoot.com. We set up redirects such that if a URL does not exist at www.thesquarefoot.com, it automatically forwards to blog.thesquarefoot.com. What is the best way to handle sitemaps? Create one for blog.thesquarefoot.com and one for http://www.thesquarefoot.com and submit them separately? We had landing pages like http://www.thesquarefoot.com/houston in WordPress, which ranked well for "find Houston commercial real estate"; they have been replaced with a landing page in Ruby, so that URL works well. The URL that was ranking well for this term is now at blog.thesquarefoot.com/houston/. Should I delete this page? I am worried that if I do, we will lose rankings, since that was the actual page ranking, not the new one. Until we are able to create an open-source Ruby CMS and move everything over to a subdirectory so everything lives in one place, I would love any advice on how to mitigate damage and not confuse Google. Thanks
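On the sitemap question raised above, the usual pattern is one sitemap per hostname, each submitted under its own GSC property and referenced from its own host's robots.txt, since a sitemap may only list URLs on the host that serves it. A sketch (file names are illustrative):

```text
# robots.txt served at http://www.thesquarefoot.com/robots.txt
User-agent: *
Disallow:
Sitemap: http://www.thesquarefoot.com/sitemap.xml

# robots.txt served at http://blog.thesquarefoot.com/robots.txt
User-agent: *
Disallow:
Sitemap: http://blog.thesquarefoot.com/sitemap.xml
```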
Block search engines from URLs created by internal search engine?
Hey guys, I've got a question for you all that I've been pondering for a few days now. I'm currently doing an SEO technical audit for a large-scale directory. One major issue they are having is that their internal search system (Directory Search) creates a new URL every time a user enters a search query. This creates huge amounts of duplication on the website. I'm wondering if it would be best to block search engines from crawling these URLs entirely with robots.txt? What do you guys think? Bearing in mind there are probably thousands of these pages already in the Google index? Thanks Kim
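To illustrate the robots.txt approach mentioned above: if the directory's search pages live under a path like /search with a query parameter (both hypothetical here; substitute the site's real URL pattern), the block could be sketched as:

```text
User-agent: *
# Keep crawlers out of internal search results (path/parameter are hypothetical)
Disallow: /search
Disallow: /*?q=
```

One caveat: robots.txt prevents crawling but won't remove the thousands of URLs already indexed. A meta robots noindex on the search-results template, left crawlable until those pages drop out of the index, is the usual way to clear them first.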