Domain migration strategy
-
Imagine you have a large site on an aged and authoritative domain.
For commercial reasons the site has to be moved to a new domain, and in the process it's going to be revamped significantly. Obviously it's not ideal to be biting off so much all at once, but it's unavoidable.
The plan is to run the new site in beta for about 4 weeks, giving users the opportunity to play with it and provide feedback. After that there will be a hard cut over with all URLs permanently redirected to the new domain.
The hard cut over is necessary for business continuity reasons, and because of the real complexity in trying to maintain a complex UI and client reporting across multiple domains. Of course we'll endeavour to mitigate the impact of the change by telling Google about it in Webmaster Central and monitoring crawl errors, etc.
My question is whether we should allow the new site to be indexed during the beta period?
My gut feeling is yes for the following reasons:
-
It's only 4 weeks, and until such time as we start redirecting the old site the new domain won't have much whuffie, so there's next to no chance the site will rank for much of anything.
-
Give Googlebot a head start on indexing a lot of URLs, so they won't all be new when we cut over the redirects.
Is that sound reasoning? Is the duplication during that 4 week beta period likely to have some negative impact that I am underestimating?
-
-
I wouldn't sweat it. We left up www.bulwarkpest.com for several months while moving to www.bulwarkpestcontrol.com .... I know that there is some risk in it. But I think Google is pretty understanding of site migrations. Of course I am just a small pest control guy so they may not have ever noticed. Sooo.. take that with a grain of salt.
It does make it easier to have the other site live so that you can redirect on a per-page basis and know that it's working. I would rather make sure the redirects are correct and working before moving the entire site over. But be warned: site redirects may not always carry over the same authority... research the online Yellow Pages.
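The per-page check described above can be sketched with a small script. This is just a sketch, assuming a straight 1:1 URL mapping between the domains; the hostnames below are hypothetical placeholders, not the real sites:

```python
# Sketch: compute the URL each old page should 301 to, assuming a
# 1:1 path mapping between the old and new (hypothetical) domains.
from urllib.parse import urlsplit, urlunsplit

OLD_HOST = "www.old-example.com"   # placeholder for the aged domain
NEW_HOST = "www.new-example.com"   # placeholder for the new domain

def expected_target(old_url: str) -> str:
    """Return the URL an old page should redirect to, keeping path and query."""
    parts = urlsplit(old_url)
    return urlunsplit((parts.scheme, NEW_HOST, parts.path, parts.query, ""))

print(expected_target("http://www.old-example.com/widgets?page=2"))
# → http://www.new-example.com/widgets?page=2
```

In practice you would then issue a HEAD request per old URL (e.g. with urllib.request or the requests library) and confirm a single 301 hop to `expected_target(url)` before flipping the cut-over switch.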
-
My opinion of risk goes up much higher if this is a directory vs a site with original content articles.
-
Hi Aran, thanks for your response.
My thinking has also evolved a bit and I'm now thinking we ought to exclude the new site until we're ready to cut over as @EGOL suggested.
The critical info I didn't mention before is that there are important client ROI and reporting reasons why we need the current site to continue performing right up until the cut over, at which point the 301s will be implemented. A cross-domain canonical would address the duplication, but it would also start to devalue the current pages prematurely.
The thing that I was underestimating before was the negative impression that the new domain would give Google when it suddenly appeared with 1M+ pages of duplicate content plus no real link profile of its own (until we implement the 301s)...all the hallmarks of a scraper.
Better I think to avoid this by excluding the beta until we cut over, and make sure we prep well for that.
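For what it's worth, excluding the beta can be as simple as serving a blanket disallow on the beta host (a sketch; the directive applies to whatever hostname the beta runs on). Note that robots.txt only blocks crawling; if stray links to the beta appear, an `X-Robots-Tag: noindex` response header is the stricter option. Either way it must be removed at cut-over so Googlebot can reach the new pages:

```
# robots.txt served only on the beta host; remove at cut-over
User-agent: *
Disallow: /
```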
-
Agreed, though Charles could use canonical tags to tell Google that the new pages are authoritative. This may take a while to be indexed, but should prevent any detrimental effects with duplicate content.
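For reference, a cross-domain canonical is just a link element in the head of each old page pointing at its counterpart on the new domain (the URLs below are hypothetical):

```html
<!-- In the <head> of http://www.old-example.com/widgets -->
<link rel="canonical" href="http://www.new-example.com/widgets" />
```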
-
Thanks very much for your thoughts. The root of my uncertainty is indeed how Google in particular views duplicated content today.
What if I told you that the site was a business directory and that the new site would be a big improvement in terms of on-page optimization? By which I mean new/different (and much better) page titles and improved internal linking. I mention this only because the new site won't be a direct replica of the old one. Does that make a difference?
-
I have no factual data on this... just going with my gut....
Based upon how Google is acting these days I would not take chances with having two copies of the same site in the SERPs for an entire month. I would not want to see any pages on the new site filtered for being duplicates.
Most people can't get a new site indexed quickly, but redirected domains normally go over fairly well. So I would be pleased with that and not take chances.
Safety might be better than going for some unknown gain.