Need to move our highest-traffic content pages onto a subdomain and want to minimize the loss of traffic - details inside!
-
Hi All!
So the company that I work for owns two very strong domains in the information security industry. Each site has a section that draws a ton of long-tail SEO traffic.
On our corporate site we have a vulnerability database where people search for vulnerabilities to research them and find out how to remediate them. On our other website we have an exploit database where people can look up exploits to see how to patch an attacker's attack path.
We are going to move these into a single super database under our corporate domain, and I want to maintain the traffic or at least minimize the loss. The exploit database, which is currently on our other domain, yields about three quarters of that domain's traffic. It is obviously OK if that traffic goes directly to the new subdomain.
What are my options to keep our search traffic steady for this content? There are thousands and thousands of these vulnerabilities and exploits, so it would not make sense to 301 redirect all of them. What are some other options, and what would you do?
-
Hello Pat,
I do not have experience merging a Ruby site with another type of site, but I think we are conflating two issues here anyway. You can have content in a database that gets served up anywhere; you could pull that content into ten different websites if you wanted to. The database question is almost irrelevant to the SEO question, which mainly comes down to loss of PageRank from URL changes and possible duplicate content. A 301 redirect from each old URL to its new one would take care of both of those issues.
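To make that concrete, here is a minimal sketch of what those 301s could look like. In practice you would probably handle this at the web-server level, but since the exploit database runs on Ruby, the same idea as a tiny Rack app would look something like this. The /exploits/:id pattern and the db.example.com subdomain are placeholders; swap in your real URL structure:

```ruby
# config.ru - run with `rackup` in place of the old exploit pages.
# The URL pattern and target subdomain are assumptions, not your real ones.
app = lambda do |env|
  path = env["PATH_INFO"]
  if (m = path.match(%r{\A/exploits/(\d+)\z}))
    # Old exploit page: permanently redirect to its new home on the corporate subdomain
    [301, { "location" => "https://db.example.com/exploits/#{m[1]}" }, []]
  else
    # Anything else that no longer exists: tell crawlers it is gone for good
    [410, { "content-type" => "text/plain" }, ["Gone"]]
  end
end

run app
```

Because the old URLs map to the new ones by a pattern, you are really writing a handful of rules, not thousands of individual redirects.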
If you are unable to redirect all of the old content, my suggestion would be to figure out which URLs have external links and redirect those. Let all of the others return a 404 or 410 status code so they drop out of the index; since the content will exist at a new URL, you do not want two URLs with the same content indexed at the same time.
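If it helps, here is a rough sketch of how you might pull that shortlist together. It assumes you have a backlink export as a CSV with a "Target URL" column (from Open Site Explorer or a similar tool) plus a plain-text list of every old database URL; the file names and column name below are placeholders:

```ruby
#!/usr/bin/env ruby
# Print only the old URLs that actually have external links pointing at them;
# these are the ones worth a 301, everything else can return 404/410.
# File names and the "Target URL" column are assumptions - match your own export.
require "csv"
require "set"

linked = Set.new
CSV.foreach("backlink_export.csv", headers: true) do |row|
  linked << row["Target URL"].to_s.strip.sub(%r{/\z}, "")
end

File.foreach("all_old_urls.txt") do |line|
  url = line.strip.sub(%r{/\z}, "")
  puts url if linked.include?(url)
end
```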
Please let us know if we have misunderstood the question or if we can provide more help with your original question. You may want to post your Ruby question in a separate thread to make sure the right people see it.
Thanks!
-
Hi Chris,
Sorry for the confusion. The plan is to merge both databases (the vulnerability database on our corporate site and the exploit database on our other website) into one and place it on a subdomain of our corporate site. Right now the exploit database on our second website gets a LOT of traffic; it contributes about three quarters of that domain's traffic. I would like to minimize the traffic loss when moving it onto this subdomain and am looking for ways to do that.
@ryan - I am not sure exactly why, but our web producer told me that we need to use a subdomain and cannot put this on our main domain. I will follow up with her to find out why.
Update - it turns out one of the databases is written on a different platform (Ruby), so it cannot be hosted on the same server, and changes are harder to make as a result. I guess this could still be done, but it may be a little harder to update - does anybody have experience with this?
Thanks for the help guys!
Pat
-
Would like to offer an opinion but can't quite figure out what you're saying in paragraph 3.
-
Not quite sure that I understand the need to put these on a subdomain. Why not have both of them live on the corporate domain? One of them already exists on your corporate site, so you can keep that database/search there and move the other over to a similar location. Yes, that would require a ton of 301 redirects, but that should be OK given the scope of the project.
In my experience, when you move to a new domain or even a subdomain, you always see some traffic loss that never really comes back (unless you are naturally growing anyway). Keep the main company domain going, put everything under a folder off the root, and don't worry about the subdomain at all.
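On the Ruby hosting point: the exploit database can keep running on its own server and still live under a folder off the root of the corporate domain; you would just proxy that path through. A very rough Ruby/Rack sketch of the idea is below - the internal hostname and /exploits path are made up, and a real setup would normally do this in the web server itself rather than in application code:

```ruby
# config.ru - GET-only proxy sketch: serve /exploits/* from the separate Ruby
# app server while the URLs stay on the corporate domain. Hostname is made up.
require "rack"
require "net/http"

app = lambda do |env|
  request = Rack::Request.new(env)
  backend = URI("http://exploit-db-internal.example.com#{request.fullpath}")
  response = Net::HTTP.get_response(backend)
  [response.code.to_i,
   { "content-type" => response["Content-Type"].to_s },
   [response.body.to_s]]
end

run app
```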