Best way to merge 2 ecommerce sites
-
Our client owns two ecommerce websites.
Website A sells 20 related brands. Its search rankings are improving, but it still normally appears on the second to fourth page of Google.
Website B was purchased from a competitor. It sells one brand (also sold on site A). Its search results normally rank high on the first page of Google.
Client wants to consider merging the two sites. We are looking at options.
Option 1: Do nothing. Site B continues to dominate its brand, but this does nothing to boost site A.
Option 2: Keep both sites running, but add cross-domain canonical tags on site B pointing to the equivalent pages on site A.
Option 3: Close down site B and set up 301 redirects from its pages to site A.
Option 4: ???
Any thoughts on this would be great. We want to do this in a way that boosts site A as much as possible without losing sales of the one brand that site B sells.
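For context on option 2, a cross-domain canonical is just a regular canonical tag whose target lives on the other domain. A minimal sketch (the domains and paths here are made-up placeholders, not the client's actual URLs):

```html
<!-- In the <head> of a product page on site B, pointing to the
     equivalent product page on site A. URLs are hypothetical. -->
<link rel="canonical" href="https://www.site-a.example/brand-x/product-123" />
```

Worth noting that search engines treat cross-domain canonicals as a hint rather than a directive, so consolidation is not guaranteed the way it is with a 301.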
-
I work on a big ecommerce client which purchased another ecommerce website. The strategy they ended up going with was to keep running both websites, since both had such a good name in the market and both had decent rankings. (Option 1)
We can also aim for SERP domination on specific keywords, but I understand the client's request to take one site down.
I think if you implement cross-site 301s and move content to the main site, it is likely that there will be some loss in rankings. (Option 3)
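For anyone implementing the cross-site 301s in option 3, a minimal Apache .htaccess sketch might look like the following. The domains and paths are hypothetical placeholders; the same mapping can be done in nginx or at the application level, and ideally each old URL is mapped to its closest equivalent rather than dumped on the home page:

```apache
# .htaccess on site B (hypothetical domains and paths).
RewriteEngine On

# Page-level redirects to the closest equivalent page on site A.
RewriteRule ^brand-x/product-123$ https://www.site-a.example/brand-x/product-123 [R=301,L]

# Pattern-based redirects for a whole section.
RewriteRule ^brand-x/(.*)$ https://www.site-a.example/brand-x/$1 [R=301,L]

# Catch-all for anything unmapped (least preferred; use sparingly).
RewriteRule ^(.*)$ https://www.site-a.example/ [R=301,L]
```

Page-level mapping matters because blanket redirects to the home page tend to be treated like soft 404s and pass little of the old pages' equity.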
-
I will try to provide some useful information as a follow-up.
Sales on site B are slowly dying. The brand that this website sells is no longer in business. It has great rankings mainly because no one else is competing for the brand and the site specializes in it. It has a good link profile, with other sites pointing to it as a place to get the brand it sells (as its products are now hard to find).
Site A is the site with the longer-term potential and is the one we are trying to build up.
If a merger would only help site A sell site B's brand, then there is no reason to do it. The desire is to boost site A's overall rankings.
-
In my opinion, any answers given without knowledge of sales and profit data are dangerous.
Also, relative strength of the sites and their competitive markets should be considered.
If either of these is an important site that supports a business, someone who really knows what they are doing should be studying the options, which are greater in number than three or four.