Duplicate Content across 4 domains
-
I am working on a new project where the client has 5 domains each with identical website content. There is no rel=canonical.
There is great variation in the number of pages indexed for each of the domains (from 1 to 1,250). OSE shows a range of linking domains from 1 to 120 for each domain.
I will be strongly recommending that the client focus on one website and 301 everything from the other domains. My instinct is to focus on the domain that has the most pages indexed and the most referring domains, but I've noticed the client has started using one of the other domains in their offline promotional activity, and it is now their preferred domain.
What are your thoughts on this situation? Would it be better to 301 to the client's preferred domain (and lose a level of ranking power through the 301 reduction factor, plus wait for the other pages to get indexed), or stick with the highest-ranking/most-linked domain even though it doesn't match the client's preferred domain used for email addresses etc.?
Or would it be better to use cross-domain canonical tags?
Thanks
-
EGOL - thanks for the advice. Yes, there are lots of links between the domains. Although I also have clients who have done this deliberately for perceived gain, I think in this case the client has made an honest mistake by simply applying their CMS (with relative links) to each of the domains they thought they should purchase. It's confused a bit further by one section using https with an absolute domain, so users can end up migrating from one domain to another and from http to https!
As an SEO, my inclination is also to 301 page-by-page to the best-ranking site. However, as I mentioned to Thomas (above), I think the client will probably want to go with their preferred domain, and as such I'll 301 page-by-page to that one. I'll discuss it with the client and post the outcome.
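In case it helps anyone following the thread, here's a rough sketch of what those page-by-page 301s could look like (a minimal example, assuming the sites run on Apache with mod_rewrite enabled; `preferred-domain.com` and the old-page paths are placeholders, not the client's real URLs):

```apache
# In the .htaccess / vhost config of each non-preferred domain:
RewriteEngine On

# Send every request to the preferred domain over https, preserving
# the path so each page 301s to its equivalent on the new domain.
# This also cleans up the http/https mixing mentioned above.
RewriteCond %{HTTP_HOST} !^www\.preferred-domain\.com$ [NC,OR]
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.preferred-domain.com/$1 [R=301,L]

# If URL structures differ between the sites, map pages individually
# instead of relying on the catch-all above:
# Redirect 301 /old-page.html https://www.preferred-domain.com/new-page
```

Since the CMS uses relative links, a path-preserving catch-all like this usually covers most pages; only URLs that differ between the sites need individual `Redirect 301` lines.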
-
Thomas - thanks for your reply. All the domains are very similar - variations from the same core concept. I have a feeling that the client will want to standardise on their preferred domain and as you say it is the easiest to remember.
-
Lots of businesses build multiple sites because they think it is a good way to kick ass on their competitors. It can be really hard, or even impossible, to get them to give up on that idea.
I am going to guess that you have a second issue... they have links on all of these sites pointing to each other because they think it will help their PageRank and SERP position. It's really hard to talk people out of this practice too.
If I owned these five sites, I would shut down four of them and redirect them page-by-page to my best-ranking site... and that would become my preferred domain. The client's attachment to a different domain is something else that you might try to talk them out of.
Good luck!
-
Wow... so many options. It is really hard to give a good answer without knowing what the domains are and the differences in the domain names themselves, but here are a few thoughts.
If the domains provide some distinct differences, whether that be target audience (grandmaroadbikes.com) or branding (ZippzRoadBikes.com) then I would start with the cross domain canonical tags. There are many sites that stand on their own while using cross-domain canonicals and maintaining their own brand.
How similar are the domain names? If the domains are too similar, and unique branding cannot really be achieved, then I would consider the consolidation. My gut reaction as an advertiser is to stick with the domain that's easiest to say and type. I assume the domain being marketed offline wins that prize, so I would focus on promoting it. Further, many people will lose a bit of trust if they type in www.AdvertisedDomain.com and get www.DomainUnAdvertised.com. Plus you don't want to cause brand confusion.
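To illustrate the cross-domain canonical option (a sketch reusing the made-up bike-shop domains above): each duplicate page on a secondary domain carries a canonical tag in its `<head>` pointing at the equivalent page on the primary domain, so the secondary site keeps its brand while consolidating ranking signals:

```html
<!-- On a duplicate page at www.ZippzRoadBikes.com/road-bikes/ -->
<head>
  <link rel="canonical" href="http://www.grandmaroadbikes.com/road-bikes/">
</head>
```

Bear in mind the canonical tag is a hint rather than a directive, so it's worth checking in Webmaster Tools which version Google actually chooses to index.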