Will having duplicate content on four websites cause a problem?
-
A client of ours has four websites for different shops they run in the surrounding area. Each website has original content as well as duplicate content; this is for things like product advice, which needs to be the same across all four sites.
Will having duplicate content on these four websites cause a problem? How can it be mitigated? We can't refer the visitor to another website to get the product information, as this would break the user experience, and of course shopping cart sessions won't carry over.
-
Hi Tom,
Are they on the same server and linking to each other? If so, then just canonical them. I'm also assuming from what you said that it's just a few pages, so it shouldn't be a problem. That, or just edit the pages a wee bit.
If they are on different servers and can't practically be identified as related to each other, then I wouldn't even worry about it. Just being practical.
-
Hi Tom
It might also be worth checking whether the client's e-commerce platforms will allow you to add tags into the head of a page. Some ERP or cloud-based platforms don't, so it's worth checking from the get-go.
Bruce
-
Hi Tom
The short answer is that it probably won't be a problem. From what you are saying, the duplication is 'natural' in the sense that it is information you might normally expect to see duplicated, since it relates to similar products across multiple sites (think also of privacy policy or terms and conditions pages). In this case it is unlikely to attract a penalty.
Matt Cutts covered this topic in a video (posted on SearchEngineLand): Duplicate Content Won't Hurt You Unless It's Spammy.
However, it will probably mean you are leaving it up to the search engine to decide which 'version' of your duplicate content it should prioritise and serve up to people searching. If it is not important to rank for the content on these duplicate pages then again it is not really an issue.
However, if you want to play it safe, or you aim to get rankings for one specific page among all the duplicate versions, you can use a rel=canonical tag to let the search engines know which page is the "original" so that they will prioritise it (i.e. point the link juice at a specific page). Matt Cutts talks about that in another video (although there he talks about it in the context of a news article).
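For reference, the tag is a single line placed in the head of each duplicate page, pointing at whichever version you want the engines to treat as the original. A minimal sketch (the URL here is only a hypothetical example):

```html
<!-- Placed in the <head> of each duplicate product-advice page. -->
<!-- The href points at the version you want search engines to rank; -->
<!-- the URL below is a hypothetical example, not the client's. -->
<link rel="canonical" href="https://www.example-shop.com/product-advice/caring-for-marble/" />
```

Google treats this as a strong hint rather than a directive, so the duplicate pages stay fully usable for visitors while the canonical version consolidates the ranking signals.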
Check out this Moz article on duplicate content; it also has a short explanation of how to use rel=canonical.
All the best
Neil
-
Use the canonical tag: it will let you keep the pages where they are (as well as the visitors) while telling Google which page is the original. As for harm, it depends on how many pages are involved; I wouldn't see it being the greatest harm, but the tag is so easy to add that you might as well do it. A quick heads-up, though: by putting the tag on the pages, one page will rank but the others will not, so be aware of that.
Hope that helps.