Any experience regarding what % is considered duplicate?
-
Some sites (including one or two I work with) have a legitimate reason to have duplicate content, such as product descriptions. One way to deal with duplicate content is to add other, unique content to the page.
It would be helpful to have guidelines regarding what percentage of the content on a page should be unique. For example, if you have a page with 1,000 words of duplicate content, how many words of unique content should you add for the page to be considered OK?
I realize that a) Google will never reveal this and b) it probably varies a fair bit based on the particular website. However...
Does anyone have any experience in this area?
(Example: You added 300 words of unique content to all 250 pages on your site, that each had 100 words of duplicate content before, and that worked to improve your rankings.)
Any input would be appreciated!
Note: Just to be clear, I am NOT talking about "spinning" duplicate content to make it "unique". I am talking about adding unique content to a page that has legitimate duplicate content.
-
Check out this video: http://www.seomoz.org/blog/whiteboard-friday-dealing-with-duplicate-content
It will give you a much more thorough answer than just a percentage of uniqueness. But if you want that kind of number, most of the guesses I hear fall between 20% and 40% unique content.
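For what it's worth, those percentages refer to the share of a page's text that is unique. As a rough illustration only, here is a word-shingle overlap sketch of how you might estimate that share for your own pages; this is a heuristic of my own, not any documented Google threshold, and the example strings are made up:

```python
def shingles(text, n=3):
    """Return the set of n-word shingles (overlapping word runs) in text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def percent_unique(page_text, boilerplate_text, n=3):
    """Estimate what share of page_text is not covered by boilerplate_text.

    A rough shingle-overlap heuristic, not how any search engine
    actually scores duplication.
    """
    page = shingles(page_text, n)
    if not page:
        return 0.0
    shared = page & shingles(boilerplate_text, n)
    return 100.0 * (len(page) - len(shared)) / len(page)

# Hypothetical product page: shared description plus an added unique sentence.
duplicate = "This widget is made of stainless steel and ships worldwide."
page = duplicate + " Our review team found it ideal for small workshops."
print(round(percent_unique(page, duplicate)))  # → 53
```

Running this over a template page before and after adding unique copy at least gives you a consistent number to compare against whatever threshold you are experimenting with.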
Related Questions
-
Woocommerce SEO & Duplicate content?
Hi Moz fellows, I'm new to WooCommerce and couldn't find help on Google about certain SEO-related things. All my past projects were simple five-page websites plus a blog, so I would just noindex categories, tags, and archives to eliminate duplicate content errors. But with WooCommerce product categories and tags, I've noticed that many e-commerce websites with a high domain authority actually rank for certain keywords just by having their categories/tags indexed. For example, keyword 'hippie clothes' = etsy.com/category/hippie-clothes (fictional example). The problem is that if I have 100 products and 10 categories and tags on my site, it creates THOUSANDS of duplicate content errors, but if I noindex categories and tags they will never rank well once my domain authority rises... Does anyone have experience/comments about this? I use the SEO by Yoast plugin. Your help is greatly appreciated! Thank you in advance. -Marc
Intermediate & Advanced SEO | | marcandre1 -
Duplicate content - Images & Attachments
I have been looking at GWT HTML Improvements on our new site and I am scratching my head over how to stop some elements of the website showing up as duplicates for meta descriptions and titles. For example, in the blog area the truncated description "This blog is full of information and resources for you to implement; get more traffic, more leads an" repeats across the paginated pages /blog/, /blog/page/2/, /blog/page/3/, /blog/page/4/, /blog/page/6/, /blog/page/9/. The pages have rel canonicals on them (using Yoast WordPress SEO) and I can't see a way of stopping the duplicate content. Can anyone suggest how to combat this? Or is there nothing to worry about?
Intermediate & Advanced SEO | | Cocoonfxmedia0 -
Schema.org markup to avoid duplicate issue?
Hey there, I was wondering: does product markup help to avoid penalization due to duplicate content? Here is the example: one of my clients doesn't supply unique content, because the major part of the content is technical descriptions of products made by a couple of manufacturers. Do you think it will help if I link the official manufacturer webpage in schema.org Product markup? I know this is the right procedure for adding markup, but an outbound link will then show up on my client's pages, so I want to tell him this is the only way to keep that duplicate content without incurring a penalty. I'd like to give him more than one solution, as I'm pretty sure he will never supply us with unique content. Thanks Pierpaolo
Intermediate & Advanced SEO | | madcow780 -
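For reference, product markup for a page like the one described is usually expressed as schema.org Product JSON-LD. A minimal sketch follows; all values are hypothetical, and to be clear, schema.org markup is not documented to lift duplicate-content filtering, so treat it as structured data rather than a fix:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget X100",
  "description": "Manufacturer-supplied technical description.",
  "brand": { "@type": "Brand", "name": "Example Manufacturer" },
  "sameAs": "https://www.example-manufacturer.com/x100"
}
```

The `sameAs` property (inherited from Thing) is the usual place to reference the official manufacturer page the question mentions.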
How To Detect Primary Site With Duplicate Domains?
I'm working with some backlink data, and I've run into different domains that host the same exact content on the same IP. They're not redirecting to each other; it just looks like they're hosting the same content on different virtual hostnames. One example is: borealcanada.ca borealcanada.com borealcanada.org www.borealcanada.ca www.borealcanada.com www.borealcanada.org www.borealecanada.ca I'm trying to consolidate this data and choose which is the primary domain. In this example, it appears www.borealcanada.ca has a high number of indexed pages and also ranks first for "boreal canada". However, I'm trying to think of a metric I can use to definitively/systematically handle this (using SEO Tools or something like it). Anyone have ideas on which metric might help me determine this for a large number of sites?
Intermediate & Advanced SEO | | brettgus0 -
Duplicate content on subdomains.
Hi Mozzers, I have a site www.xyz.com and also geotargeted subdomains www.uk.xyz.com, www.india.xyz.com, and so on. All the subdomains have the same content as the main domain, www.xyz.com. So, I want to know how I can avoid content duplication. Many thanks!
Intermediate & Advanced SEO | | HiteshBharucha0 -
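One common pattern for geotargeted subdomains that share content is to annotate every version of a page with hreflang, so the engines pick the right one per region instead of treating them as duplicates. A sketch using the question's own placeholder hostnames, with assumed language-region codes:

```html
<!-- In the <head> of each version of the page, repeated on every hostname -->
<link rel="alternate" hreflang="x-default" href="http://www.xyz.com/" />
<link rel="alternate" hreflang="en-gb" href="http://www.uk.xyz.com/" />
<link rel="alternate" hreflang="en-in" href="http://www.india.xyz.com/" />
```

The annotations must be reciprocal: each subdomain's page lists all the alternates, including itself.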
Duplicate Content Warning For Pages That Do Not Exist
Hi guys, I am hoping someone can help me out here. I have had a new site built with a unique theme, using WordPress as the CMS. Everything was going fine, but after checking Webmaster Tools today I noticed something that I just cannot get my head around. I am getting duplicate page warnings on a couple of things, one of which I think I can understand but do not know how to clear. Firstly, I get a duplicate meta description warning for: url 1: / url 2: /about/who-we-are I understand this, as the who-we-are page is set as the homepage through the WordPress reading settings, but is there a way to make the duplicate meta description warning disappear? The second one I am getting is: /services/57/ /services/ Both URLs lead to the same place, although I have never created the /services/57/ page. It does not show on the XML sitemap, but Google obviously sees it because it is a warning in Webmaster Tools. If I press edit on the /services/57/ page it just goes to editing the /services/ page. Is there a way I can remove the /57/ page safely, or a method to ensure Google at least does not see it? Probably a silly question, but I cannot find a comprehensive answer to sorting this. Thanks in advance
Intermediate & Advanced SEO | | southcoasthost0 -
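For the second warning above, assuming an Apache host and that /services/57/ really should always resolve to /services/, one option is a 301 redirect in .htaccess; this is only a sketch, to be adjusted if the server is nginx or if the numeric URL turns out to serve a purpose:

```apacheconf
# Hypothetical .htaccess rule: permanently send the stray ID URL to the real page
Redirect 301 /services/57/ /services/
```

A 301 both removes the duplicate from the index over time and consolidates any signals the stray URL has picked up.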
Trying to determine if either of these are considered cloaking
Option 1) In the browser, we use JavaScript to determine whether you meet the redirect conditions (referrer is not mydomain.com and no bypassing query string). If so, we direct your browser to the subdomain.mydomain.com URL. Googlebot would presumably get the original page. Option 2) In the browser, we use JavaScript to determine whether you meet the redirect conditions. If so, we trigger different CSS that hides certain components of the page and use JavaScript to load in extra ads. Googlebot would get the unaltered page. In both scenarios the page content does not change; however, the presentation is different. The idea is that under certain conditions users are redirected to a page with more ads. The ads are not too severe on the redirected page and will not cause an above-the-fold penalty. That said, would either option be considered cloaking by Google?
Intermediate & Advanced SEO | | BostonWright0 -
Diagnosing duplicate content issues
We recently made some updates to our site, one of which involved launching a bunch of new pages. Shortly afterwards we saw a significant drop in organic traffic. Some of the new pages list content similar to what previously existed on our site, but in different orders. So our question is: what's the best way to diagnose whether this was the cause of our ranking drop? My current thought is to block the new directories via robots.txt for a couple of days and see if traffic improves. Is this a good approach? Any other suggestions?
Intermediate & Advanced SEO | | jamesti0
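If you do run the robots.txt experiment described in that last question, the directives would look something like the following (directory names are hypothetical); keep in mind that robots.txt blocks crawling rather than indexing, so treat any traffic change as a rough signal only:

```text
User-agent: *
Disallow: /new-directory-a/
Disallow: /new-directory-b/
```

A `Disallow` path matches as a prefix, so each rule covers every URL under that directory.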