Percentage of duplicate content allowable
-
Can you have ANY duplicate content on a page or will the page get penalized by Google?
For example if you used a paragraph of Wikipedia content for a definition/description of a medical term, but wrapped it in unique content is that OK or will that land you in the Google / Panda doghouse?
If some level of duplicate content is allowable, is there a general rule of thumb ratio unique-to-duplicate content?
thanks!
-
I don't believe you have a problem if you have a bit of duplicate content. Google doesn't penalize you for duplicate content; it just doesn't award you points for it.
-
That sounds like something Google will hate by default. Your problem there is the ratio of page quantity to quality and uniqueness.
-
It's difficult to give exact numbers, as the algorithm is Google's hidden treasure. It's safer to create completely unique content. Referring to your Wikipedia example, you can introduce the definition with something like "According to Wikipedia..." or add reference links whenever you copy content from another source.
Remember that Google values not just unique content but high-quality content: the article should be original and well researched, not 200 words or fewer. Google will compare the quality of the whole article against the copied portion and then decide whether to treat the page as duplicate content.
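If you do quote a definition, you can also mark it up as a quotation in the page's HTML; a minimal sketch (the article URL below is a placeholder, not a real citation):

```html
<!-- Copied definition marked up as a quotation with a visible source link.
     The Wikipedia URL is a placeholder for illustration only. -->
<blockquote cite="https://en.wikipedia.org/wiki/Example">
  <p>According to Wikipedia, the copied definition text goes here.</p>
</blockquote>
<p>Source: <a href="https://en.wikipedia.org/wiki/Example">Wikipedia</a></p>
```

The cite attribute plus the visible link makes the attribution explicit to both users and crawlers.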
-
We recently launched a large 3,500-page website that auto-generates a sentence after we plug statistical data into our database.
So the only unique content is a single sentence?
Within that sentence many of the words would need to be common as well. Consider a simple site that offered the population for any given location. "The population of [California] is [13 million] people."
In the example above, only three words are unique. Maybe your pages are a bit more elaborate, but it seems to me those pages simply shouldn't be indexed. What you can do is index the main page, where users enter the location they wish to learn about, but not each possible result (i.e. California).
Either add significantly more content, or only index the main page.
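The "only index the main page" option can be implemented with a robots meta tag on each auto-generated result page; a minimal sketch, assuming the hypothetical population pages from the example:

```html
<!-- Placed in the <head> of each thin result page (e.g. the California page):
     keep it out of the index, but still let crawlers follow its links. -->
<meta name="robots" content="noindex, follow">
```

The main entry page carries no such tag, so it remains indexable.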
-
We recently launched a large 3,500-page website that auto-generates a sentence after we plug statistical data into our database. All pages are relevant to users and provide more value than other results in the SERPs, but I think a penalty is in place, a sort of automatic penalty the Farmer/Panda update may have triggered against us.
I sent in a reconsideration request last week; the whole project is on hold until we get a response. I'm expecting a generic answer from them.
We are debating between writing more unique content for every page and entering more statistical data to run some interesting correlations. I feel the statistical data would be three times more beneficial to users, but unique content is what Google seeks, and it's the safer bet just to get us indexed properly.
-
We're currently observing a crumbling empire of websites with auto-generated content. Google is somehow able to understand how substantial your content is and devalue the page, and even the whole site, if it doesn't meet their criteria. This is especially damaging for sites that have, say, 10% great unique content while 90% of their pages are generated via tagging, browsable search, and variable-driven paragraphs of text.
Having citations is perfectly normal, but I would include a reference section just in case.
-
You can have some duplicate content in the manner you mentioned above. It is a natural and expected part of the internet that existing sources of information will be utilized.
There is no magic number that says "30% duplication is OK, but 31% is not." Google's algorithms are private and constantly changing. Use good sense to judge whether your page is unique and offers value to users.
Related Questions
-
Does using Yoast variables for meta content overwrite any pages that already have custom meta content?
The question is about the Yoast plugin for WP sites. Let's say I have a site with 200 pages and custom meta descriptions / title tags already in place for the top 30 pages. If I use the Yoast variable tool to complete meta content for the remaining pages (and make my Moz issue tracker look happier), will that only affect the pages without custom meta descriptions or will it overwrite even the pages with the custom meta content that I want? In this situation, I do want to keep the meta content that is already in place on select pages. Thanks! Zack
On-Page Optimization | rootandbranch
-
Exclude sorting options using nofollow to reduce duplicate content
I'm getting reports of duplicate content for pages that have different sorting options applied, e.g.:
/trips/dest/africa-and-middle-east/
/trips/dest/africa-and-middle-east/?sort=title&direction=asc&page=1
/trips/dest/africa-and-middle-east/?sort=title&direction=des&page=1
I have the added complication of pagination combined with these sorting options, and I don't have the option of a view-all page. I'm considering adding rel="nofollow" to the sorting controls so they are taken out of the equation, then using rel="next" and rel="prev" to handle the pagination per Google's recommendations (using the default sorting options). Has anyone tried this approach, or have an opinion on whether it would work?
On-Page Optimization | benbrowning
-
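The approach described could look roughly like this (a sketch reusing the URLs from the question; exact markup is assumed, not taken from the asker's site):

```html
<!-- Sorting controls: nofollow takes each sort order out of the equation. -->
<a rel="nofollow"
   href="/trips/dest/africa-and-middle-east/?sort=title&amp;direction=asc&amp;page=1">
  Sort by title (A-Z)
</a>

<!-- In the <head> of page 2 of the default sort order: declare the series. -->
<link rel="prev" href="/trips/dest/africa-and-middle-east/?page=1">
<link rel="next" href="/trips/dest/africa-and-middle-east/?page=3">
```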
I am trying to better understand solving the duplicate content issues highlighted in your recent crawl report of our site - www.thehomesites.com.
Below are some of the URLs highlighted as having duplicate content:
http://www.thehomesites.com/zip_details/76105
http://www.thehomesites.com/zip_details/44135
http://www.thehomesites.com/zip_details/75227
http://www.thehomesites.com/zip_details/94501
These are neighborhood reports generated for 4 different zip codes. We use a standard template to create these reports. What are some of the steps we can take to avoid these pages being categorized as duplicate content?
On-Page Optimization | urahul
-
Not sure if I need to be concerned with duplicate content plus too many links
Someone else supports this site in terms of making changes, so I want to make sure I know what I am talking about before I speak to them. We seem to have a lot of duplicate content and duplicate titles. This is an example of a duplicate: http://www.commonwealthcontractors.com/tag/big-data-scientists/. Do I need to get things changed? The other problem that crops up on reports is too many on-page links. I am going to get rid of the block of tags, but need to keep the news. Is there much else I can do? Many thanks.
On-Page Optimization | Niamh2
-
Duplicate Page Content
Hi there, we keep getting duplicate page content issues. However, it's not actually the same page. For example, there might be 5 pages in, say, a Media Release section of the website, and each URL says page 1, 2, etc. However, it's still coming up as duplicate. How can this be fixed so Moz knows it's actually different content?
On-Page Optimization | HamiltonIsland
-
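A common fix is giving each page in the series its own title and a self-referencing canonical, so crawlers (and Moz) see distinct pages; a sketch with hypothetical URLs:

```html
<!-- Page 2 of a Media Release section: a distinct title, and a canonical
     pointing at page 2 itself rather than back at page 1. -->
<title>Media Releases | Page 2</title>
<link rel="canonical" href="http://www.example.com/media-releases/page/2/">
```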
Are these considered duplicates?
http://www.domain.com/blog/sample-blog-post/#more-0001
http://www.domain.com/blog/sample-blog-post/
The first URL is coming from a "click here" hyperlink in the excerpt of the 2nd URL on my homepage. Thanks in advance!
On-Page Optimization | esiow2013
-
Creating a product per size causing duplicate content problems?
I have an e-commerce site and in order to receive a listing for each size and color in Google Merchant, I've created a new product for each size and color. The problem is that since I did this, the canonical tags aren't correct and there isn't a way to change them manually with the platform I'm on. I feel like this is one of the main reasons I've been dropping in the rankings. Should I delete all duplicate products? The system will take care of canonical tags automatically when creating a new size/color within the system (how it's supposed to be created) but the canonical tags become messy when I duplicate a product and edit the size/color to create a "whole new product". Here is an example of what I'm referring to: http://www.carbonconnection.com/search.php?search_query=nalini+rigel&x=0&y=0 (this problem actually isn't mine, it's a friend's but for the sake of simplicity and gaining a second opinion to be sure before he redoes all of his products, I'm asking as though it were my issue)
On-Page Optimization | EmdeS
-
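For reference, this is roughly what a correct canonical on a size/color variant page looks like, pointing at the primary product URL (the product paths here are hypothetical, invented for illustration):

```html
<!-- In the <head> of a variant page such as /nalini-rigel-red-xl/:
     consolidate ranking signals onto the primary product page. -->
<link rel="canonical" href="http://www.carbonconnection.com/nalini-rigel/">
```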
Duplicate Content
Hi, I have duplicate content that I don't understand:
1. www.example.dk
2. www.example.dk/
I thought it was the same page, with and without the trailing slash. Hope someone can help 🙂
On-Page Optimization | seopeter29
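For what it's worth, picking one version and declaring it canonical is the usual way to resolve slash-variant duplicates; a minimal sketch:

```html
<!-- In the <head> of the page: pick the trailing-slash version
     and reference it consistently in all internal links. -->
<link rel="canonical" href="http://www.example.dk/">
```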