How different does content need to be to avoid a duplicate content penalty?
-
I'm implementing landing pages that are optimized for specific keywords. Some of them are substantially the same as another page (perhaps 10-15 words different). Are the landing pages likely to be identified by search engines as duplicate content? How different do two pages need to be to avoid the duplicate penalty?
-
Thanks, everyone, for the responses. They were very helpful.
-
First off, Google does not "penalize" you for duplicate content unless you're doing it on a massive scale (that's what Panda targets). If Google detects duplicate content on your site, it will simply display only one version of that content in the SERPs. Still not ideal, but not quite a penalty.
Perhaps more importantly, why do you have multiple landing pages that differ by only 10 words? Aside from the duplicate content issue, how are you "optimizing" each page for different keywords? If you are just changing the title and the URL, then it's probably not worth it from an SEO or user perspective.
If you want to rank for multiple keywords, write rich content that is relevant for multiple keywords, or create multiple pages that are substantially different and specifically aimed at your target keywords. Changing 10-15 words isn't optimizing for anything.
-
Yes, those landing pages sound like they will be viewed as duplicate content with only 10 or so words different... unless you only have 25 words on each page (which would then be incredibly thin content). I've heard people say that a page should be a minimum of 60% different to avoid duplicate errors (no idea how that number was determined, though). At that point it's usually simpler and easier to write completely new content for every page and avoid any issues.
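There's no official threshold, but you can get a rough feel for how similar two pages look to a near-duplicate detector with a word-shingle comparison. This is an illustrative sketch of one common technique from the literature, not Google's actual algorithm, and the sample page texts are made up:

```python
def shingles(text, k=3):
    """Split text into overlapping k-word shingles (lowercased)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard_similarity(a, b, k=3):
    """Jaccard similarity of two texts' shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

# Two landing pages that differ by a single word out of thirteen:
page_a = "Order cheap roller banners with free next day delivery across the UK today"
page_b = "Order cheap vinyl banners with free next day delivery across the UK today"

print(round(jaccard_similarity(page_a, page_b), 2))  # prints 0.57
```

Even a one-word swap leaves the pages over half identical at the shingle level; a 10-15 word difference across a full page of copy would score far higher, which is why rewriting from scratch is the safer route.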
-
TextMarketing is spot on!
Either rewriting from memory or having someone else write the content based on a generic layout are two ways around having duplicate content.
And just for some additional info, this is what Google considers duplicate content: "Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar." and "...content is deliberately duplicated across domains in an attempt to manipulate search engine rankings or win more traffic. Deceptive practices like this can result in a poor user experience, when a visitor sees substantially the same content repeated within a set of search results."
Duplicate Content = BAD SEO and BAD User Experience.
Mike
-
I would worry that Google may find these pages to be duplicate content if there's only a 10 to 15 word difference. I would recommend rewriting each page from memory, without looking at the others, to help differentiate the content.
Related Questions
-
Duplicate Content from Wordpress Template
Hi, wondering if anyone can help: my site has been flagged for duplicate content on almost every page. I think this is because the person who set up the site created a lot of template pages which use the same code but have slightly different features. How would I go about resolving this? Would I need to recode every template page they have created?
Technical SEO | Alix_SEO
.com and .co.uk duplicate content
Hi mozzers, I have a client that has just released a .com version of their .co.uk website. They have basically re-skinned the .co.uk version with some US amends, so all the content and title tags are the same. What do you recommend? A canonical tag to the .co.uk version? Rewriting the titles?
Technical SEO | KarlBantleman
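For a .com/.co.uk pair like the one above, hreflang annotations are the mechanism search engines provide to mark pages as regional variants rather than duplicates. A minimal sketch, assuming matching URL paths on both sites (the example.com/example.co.uk URLs are placeholders, not the client's actual domains):

```html
<!-- In the <head> of each page, on BOTH sites, pointing at the
     regional equivalents of that same page (placeholder URLs) -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/page" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/page" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/page" />
```

Note that hreflang must be reciprocal: the .com pages must carry the same annotations pointing back at the .co.uk versions, or the tags are ignored. A canonical tag to .co.uk would instead tell Google to drop the .com pages from US results entirely, which is probably not what the client wants.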
Development Website Duplicate Content Issue
Hi, we launched a client's website around 7th January 2013 (http://rollerbannerscheap.co.uk). We originally constructed the website on a development domain (http://dev.rollerbannerscheap.co.uk), which was active for around 6-8 months before we migrated dev --> live (the dev site was unblocked from search engines for the first 3-4 months, but then blocked again).

In late Jan 2013 we changed the robots.txt file to allow search engines to index the website. A week later I accidentally logged into the DEV website and also changed its robots.txt file to allow the search engines to index it. This obviously caused a duplicate content issue, as both sites were identical. I realised what I had done a couple of days later and blocked the dev site from the search engines with the robots.txt file.

Most of the pages from the dev site had been de-indexed from Google apart from 3: the home page (dev.rollerbannerscheap.co.uk) and two blog pages. The live site has 184 pages indexed in Google. So I thought the last 3 dev pages would disappear after a few weeks. I checked back late February and the 3 dev site pages were still indexed in Google. I decided to 301 redirect the dev site to the live site to tell Google to rank the live site and to ignore the dev site content. I also checked the robots.txt file on the dev site and this was blocking search engines too.

But still the dev site is being found in Google wherever the live site should be found. When I do find the dev site in Google it displays this:

Roller Banners Cheap » admin dev.rollerbannerscheap.co.uk/ A description for this result is not available because of this site's robots.txt – learn more.

This is really affecting our client's SEO plan and we can't seem to remove the dev site or rank the live site in Google. In GWT I have tried to remove the sub domain. When I visit remove URLs, I enter dev.rollerbannerscheap.co.uk but then it displays the URL as http://www.rollerbannerscheap.co.uk/dev.rollerbannerscheap.co.uk.
I want to remove a sub domain, not a page. Can anyone help please?
Technical SEO | SO_UK
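The likely culprit in the question above is the robots.txt block itself: if Googlebot can't crawl the dev subdomain, it never sees the 301 redirects, so the stale URLs linger in the index (the "A description for this result is not available because of this site's robots.txt" snippet is the giveaway). A hedged sketch of the usual fix on an Apache server with mod_rewrite, using placeholder domains:

```apache
# .htaccess served for the dev subdomain.
# First, REMOVE the robots.txt Disallow so Googlebot can crawl the dev
# URLs and discover these redirects; then 301 every URL to the live site.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^dev\.example\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.example.co.uk/$1 [R=301,L]
```

Once Google has recrawled the redirects and dropped the dev URLs, the robots.txt block can be restored to keep future dev work out of the index.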
Duplicate content problem?
Hello! I am not sure if this is a problem or if I am just making something too complicated. Here's the deal: I took on a client who has an existing site in something called Homestead. Files cannot be downloaded, making it tricky to get out of Homestead. The way it is set up, new sites are developed on subdomains of homestead.com, and then your chosen domain points to this subdomain. The designer who built it has kindly given me access to her account so that I can edit the site, but this is awkward. I want to move the site to its own account. However, to do so Homestead requires that I create a new subdomain and copy the files from one to the other. They don't have any way to redirect the prior subdomain to the new one. They recommend I do something in the HTML, since that is all I can access. Am I unnecessarily worried about the duplicate content consequences? My understanding is that now I will have two subdomains with the same exact content. True, over time I will be editing the new one. But you get what I'm sayin'. Thanks!
Technical SEO | devbook9
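Since Homestead only allows HTML edits, a cross-domain canonical tag is one way to tell search engines which subdomain is the primary copy when a server-side redirect isn't available. A sketch with placeholder URLs, assuming you can edit the head section of each page on the old subdomain:

```html
<!-- In the <head> of each page on the OLD subdomain, pointing at the
     matching page on the NEW subdomain (placeholder URLs) -->
<link rel="canonical" href="https://newsite.homestead.com/page.html" />
```

It's a hint rather than a directive, and it has to be set per page, but it consolidates the duplicate signals onto the new subdomain until the old one can be retired.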
How do I deal with my pages being seen as duplicate content by SeoMoz?
My dashboard is giving me lots of warnings for duplicate content, but it all seems to have something to do with the www and the slash /. For example: http://www.ebow.ie/ is seen as having the same duplicate content as http://ebow.ie/ and http://www.ebow.ie. Also, lots to do with how WordPress categorizes pages and tags, which is driving me bonkers! Any help appreciated! Dave.
Technical SEO | ebowdublin
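The www/non-www duplicates above are normally fixed with a single hostname redirect so only one version of each URL is ever crawlable. A hedged Apache sketch, assuming the site runs on Apache with mod_rewrite and www is the preferred host (placeholder domain):

```apache
# Redirect non-www requests to the www hostname with a 301, so every
# page resolves at exactly one canonical URL.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.ie$ [NC]
RewriteRule ^(.*)$ http://www.example.ie/$1 [R=301,L]
```

The trailing-slash and WordPress category/tag duplicates are separate issues, usually handled by WordPress's permalink canonicalization and an SEO plugin respectively.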
How to prevent duplicate content in archives?
My news site has a number of excerpts in the form of archives based on categories, which is causing duplicate content problems. Here's an example with the nutrition archive. The articles here are already posts, so this creates the duplicate content. Should I nofollow/noindex this category page along with the rest and the 2011, 2012 archives etc. (see archives here)? Thanks so much for any input!
Technical SEO | naturalsociety
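For archive and category pages like these, the common approach is a meta robots tag that keeps the archive itself out of the index while still letting crawlers follow its links to the underlying posts. A sketch of the tag (in WordPress this is typically set through an SEO plugin's archive settings rather than edited by hand):

```html
<!-- In the <head> of each archive/category page -->
<meta name="robots" content="noindex, follow" />
```

"noindex, follow" is generally preferred over "noindex, nofollow" here, since the archives still pass internal link equity to the individual articles.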
Duplicate Content Issue within the Categories Area
We are in the process of building out a new website; it has been built in Drupal. From the SEOmoz Crawl Diagnostics scan report it looks like I have a duplicate content issue. Example: we sell vinyl banners, so we have many different templates one can use within our online Banner Builder tool, broken down by category. Issue: Duplicate Page Content. /categories/activities has 9 other URLs associated with this issue (I have many others, but this one works as an example). Within this category we have multiple templates attached to the page. The templates do not need their own pages; we just use them to pull the templates into the activities landing page. I am wondering if I need to noindex/nofollow each of those individual templates and just get the main top-level category name indexed, or is there a better way to do this to minimize the impact of Panda?
Technical SEO | Ben-HPB
Help removing duplicate content from the index?
Last week, after a significant drop in traffic, I noticed a subdomain in the index with duplicate content. The main site and subdomain can be found below. http://mobile17.com http://232315.mobile17.com/ I've 301'd everything on the subdomain to the appropriate location on the main site. The problem is, site: searches show me that if the subdomain content is being deindexed at all, it's happening really slowly. Traffic is still down about 50% in the last week or so... what's the best way to tackle this issue moving forward?
Technical SEO | ccorlando