How to publish duplicate content legitimately without Panda problems
-
Let's imagine that you own a successful website that publishes a lot of syndicated news articles and syndicated columns.
Your visitors love these articles and columns but the search engines see them as duplicate content.
You worry that this duplicate content will get the site viewed as a "content farm" and hit with a Panda penalty.
So, you decide to continue publishing the content and use...
<meta name="robots" content="noindex, follow">
This allows you to display the content for your visitors, but it should stop the search engines from indexing any page carrying this tag. It should also allow robots to crawl those pages and pass link value through them.
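For example, a syndicated article page might carry the tag in its head like this (the file name and title below are just placeholders):
<!-- Hypothetical syndicated article page, e.g. /syndicated/market-report.html -->
<head>
  <title>Syndicated Market Report | Example Site</title>
  <!-- Keep this page out of the index, but let robots follow its links -->
  <meta name="robots" content="noindex, follow">
</head>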
I have two questions:
-
If you use "noindex", will that be enough to prevent your site from being considered a content farm?
-
Is there a better way to continue publishing syndicated content while protecting the site from duplicate content problems?
-
Good idea about attributing with rel=canonical.
Thanks!
-
Noindexing the syndicated articles should, in theory, minimize the likelihood of having a Panda problem, but Panda is constantly evolving. You will probably see some kind of drop in rankings as the number of indexed pages for your site decreases. If you have, say, 1,000 pages total on the site and suddenly 900 are taken out of the index, that could be a problem; if it is a much smaller percentage of the site, you might not see any impact at all. Beyond the drop in indexed pages, I don't think you will have a problem once the syndicated content is noindexed.
It will probably take Google a while to re-index/un-index the pages, so hopefully it won't be a fast drop if there is one. In the long run, it is probably better to at least have the appearance of trying to do the right thing. Linking to the source, and perhaps using a rel=canonical tag pointing to the original article, would also be good practice.
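As a rough sketch (the URLs are placeholders), a syndicated article page could combine the noindex directive with a cross-domain rel=canonical and a visible attribution link back to the original publisher:
<!-- Hypothetical syndicated article page on your own site -->
<head>
  <meta name="robots" content="noindex, follow">
  <!-- Point search engines at the original publisher's copy -->
  <link rel="canonical" href="https://www.original-publisher.example/articles/original-story">
</head>
<body>
  <article>
    <!-- Syndicated article body goes here -->
    <p>Originally published at
       <a href="https://www.original-publisher.example/articles/original-story">Original Publisher</a>.</p>
  </article>
</body>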
Thank you, Nick.
We will be using "noindex" only on the pages with syndicated content. This is a Dreamweaver site, so it is easy to place the code on specific pages, and the site does not use excerpts.
Do you still see a potential problem?
The question really is... "Could a site that contains a lot of syndicated content have a Panda problem if the pages that contain that content are noindexed?"
-
I am assuming you intend to use noindex only on the duplicate-content articles. Using noindex on everything would also prevent your own content from being indexed and found through Google.
If you are using WordPress or something else that can show excerpts, you could try making the article pages noindex and showing only excerpts on the main page and category pages, which would remain indexed and followed. That would keep the full articles out of search results and avoid duplicate content penalties, while allowing the pages that show the excerpts to still be indexed and rank OK.
The idea here is that the excerpts would give the home and category pages enough text to rank for the subject matter without being seen for what they are: copied content.
You will probably eventually get caught by the Panda, but this may work as a temporary solution until you can get some original content mixed in.
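A rough sketch of the excerpt approach described above, with placeholder URLs and titles: the category page stays indexable and shows only short excerpts, while each full article page carries the noindex tag shown earlier.
<!-- Hypothetical indexable category page listing excerpts -->
<head>
  <title>Syndicated Columns | Example Site</title>
  <!-- No robots meta tag here; indexing is allowed by default -->
</head>
<body>
  <section>
    <h2><a href="/articles/column-one.html">Column One</a></h2>
    <p>Short excerpt of the first column, a sentence or two...</p>
    <h2><a href="/articles/column-two.html">Column Two</a></h2>
    <p>Short excerpt of the second column...</p>
  </section>
  <!-- Each linked article page carries: <meta name="robots" content="noindex, follow"> -->
</body>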