Noindex duplicate content penalty?
-
We know that Google can now penalize a whole site if it finds content it doesn't like, including duplicate content, but has anyone experienced a penalty from duplicate content on their site which they have marked noindex? Would Google still count the duplicate pages against the overall quality of the site, even though it has been told to basically ignore them?
The reason for asking is that I am looking to add a forum to one of my websites, and no one likes a new forum. I have a script which can populate it with thousands of questions and answers pulled directly from Yahoo Answers. Obviously the forum will be 100% duplicate content, but I do not want it to rank anyway, so if I noindex the forum pages, hopefully it will not damage the rest of the site.
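For reference, page-level noindex is usually done with a robots meta tag in each forum page's head (the exact template hook depends on the forum software, so treat this as a sketch):

```html
<!-- In the <head> of each forum page: tell crawlers not to index the page,
     but still allow them to follow links out of it -->
<meta name="robots" content="noindex, follow">
```

The same directive can be sent as an HTTP response header (`X-Robots-Tag: noindex`), which is handy if you can't edit the forum's templates.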
In time, as the forum grows, all the duplicate posts will be deleted, but it's hard to get people to use an empty forum, so I need to 'trick' them into thinking the section is very busy.
-
Yes, I agree the ideal solution would be to make the content unique; however, all being well, I will have about 20,000 threads and 50,000 posts added in a month. The other main reason for doing it this way is that the forum script creates users and assigns posts to them, so the forum will also seem to have about 5,000 active users.
Removing the duplicate content would be easy enough: I can run an SQL query and remove all posts created before a given date.
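That cleanup step can be sketched like this. The schema here is a made-up minimal one (real forum software uses its own table names), and SQLite stands in for whatever database the forum actually runs on; the idea is simply that all imported posts carry dates before a known cutoff:

```python
import sqlite3

# Hypothetical minimal schema; a real forum (phpBB, vBulletin, etc.)
# has its own tables, but the cleanup logic is the same.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, body TEXT, created TEXT)")

cutoff = "2012-01-01"  # everything imported from Yahoo Answers predates this
conn.executemany(
    "INSERT INTO posts (body, created) VALUES (?, ?)",
    [
        ("imported answer 1", "2011-06-01"),
        ("imported answer 2", "2011-07-15"),
        ("real user post", "2012-03-02"),
    ],
)

# Delete every post created before the cutoff, i.e. the seeded content.
deleted = conn.execute("DELETE FROM posts WHERE created < ?", (cutoff,)).rowcount
conn.commit()
remaining = conn.execute("SELECT COUNT(*) FROM posts").fetchone()[0]
print(deleted, remaining)  # removes the 2 imported posts, keeps the 1 real post
```

Comparing ISO-formatted date strings works lexicographically, which is why a plain `<` is enough here.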
-
Do you really want to double your work, parsing the forum content and later removing it?
I think it would be much better to rewrite the Yahoo Answers content. Of course it needs more time and resources, but your content will be unique, and you will get search traffic much faster. It's easy to find cheap rewriters who can fill your forum very quickly.
-
Maybe what you should do is add the rel="canonical" attribute on your page/thread pointing to the corresponding Yahoo Answers page. This will certainly tell Google who the "original owner" is. If you want to block it from search engines as well, keep the noindex and also block Googlebot in robots.txt for that subdirectory.
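A cross-domain canonical like that would look something like the following (the target URL here is an illustrative placeholder, not a real question ID):

```html
<!-- In the <head> of the copied thread page; href is a hypothetical example -->
<link rel="canonical" href="https://answers.yahoo.com/question/index?qid=EXAMPLE_ID">
```

Worth noting that Google treats rel="canonical" as a hint rather than a directive, so it doesn't guarantee the page is dropped from the index the way noindex does.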
-
Sorry, just thought of something else....
Instead of the noindex, would blocking Google from the /forum/ directory in .htaccess be even better? I'm guessing that it would. With noindex we are telling Google not to index the content, but it is still reading it. With a block we are not showing Google the bad content in the first place, so it never knows there is any duplicate content.
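The robots.txt version of that block (which is what was suggested above for Googlebot) would be something like:

```
# robots.txt at the site root: stop compliant crawlers fetching the forum
User-agent: *
Disallow: /forum/
```

One caveat: a URL blocked in robots.txt can still appear in the index (without its content) if other sites link to it, whereas noindex requires Google to crawl the page to see the directive, so the two approaches behave differently and shouldn't be combined on the same pages.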