Noindex duplicate content penalty?
-
We know that Google now penalizes a whole site if it finds content it doesn't like or duplicate content, but has anyone experienced a penalty from having duplicate content on their site which they have marked noindex? Would Google still apply the penalty to the overall quality of the site even though it has been told to basically ignore the duplicate part?
The reason for asking is that I am looking to add a forum to one of my websites, and no one likes a new forum. I have a script which can populate it with thousands of questions and answers pulled directly from Yahoo Answers. Obviously the forum will be 100% duplicate content, but I do not want it to rank anyway, so if I noindex the forum pages, hopefully it will not damage the rest of the site.
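For concreteness, the noindex being discussed is just a one-line directive per page (a minimal sketch; the commented X-Robots-Tag variant is the HTTP-header equivalent for responses where editing the HTML isn't practical):

```html
<!-- In the <head> of every forum page the script generates -->
<meta name="robots" content="noindex, follow">

<!-- Or, as an HTTP response header set by the server for /forum/ URLs:
     X-Robots-Tag: noindex -->
```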
In time, as the forum grows, all the duplicate posts will be deleted, but it's hard to get people to use an empty forum, so we need to 'trick' them into thinking the section is very busy.
-
Yes, I agree the ideal solution would be to make the content unique; however, all being well, I will have about 20,000 threads and 50,000 posts added in a month. The other main reason for doing it is that the forum script creates users and assigns posts to them, so the forum will also seem to have about 5,000 active users.
Removing the duplicate content would be easy enough; we can run an SQL query and remove all posts before a given date.
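That cleanup step can be sketched in a few lines. The `posts` table and its column names here are invented for illustration and won't match any particular forum script; SQLite stands in for the real database:

```python
import sqlite3

# Hypothetical schema: table and column names are assumptions,
# not taken from any real forum package.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, body TEXT, created TEXT)")
conn.executemany(
    "INSERT INTO posts (body, created) VALUES (?, ?)",
    [
        ("imported from Yahoo Answers", "2012-01-15"),
        ("imported from Yahoo Answers", "2012-02-01"),
        ("real user post", "2012-06-10"),
    ],
)

# Delete everything created before the cut-over date, i.e. the
# seeded content, leaving genuine posts untouched. ISO date
# strings compare correctly as text.
cutoff = "2012-03-01"
conn.execute("DELETE FROM posts WHERE created < ?", (cutoff,))
conn.commit()

remaining = conn.execute("SELECT COUNT(*) FROM posts").fetchone()[0]
print(remaining)  # only the genuine post survives
```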
-
Do you really want to double your work, parsing and later removing the forum content?
I think it would be much better to rewrite the Yahoo Answers content. Of course it needs more time and resources, but your content will be unique, and you will get search traffic much faster. It's easy to find cheap rewriters who can fill your forum very quickly.
-
Maybe what you should do is add a rel="canonical" link on your page/thread pointing to the corresponding Yahoo Answers page. This will certainly tell Google who the "original owner" is. If you want to block it from search engines as well, keep the noindex and also block Googlebot in robots.txt for that subdirectory.
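For what it's worth, a cross-domain canonical like that is a single link element per thread; the Yahoo Answers URL below is a made-up placeholder, not a real question:

```html
<!-- In the <head> of a hypothetical forum thread page;
     the qid value is an invented placeholder. -->
<link rel="canonical" href="https://answers.yahoo.com/question/index?qid=EXAMPLE_PLACEHOLDER">
```

One caveat: a robots.txt block stops Googlebot from fetching the page at all, so it would never see a canonical or noindex tag on that page; the tag-based and robots.txt approaches don't really stack.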
-
Sorry, just thought of something else....
Instead of the noindex, would blocking Google from the /forum/ directory be even better? I'm guessing that it would. With noindex we are telling Google not to index the content, but it is still reading it. With a block we are not even showing Google the bad content in the first place, so it doesn't know there is any duplicate content.
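For clarity, the crawl block being described normally lives in robots.txt rather than .htaccess (a sketch, assuming the forum sits under /forum/):

```
# robots.txt at the site root
User-agent: Googlebot
Disallow: /forum/
```

A true server-level block (returning 403s to Googlebot via .htaccess user-agent matching) is also possible, but a robots.txt disallow is the conventional way to keep a crawler from reading a directory. Note that a disallowed URL can still appear in the index without a snippet if other pages link to it.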
Related Questions
-
Duplicate content when working with makes and models?
Okay, so I am running a store on Shopify at https://www.rhinox-group.com. This store is reasonably new, so it is being updated constantly! The thing that is really annoying me at the moment, though, is that I am getting errors in the form of duplicate content. This seems to be because we work using the machine make and model, which is obviously imperative, but then we have various products for each machine make and model. Does anyone have suggestions on how I can cut down on these errors, as the last thing I want is to be penalised by Google for this! Thanks in advance, Josh
Technical SEO | josh.sprakes
-
Duplicate content - font size and themes
Hi, how do we sort out duplicate content issues where:
http://www.ourwebsite.co.uk/ is the same as http://www.ourwebsite.co.uk/StyleType=SmallFont&StyleClass=FontSize or http://www.ourwebsite.co.uk/?StyleType=LargeFont&StyleClass=FontSize
and http://www.ourwebsite.co.uk/legal_notices.aspx is the same as http://www.ourwebsite.co.uk/legal_notices.aspx?theme=default
Technical SEO | Houses
-
Duplicate Content on 2 Sites - Advice
We have one client who has an established eCommerce site and has created another site, about to be launched, which has the exact same content. We want both sites to be indexed but not to be penalised for duplicate content. The sites have different domains. The sites have the same host. We want the current site to take priority, so the new site would not rank higher in SERPs. Any advice on setting up canonical tags, author tags, alternate link tags, etc.? Thanks, Rich
Technical SEO | SEOLeaders
-
Thin/Duplicate Content
Hi guys, so here's the deal: my team and I just acquired a new site using some questionable tactics. Only about 5% of the entire site is actually written by humans; the rest of the 40k+ pages (increasing by 1-2k auto-generated pages a day) are all auto-generated, thin content. I'm trying to convince the powers that be that we cannot continue to do this. Now I'm aware of the issue, but my question is: what is the best way to deal with it? Should I noindex these pages at the directory level? Should I 301 them to the most relevant section where actual valuable content exists? So far it doesn't seem like Google has caught on to this yet, and I want to fix the issue without raising any more red flags in the process. Thanks!
Technical SEO | DPASeo
-
Duplicate Content Errors
Ok, old fat-client developer new to SEO, so I apologize if this is obvious. I have 4 errors in one of my campaigns: two are duplicate content and two are duplicate title. Here is the duplicate title error:
Rare Currency And Old Paper Money Values and Information. http://www.antiquebanknotes.com/
Rare Currency And Old Paper Money Values and Information. http://www.antiquebanknotes.com/Default.aspx
So, my question is: what do I need to do to make this right? They are the same page. In my page load for Default.aspx I have this: this.Title = "Rare Currency And Old Paper Money Values and Information."; and it occurs only once...
Technical SEO | Banknotes
-
Duplicate Content Resolution Suggestion?
SEOmoz tools are saying there is duplicate content for:
www.mydomain.com
www.mydomain.com/index.html
What would be the best way to resolve this "error"?
Technical SEO | PlasticCards
-
Duplicate content across multiple domains
I have come across a situation where we have discovered duplicate content between multiple domains. We have access to each domain, and within the past two weeks we have added a 301 redirect to send each page dynamically to the proper page on the desired domain. My question relates to the removal of these pages; there are thousands of them. I have looked at a number of the cached pages in Google and found that they are roughly 30 days old or older. Will these pages ever get removed from Google's index? Will the 301 redirect even be read by Google so that the request is redirected to the proper domain and page? If so, when will that happen? Are we better off submitting a full site removal request for the sites that carry the duplicate content at this point? These smaller sites do bring traffic on their own, but I'd rather not wait 3 months for the content to be removed, since my assumption is that this content is competing with the main site. I suppose another option would be to include a no-cache meta tag for these pages. Any thoughts or comments would be appreciated.
Technical SEO | jmsobe
-
Duplicate content
I have two sentences that I want to optimize two different pages for. Sentence number one is "travel to ibiza by boat"; sentence number two is "travel to ibiza by ferry". My question is: can I have the same content on both pages except for the keywords, or will Google treat that as duplicate content and punish me? And if yes, where is the limit/border for duplicate content?
Technical SEO | stlastla