Duplicate Terms of Use and Privacy Policy, is it a problem?
-
Hi,
If I use the same terms of use and privacy policy content across my websites, does that amount to a duplicate content issue? Does it affect my websites in any way?
Regards
-
Duplicate content is one of many hundreds of factors. If you have a very well crafted site, highly optimized, and with a very strong inbound link profile, but only a couple pages (ones that are not highly relevant to your primary topical focus) are duplicate, the potential negative impact on your overall rankings will be minimal.
This is true for most SEO factors. If any single factor has a flaw, but it's not a flaw that applies to the whole site, that single factor is going to have minimal impact on the overall site.
-
You can do almost anything you wish on a "noindex" tagged page. You are telling the search engine bot to exclude the page from the search index, so the page should not affect your ranking.
The reason your site's number of pages is a factor is that your overall site is viewed as a whole. If you have a basic site with 10 pages, and 1 of those pages has duplicate content, then 10% of your site is affected, and this can impact how the search engine views your site. If your site hosted a forum with 10,000 pages, then that 1 page would represent 0.01% of your site, so the impact would not have any real effect.
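To make the noindex point above concrete: a page opts out of the index via a robots meta tag in its head. A minimal standard-library Python sketch (the sample markup is hypothetical) that checks whether a page carries the directive:

```python
from html.parser import HTMLParser


class NoindexDetector(HTMLParser):
    """Scans HTML for a <meta name="robots"> tag whose content includes "noindex"."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attr_map = dict(attrs)
        name = (attr_map.get("name") or "").lower()
        content = (attr_map.get("content") or "").lower()
        if name == "robots" and "noindex" in content:
            self.noindex = True


def page_is_noindexed(html: str) -> bool:
    """Return True if the markup asks search engines not to index the page."""
    parser = NoindexDetector()
    parser.feed(html)
    return parser.noindex


# Hypothetical privacy-policy page markup:
privacy_page = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
print(page_is_noindexed(privacy_page))  # True
```

A page that passes this check should, as the answer says, be excluded from the index and not count toward ranking either way.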
-
Thanks for the helpful reply, Alan! Can you please explain this: "If it's only a few pages, sure, duplicate content there could have an impact". How do duplicate content issues vary between small and big sites? I was under the impression that the number of pages does not have any influence on duplicate content.
Is it okay to use the same privacy policy and terms of use across different websites as long as I apply noindex,follow to them?
-
How big is your site? If it's only a few pages, sure, duplicate content there could have an impact. But in reality, I expect your site is not primarily made up of keyword phrases that either of those pages would be optimized for, and that you have more than a few pages. If so, any "negative" aspect would not be severe.
Having said that, it really is best to just use a robots meta tag set to noindex,follow (my preference, instead of blocking the pages completely in the robots.txt file).
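For reference, the robots meta tag described here goes in the page's head; a typical snippet looks like this:

```html
<head>
  <!-- Exclude this page from the index, but let bots follow its links -->
  <meta name="robots" content="noindex,follow">
</head>
```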
-
Thanks for the reply, Ryan! But if I don't block them via the robots.txt file or a noindex tag, will it affect my site negatively? I mean the overall site.
-
I would recommend blocking pages such as privacy policy, terms of use, legal, etc. It is unlikely these pages would ever bring traffic to your site. Even if they did, it is not going to be the quality traffic you desire.
In robots.txt you can add:
User-agent: *
Disallow: /pages/privacy/
substituting your own path for /pages/privacy/. (A Disallow rule only applies inside a User-agent group, so the User-agent line is required.)
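If it helps, you can sanity-check a robots.txt rule with Python's standard-library urllib.robotparser before deploying it (the example.com URLs are just placeholders):

```python
from urllib.robotparser import RobotFileParser

# Parse the rules directly, without fetching a live robots.txt
rules = [
    "User-agent: *",
    "Disallow: /pages/privacy/",
]
rp = RobotFileParser()
rp.parse(rules)

# The privacy path is blocked; other paths remain crawlable
print(rp.can_fetch("*", "https://example.com/pages/privacy/"))  # False
print(rp.can_fetch("*", "https://example.com/blog/"))           # True
```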