Duplicate Content in WordPress Taxonomies & Noindex, Follow
-
Hello Moz Community,
We are seeing duplicate content issues in our Moz report for our WordPress site's Tag pages. After a bit of research, it appears one of the best solutions is to set Tag pages to "noindex, follow" within Yoast. That makes sense, but we have a few questions:
In doing this, how are we affecting our opportunity to show up in search results?
Are there any other repercussions to making this change?
What would it take to make the content on these pages be seen as unique?
-
Hi Corey
Don pretty much nailed this answer. I'll make sure to answer your specific questions:
-
You're not negatively affecting your ability to show up in search results by noindexing tags. Tag archives almost never rank or earn traffic, since they're just pages of thin content, and noindexing them does not affect your other pages.
-
No other repercussions - unless you have a particular tag archive that's actually getting traffic. Check your analytics; if one is, you can leave those specific tags indexed individually with the Yoast plugin. I wrote a post all about this a while back: http://www.evolvingseo.com/2012/08/10/clean-sweep-yo-tag-archives-now/
-
You would have to figure out how to write unique title tags and content for each tag page - which is honestly more effort than it's worth. The Moz tool flags duplicate content as an 'error', but the real issue is simply that tag pages don't carry much value; they can just be noindexed and then ignored.
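If you want to confirm the Yoast change actually took effect, one quick sanity check is to fetch a tag archive's HTML and look for a noindex directive in its robots meta tag. Here's a minimal sketch using only the Python standard library (the sample markup below is hypothetical, not taken from any specific site):

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags in a page."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", "").lower())


def is_noindexed(html):
    """Return True if the page's robots meta includes a noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)


# Roughly what a Yoast-noindexed tag archive's <head> would contain:
page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(is_noindexed(page))  # True
```

In practice you'd pair this with `urllib.request` (or a crawl export) to spot-check a handful of tag URLs after flipping the Yoast setting.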
-
-
Hey there. Please have a look at this post - I believe it has everything you'll need to know. Thanks again for the help, Don!
https://mza.seotoolninja.com/community/q/different-wp-taxonomies-seen-as-duplicate-content
-
The problem you are experiencing comes from the archive pages WordPress creates every time you add a new tag, category, author, or other archive type. The issue occurs whenever you tag or categorize a post or page. For instance, suppose "Why Do We Need SEO" is the first ever post on your site, you tag it with "SEO Best Practices", and you categorize it as "SEO". You now have three archive pages with duplicate content: the author page, the "SEO Best Practices" tag page, and the "SEO" category page - because each archive page contains only that same post. As you write more posts, the duplicate pages may disappear, depending on how you organize your content. But if you create lots of tags and tag everything, chances are you will always have duplicate content.
To avoid a penalty, you should noindex these archive pages. But if you are disciplined in how you organize your tags and categories, you will not have a problem.
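Don's point can be illustrated with a toy model: with a single post, every archive that includes it (author, tag, category) renders identical content, which is exactly what a crawler comparing page bodies will flag. This is a hypothetical sketch, not how any crawler actually works internally:

```python
import hashlib
from collections import defaultdict


def render_archive(posts):
    """Toy render: an archive page is just its posts' bodies joined together."""
    return "\n".join(p["body"] for p in posts)


def find_duplicates(archives):
    """Group archive pages whose rendered content hashes identically."""
    by_hash = defaultdict(list)
    for name, posts in archives.items():
        digest = hashlib.sha256(render_archive(posts).encode()).hexdigest()
        by_hash[digest].append(name)
    return [names for names in by_hash.values() if len(names) > 1]


# One post, tagged and categorized: three archives, all with the same content.
post = {"title": "Why Do We Need SEO", "body": "Because rankings matter."}
archives = {
    "author/don": [post],
    "tag/seo-best-practices": [post],
    "category/seo": [post],
}
print(find_duplicates(archives))
# All three archive pages are flagged as duplicates of one another.
```

Adding a second post that shares only the category would leave the tag and author archives distinct, which is why disciplined tagging makes the duplicates thin out over time.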
Thanks,
Don