How to stop /tag creating duplicate content - WordPress
-
Hi,
I keep getting alerts for duplicate content. It seems WordPress is creating it through /tag archive pages:
https://www.curveball-media.co.uk/tag/cipr/
https://www.curveball-media.co.uk/tag/pr-agencies/
Is it something in the way we've got WordPress set up?
-
@pau4ner I'm with you: /tag/ pages likely aren't the end of the world. But if someone were concerned, why not just disallow them in robots.txt?
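If you did want to block them, a minimal robots.txt sketch (assuming the tag archives all live under /tag/, as in the URLs above) would be:

```
User-agent: *
Disallow: /tag/
```

One caveat: robots.txt only blocks crawling, not indexing. A disallowed URL can still appear in results if it's linked from elsewhere, and Googlebot can no longer see a noindex tag on a page it isn't allowed to crawl, so for duplicate-content worries a noindex directive is usually the safer choice.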
-
@curveballmedia If you are using an on-page SEO plugin such as Yoast SEO, you can head to Search Appearance > Taxonomies, look for Tags, and choose not to show tags in search results.
This effectively noindexes your tag archive pages, so you don't have to fear any duplicate content issues any more (although I wouldn't worry too much about it, honestly).
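If you'd rather not depend on a plugin, a minimal sketch of the same noindex approach for a theme's functions.php (assuming WordPress 5.7+, which introduced the wp_robots filter) could look like:

```php
// Sketch: emit "noindex, follow" on tag archive pages only.
// Assumes WordPress 5.7+ (wp_robots filter); goes in the active theme's functions.php.
add_filter( 'wp_robots', function ( array $robots ) {
    if ( is_tag() ) {
        $robots['noindex'] = true;  // keep tag archives out of the index
        $robots['follow']  = true;  // but still follow the links they contain
    }
    return $robots;
} );
```

Either way, leave the tag pages crawlable so Googlebot can actually see the noindex directive.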
Related Questions
-
Using same copy on different domain
I have a client that currently has a .com domain (not using hreflang). They have a new partner in the UK who wants to replicate the website on a .co.uk domain under a different brand name. Will this cause any SEO issues?
Intermediate & Advanced SEO | bedynamic
-
OK to have multiple domains with separate websites ranking for similar keywords?
Is it OK to have multiple domains (separate websites, different content) ranking for similar keywords? Does Google actively discourage multiple domains from the same company (completely different sites) appearing in the search results for the same or similar keywords if the content is slightly different? This is where the main site has the details and you can purchase the product, and the second site is a blog site only. We are creating a separate content blog site on a second domain, related to one portion of the content on the main site. They would be linking back and forth, or maybe the blog site would just link over to the main site so visitors can purchase said product. This is a similar scenario to give you an idea of how it would be structured: MAIN SITE: describes a few products, and you can purchase from this site. SECOND SITE (different domain): a blog site that contains personal experiences with one of the products. Both sites would be linked back and forth, or, as mentioned, maybe the blog site would just link over to the main site. The logo would be a modified version of the main logo, and the look and feel of the site would be similar but not exactly the same. MORE INFO: the main site has existed for well over 10 years and is starting to gain some traction in an extremely competitive market. It does not rank super high, but is gaining traction due to improvements in speed, content, on-page SEO, etc. So in addition to my main question of whether this second domain is OK: will it hurt the rankings or negatively affect the main site, and will we have duplicate content issues? If this is not the right place to ask this, then where should I be asking? Thank you!
SEO Tactics | fourwhitesocks
-
Non-standard HTML tags in content
I had coded my website's article content with a non-standard tag, <cnt>, that surrounded the standard tags containing the article content. The whole text was enclosed in a div that used Schema.org markup to identify the contents of the div as the articleBody. When looking at scraped data for stories in Webmaster Tools, the content of the story was there and identified as the articleBody correctly. It's recently been suggested by someone else that the presence of the non-standard <cnt> tags was actually making the content of the article uncrawlable by Googlebot, effectively rendering the content invisible. I did not believe this to be true, since the content appeared to be correctly indexed in Webmaster Tools, but for the sake of a test I agreed to remove them. In the six weeks since they were removed, there have been no changes in impressions or traffic from organic search, which leads me to believe that the removal of the <cnt> tags had no effect, since the content was already being indexed successfully and nothing else has changed. My question is whether an encapsulating non-standard tag as I've described would actually make the content invisible to Googlebot, or whether it should make no difference so long as the correct Schema.org markup is in place? Thank you.
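A minimal sketch of the markup being described (everything other than the <cnt> wrapper is an assumed example, not taken from the actual site):

```html
<!-- Sketch only: the inner tags and attribute values are assumptions -->
<div itemscope itemtype="https://schema.org/Article">
  <div itemprop="articleBody">
    <cnt>
      <p>First paragraph of the article...</p>
      <p>Second paragraph...</p>
    </cnt>
  </div>
</div>
```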
Technical SEO | dlindsey
-
Is duplicate content OK if it's on LinkedIn?
Hey everyone, I am doing a duplicate content check using Copyscape, and realized we have used a ton of the same content on LinkedIn as on our website. Should we change the LinkedIn company page to be original? Or does it matter? Thank you!
Technical SEO | jhinchcliffe
-
How do I deal with duplicate content?
Hi, I'm trying SEOmoz and it's saying that I've got loads of duplicate content. We provide phone numbers for cities all over the world, so we have pages like this:
https://www.keshercommunications.com/Romaniavoipnumbers.html
https://www.keshercommunications.com/Icelandvoipnumbers.html
and so on, one for every country. The question is: how do I create pages for each one without them showing up as duplicate content? Each page is generated by the server, but it's impossible to write unique text for each one. Also, the competition seem to have done the same, but Google is listing all their pages when you search for 'DID numbers'. Look for DIDWW or MyDivert.
Technical SEO | DanFromUK
-
Pages with different content and meta descriptions marked as duplicate content
I am running into an issue where I have pages with completely different body content and meta descriptions, but they are still being flagged as having the same content (Duplicate Page Content error). What am I missing here? Examples:
http://www.wallstreetoasis.com/forums/what-to-expect-in-the-summer-internship
and
http://www.wallstreetoasis.com/blog/something-ventured
http://www.wallstreetoasis.com/forums/im-in-the-long-run
and
http://www.wallstreetoasis.com/image/jhjpeg
Technical SEO | WallStreetOasis.com
-
Why do I get duplicate content errors just for tags I place on blog entries?
In the SEOmoz crawl diagnostics for my site, www.heartspm.com, I am getting over 100 duplicate content errors on links built from tags on blog entries. I do have the original base blog entry in my sitemap, not referencing the tags. Similarly, I am getting almost 200 duplicate meta description errors in Google Webmaster Tools associated with links automatically generated from tags on my blog. I have more understanding of how I could get these errors from my forum, since the forum entries are not in the sitemap, but the blog entries are there in the sitemap. I thought the tags were only there to help people search by category. I don't understand why every tag becomes its own link. I can see how this falsely creates the impression of a lot of duplicate data. As seen in GWT, pages with duplicate meta descriptions include:
"Customer concerns about the use of home water by pest control companies" - /category/job-site-requirements, /tag/cost-of-water, /tag/irrigation-usage, /tag/save-water, /tag/standard-industry-practice, /tag/water-use (6 pages)
"Pest control operator draws analogy between Children's Day and the state of the pest control industry" - /tag/children-in-modern-world, /tag/children, /tag/childrens-day, /tag/conservation-medicine, /tag/ecowise-certified, /tag/estonia, /tag/extermination-service, /tag/exterminator, /tag/green-thumb, /tag/hearts-pest-management, /tag/higher-certification, /tag/higher-education, /tag/tartu, /tag/united-states
Technical SEO | GerryWeitz
-
Strange duplicate content issue
Hi there, the SEOmoz crawler has identified a set of duplicate content that we are struggling to resolve. For example, the crawler picked up that this page, www.creative-choices.co.uk/industry-insight/article/Advice-for-a-freelance-career, is a duplicate of this page: www.creative-choices.co.uk/develop-your-career/article/Advice-for-a-freelance-career. The latter page's content is the original and can be found in the CMS admin area, whilst the former page is the duplicate and has no entry in the CMS. So we don't know where to begin if the "duplicate" page doesn't exist in the CMS. The crawler states that this page, www.creative-choices.co.uk/industry-insight/inside/creative-writing, is the referrer page. Looking at it, only the original page's link is showing on the referrer page, so how did the crawler get to the duplicate page?
Technical SEO | CreativeChoices