"Issue: Duplicate Page Content" in Crawl Diagnostics - but these pages are noindex
-
I saw a question back in 2011 about this, and I'm experiencing the same issue: http://moz.com/community/q/issue-duplicate-page-content-in-crawl-diagnostics-but-these-pages-are-noindex
We have pages that are meta-tagged as no-everything for bots but are being reported as duplicate. Any suggestions on how to exclude them from the Moz bot?
-
Technically that could be done in your robots.txt file, but a blanket Disallow would stop Google from crawling those pages too. I'm not sure whether Rogerbot can be targeted on its own. Sorry I couldn't be more help.
If you don't get one of the staffers on here in the next few days, I would send a ticket to them for clarification.
If you decide to go with robots.txt here is a resource from Google on implementing and testing it. https://support.google.com/webmasters/answer/156449?hl=en
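If Rogerbot does honor its own user-agent group in robots.txt (worth confirming in Moz's docs), a minimal sketch would look like the following. The /slides/ path here is hypothetical, just standing in for wherever the slide pages live:

```text
# robots.txt — block only Moz's crawler from the slide pages,
# leaving Googlebot and other bots unaffected.
# Assumes Rogerbot honors a user-agent group named "rogerbot";
# /slides/ is a hypothetical path, replace with your own.
User-agent: rogerbot
Disallow: /slides/

# All other bots: no restrictions.
User-agent: *
Disallow:
```

Per-bot groups like this are exactly what the Google testing tool in the link above will let you validate.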
-
Thanks for the information on Rogerbot. I understand the difference between the bots from Google and Moz.
Some errors reported in Moz are not real. For example we use a responsive slider on the home page that generates the slides from specific pages. These pages are tagged to no-everything so as to be invisible to bots, yet they are generating errors in the reports.
Is there any way to exclude some pages from the reports?
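For reference, the "no-everything" tag on those slide pages is the standard meta robots directive. The important nuance is that it tells bots not to index the page or follow its links, but it does not stop them from crawling (fetching) the page in the first place:

```html
<!-- Standard "no-everything" meta robots tag, placed in <head>.
     This controls indexing and link-following only; a crawler can
     still fetch the page and report on it. -->
<meta name="robots" content="noindex, nofollow">
```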
-
Don't forget that Rogerbot (Moz's crawler) is a robot, not an index like Google. Google uses robots to gather the data, but the results we see are an index. Rogerbot will crawl the pages regardless of noindex or nofollow.
Here is more info on RogerBot http://moz.com/help/pro/rogerbot-crawler
Related Questions
-
ECommerce Duplicate content on product pages (eg delivery info, contact details etc)
Hi, we're running a Magento site and wanted to check about duplicate page content. We have 1000+ product pages, and it has been suggested that we remove some of the "duplicated content" which displays on every product page and replace it with an image of the same text content. By this I mean content which is there for promo/customer purposes and is displayed on every page, e.g. "If you find our products cheaper elsewhere then please click below to get your price match..." etc., plus a chunk of text for the "Delivery Tab Information" and "Contact Tab Information" on each and every product page. An SEO company has suggested turning this content into images. Does anyone have thoughts on this please?
On-Page Optimization | Ampweb
-
Duplicate page content
Hi, crawl errors are showing 2 pages of duplicate content for my client's WordPress site: /news/ and /category/featured/. Yoast is installed, so how best to resolve this? I see that both pages are canonicalised to themselves, so I presume I just need to change the canonical tag on /category/featured/ to reference /news/ (since /news/ is the page with higher authority and the main page for showing this info)? Or is there another way in Yoast or WP to deal with this and prevent it from happening again? Cheers, Dan
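The fix described in the question would amount to the following tag in the head of /category/featured/ (example.com stands in for the client's actual domain; in Yoast this is set via the "Canonical URL" field in the page's advanced settings rather than hand-edited):

```html
<!-- On /category/featured/, point the canonical at /news/ instead
     of at itself, consolidating the duplicate signals. -->
<link rel="canonical" href="https://example.com/news/">
```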
On-Page Optimization | Dan-Lawrence
-
Will "internal 301s" have any effect on page rank or the way in which an SE sees our site interlinking?
We've been forced (for scalability) to completely restructure our website in terms of setting out a hierarchy. For example, the old structure was: country / city / city area, where we had about 3500 nicely interlinked pages for relevant things like taxis, hotels, apartments etc. in that city. We needed to change the structure to: country / region / area / city / city area. So as part of the change we put in place lots of 301s for the permanent movement of pages to the new structure, and then we tried to change the physical on-page links too. Unfortunately we have left a good 600 or 700 links that point to the old pages but are picked up by the 301 redirect, so we're slowly going through them to ensure the links go to the new location directly (not via the 301). So my question is (sorry for the long waffle): whilst it must surely be best practice for all on-page links to go directly to the 'right' page, are we harming our own interlinking and even PageRank by being tardy in working through them manually? Thanks for any help anyone can give.
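The kind of permanent redirects described above can be sketched in an Apache .htaccess file like this (the paths are hypothetical placeholders, and this assumes an Apache host with mod_alias available):

```text
# .htaccess sketch: 301 an old-structure URL to its new-structure
# equivalent. Paths are illustrative only — one line per moved page,
# or a RewriteRule pattern if the mapping is regular.
Redirect 301 /uk/london/soho/ /uk/england/greater-london/london/soho/
```

A 301 preserves most link equity, but as the question notes, pointing internal links directly at the final URL is still preferable to routing your own links through the redirect.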
On-Page Optimization | TinkyWinky
-
Duplicate Pages software
Hey guys, I was told a few hours ago about a system that can take a few of your keywords and automatically create new links and pages (in the sitemap file) for your website, so that a site built with 20 pages (for example) will appear to search engines as a site with hundreds of pages, which is supposed to help SEO. Has anyone heard of such software? Is it legal? Any advice you can give on this matter? Thanks, i.
On-Page Optimization | iivgi
-
Duplicate Home Page
Hi, I have a question around best practice for duplicate home pages. The /index.aspx page is showing up as a top referrer in my analytics. I have the rel=canonical tag implemented for www.mysite.com on both pages. Do I need to 301 /index.aspx to mysite.com? I have a lot of links pointing to /index.aspx (half of those are coming from mysite.com). www.mysite.com/index.aspx www.mysite.com Many thanks, Jon
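Since /index.aspx suggests an IIS server, the 301 being asked about could be sketched as an IIS URL Rewrite rule like the one below. This assumes the URL Rewrite module is installed, and the rule name is just illustrative:

```xml
<!-- web.config fragment (inside <system.webServer><rewrite><rules>):
     permanently redirect /index.aspx to the site root, so both the
     canonical tag and a 301 point at the same preferred URL. -->
<rule name="IndexAspxToRoot" stopProcessing="true">
  <match url="^index\.aspx$" />
  <action type="Redirect" url="/" redirectType="Permanent" />
</rule>
```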
On-Page Optimization | JonRaubenheimer
-
Duplicated Content with joomla multi language website
Dear SEOmoz Community

I am running a multi-language Joomla website (www.siam2nite.com) with 2 active languages. The first and primary language is English; the second is Thai. Most of the content (articles, event descriptions, ...) is in English only. What we did is a Thai translation of the navigation bars, headers, titles etc. (a translation of all Joomla language files); those texts are static and only help Thai users navigate and understand our site.

Now I am facing a problem with duplicated content. Let's take our Q&A component as an example. The URL structure looks like this: English - www.siam2nite.com/en/questions/ and Thai - www.siam2nite.com/th/questions/. Every question asked creates two URLs, one for each language. The content itself (user questions & answers) is identical on both URLs; only the GUI language is different. If you take a look at this question you will understand what I mean: ENGLISH VERSION: http://www.siam2nite.com/en/questions/where-to-celebrate-halloween-in-bangkok THAI VERSION: http://www.siam2nite.com/th/questions/where-to-celebrate-halloween-in-bangkok As you can see, each page has a unique title (H1) and introduction text in the correct language (same for menu, buttons, etc.), but the questions and answers are only available in one language.

Now my question 😉 I guess Google will see these pages as duplicated content. How should I proceed: put all Thai links under /th/questions/ in robots.txt and block them, or set a canonical tag pointing at the English versions? I'm not sure whether, if I set a canonical tag, Google will still index the Thai titles and introduction texts (they have important Thai keywords in them). Would really appreciate your help on this 😉 Regards, Menelik
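The canonical option being weighed in the question would look like this on the Thai URL (URLs taken from the question itself). One caveat worth hedging: a canonical consolidates the pages, so the Thai title and introduction text would generally not be indexed separately:

```html
<!-- Placed in the <head> of the Thai version: declare the English
     version canonical, treating the Thai page as a duplicate. -->
<link rel="canonical"
      href="http://www.siam2nite.com/en/questions/where-to-celebrate-halloween-in-bangkok">
```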
On-Page Optimization | menelik
-
Checking for content duplication against content on your own site.
We are currently trying to rewrite our product descriptions, and I'm afraid some of the salespeople writing the descriptions are plagiarizing one another's writing. Is there a content duplication checker that will let you check a piece of writing against a specific site rather than against the whole web?
On-Page Optimization | MichealGooden
-
Duplicate pages
Hi, I am using a CMS that generates dynamic URLs which, according to the SEOmoz tool, will be indexed as duplicate pages. The pages in question are forms, blog posts etc. that are not crucial for ranking. I do worry, though, about the consequences of having 20 (non-duplicate) pages with static URLs and about 100 pages that are duplicates with dynamic URLs. What consequences will this have for the speed at which the robots crawl the site, and could there be negative effects on ranking for the entire domain?
On-Page Optimization | vibelingo