Category Pages For Distributing Authority But Not Creating Duplicate Content
-
I read this interesting Moz guide: http://moz.com/learn/seo/robotstxt, which I think answered my question, but I just want to make sure.
I take it to mean that if I have category pages containing nothing but duplicate content (lists of other pages: an h1 title, on-page description, and link for each), and I still want the category pages to distribute their link authority to the individual pages, then I should leave the category pages in the sitemap and meta noindex them, rather than block them in robots.txt. Is that correct?
Again, I don't want the category pages to be indexed or to cause a duplicate content issue, but I do want them to be crawled enough to distribute their link authority to the individual pages.
Given the scope of the site (thousands of pages and hundreds of categories), I just want to make sure I have that right. Up until my recent efforts on this, some of the category pages have been robots.txt'd out yet still in the sitemap, while others (with a different URL structure) have been in the sitemap but not robots.txt'd out.
Thanks! Best.. Mike
-
Thanks, Jane! I really appreciate it.
If the now noindexed category pages have already been indexed, do you think I should request removal from the index as well?
Best... Mike
-
"I still want the category pages to distribute their link authority to the individual pages, then I should leave the category pages in the site map and meta noindex them, rather than robots.txt them. Is that correct?"
This will achieve the goal, yes. You would ideally include noindex, follow (as opposed to nofollow) in the meta tag of the page you want to exclude. This means that Google crawls the page in full and allows PageRank to flow from that page to the pages it links to, but doesn't include any of the page's content or its URL in the index.
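As a sketch of what that looks like in practice (the tag itself is standard; where your template puts it is up to your CMS), the directive sits in the head of each category page:

```html
<!-- In the <head> of each category page: crawl this page and follow
     its links, but keep the page itself out of the index -->
<meta name="robots" content="noindex, follow">
```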
If you exclude the page via robots.txt, Google never crawls the page at all. You sometimes see URLs whose pages have been excluded via robots.txt showing up in Google's index, because robots.txt doesn't say "don't index this URL"; it simply says "don't crawl it." That's also why excluding a page in robots.txt and putting a noindex meta tag on the page would be redundant - Google would never see the noindex tag because it would never crawl the page.
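To illustrate the crawling side, here is a quick sketch using Python's standard-library `urllib.robotparser`. The rules and URLs are hypothetical, just mirroring the situation described: a blocked category path and an unblocked product path.

```python
from urllib import robotparser

# Hypothetical robots.txt rules mirroring the question: category URLs
# blocked from crawling, everything else left crawlable.
ROBOTS_TXT = """\
User-agent: *
Disallow: /category/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A blocked category page: never crawled, so it cannot pass link
# authority, and a noindex tag on it would never even be seen.
print(rp.can_fetch("*", "http://example.com/category/widgets"))   # False

# An unblocked page: crawled as normal, so a noindex, follow meta tag
# on it is seen and respected.
print(rp.can_fetch("*", "http://example.com/products/widget-1"))  # True
```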
Related Questions
-
Should we add engaging and useful FAQ content to all our pages, or rather not, because of duplication and reduced unique content?
We are considering adding, at the end of all 1,500 of our product pages, answers to the nine most frequently asked questions. These questions and answers will be 90% identical across all our products; personalizing them further is not an option, and not really necessary, since most questions relate to the process of reserving the product. We are convinced this will increase user engagement with the page and time on page, and it will be genuinely useful for visitors, as most will never visit the separate FAQ page. It will also add more related keywords/topics to the page.
Intermediate & Advanced SEO | lcourse
On the downside, it will reduce the percentage of unique content per page and add duplication. Any thoughts on whether, in terms of Google rankings, we should go ahead, and whether the benefits in engagement may outweigh the downside of duplicated content?
-
Duplicate content errors on a new website. How do you know which page to put the rel canonical tag on?
I am having problems with duplicate content. This is a new website, and all the pages have the same page and domain rank; the following is an example for the homepage. How do you know which page to use the canonical tag on? http://medresourcesupply.com/index.php http://medresourcesupply.com/ Would this be the correct way to use this? Here is another example where Moz says these are duplicates. I can't figure out why, because they have different URLs and content. http://medresourcesupply.com/clutching_at_the_throat http://medresourcesupply.com/index.php?src=gendocs&ref=detailed_specfications&category=Main
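As a sketch of what a canonical for the first pair might look like (assuming the bare domain is the version you want indexed):

```html
<!-- Placed in the <head> of http://medresourcesupply.com/index.php,
     telling search engines the bare domain is the preferred URL -->
<link rel="canonical" href="http://medresourcesupply.com/">
```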
Intermediate & Advanced SEO | artscube.biz
-
Ecommerce: A product in multiple categories with a canonical to create a ‘cluster’ in one primary category Vs. a single listing at root level with dynamic breadcrumb.
OK – bear with me on this… I am working on some pretty large ecommerce websites (50,000 + products) where it is appropriate for some individual products to be placed within multiple categories / sub-categories. For example, a Red Polo T-shirt could be placed within: Men’s > T-shirts >
Intermediate & Advanced SEO | AbsoluteDesign
Men’s > T-shirts > Red T-shirts
Men’s > T-shirts > Polo T-shirts
Men’s > Sale > T-shirts
Etc. We’re getting great organic results for our general T-shirt page (for example) by clustering creative content within its structure – Top 10 tips on wearing a t-shirt (obviously not, but you get the idea). My instinct tells me to replicate this with products too. So, of all the location mentioned above, make sure all polo shirts (no matter what colour) have a canonical set within Men’s > T-shirts > Polo T-shirts. The presumption is that this will help build the authority of the Polo T-shirts page – this obviously presumes “Polo Shirts” get more search volume than “Red T-shirts”. My presumption why this is the best option is because it is very difficult to manage, particularly with a large inventory. And, from experience, taking the time and being meticulous when it comes to SEO is the only way to achieve success. From an administration point of view, it is a lot easier to have all product URLs at the root level and develop a dynamic breadcrumb trail – so all roads can lead to that one instance of the product. There's No need for canonicals; no need for ecommerce managers to remember which primary category to assign product types to; keeping everything at root level also means there no reason to worry about redirects if product move from sub-category to sub-category etc. What do you think is the best approach? Do 1000s of canonicals and redirect look ‘messy’ to a search engine overtime? Any thoughts and insights greatly received.0 -
Using author on every page of website?
I'm currently getting to grips with schema, and one thing I'm using is author markup on my blog posts; I'm seeing my photo etc. on related organic searches. I see one of my competitors is using author on every page of their website, not just blog posts. Are there any recommendations on when it should be used? Should it be site-wide, or is it really intended for blog posts? Would it be wrong for me to use it on every page of my website, given that one of my businesses is just myself as a lone person? This is what you get when searching for driving lessons in just about any town! https://www.google.co.uk/#gs_rn=15&gs_ri=psy-ab&tok=LS_DOrAHswmHC9_8AJZEJA&suggest=p&pq=driving instructor brighton&cp=20&gs_id=1k2&xhr=t&q=driving+lessons+crawley&es_nrs=true&pf=p&sclient=psy-ab&oq=driving+lessons+craw&gs_l=&pbx=1&bav=on.2,or.r_cp.r_qf.&bvm=bv.47244034,d.d2k&fp=45c2f917e11bca99&biw=1680&bih=843 Any comments welcome! Antony
Intermediate & Advanced SEO | Ant71
-
Duplicate content: is it possible to write a page, delete it and use it for a different site?
Hi, I have a simple question. Some time ago I built a site and added pages to it. I have since found out that the site was penalized by Google, and I have neglected it. The problem is that I had written well-optimized pages on that site which I would like to use on another website. So my question is: if I delete a page I wrote on site 1, can I use it on site 2 without being penalized by Google for duplicate content? Please note: site 1 would still be online; I would simply delete some pages and use them on site 2. Thank you.
Intermediate & Advanced SEO | salvyy
-
HTTPS Duplicate Content?
I just received an error notification because our website is available as both http and https: http://www.quicklearn.com & https://www.quicklearn.com. My tech tells me that this isn't actually a problem. Is that true? If not, how can I address the duplicate content issue?
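One common way to resolve an http/https duplicate is a site-wide 301 to a single version. A sketch in Apache .htaccess terms (assuming the site runs Apache with mod_rewrite enabled; the hostname is from the question above):

```apache
# Force HTTPS with a 301 so only one version of each URL
# can be crawled and indexed
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.quicklearn.com/$1 [R=301,L]
```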
Intermediate & Advanced SEO | QuickLearnTraining
-
Content that is split into 4 pages, should I consolidate?
I am working on improving a website that has each section split into four pages. For example, if Indonesia Vacation were a section, it would have its main page, www.domain.com/indonesia-vacation, plus the about, fact sheet, and tips content on three other pages: www.domain.com/indonesia-vacation-1 www.domain.com/indonesia-vacation-2 www.domain.com/indonesia-vacation-3 The pages share very similar title tags, and I am worried this is hurting the main page's placement. So, to conserve link juice, would it make sense to make them all one page? There is not so much content that it would affect load time. My strategy would be to make all the content part of the main page and 301 the three URLs back to the main page: www.domain.com/indonesia-vacation. Any insight would be greatly appreciated!
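The 301 strategy described above could be sketched in Apache .htaccess terms like this (assuming Apache; the paths are the hypothetical ones from the question):

```apache
# Consolidate the three split pages into the main page with 301s
Redirect 301 /indonesia-vacation-1 /indonesia-vacation
Redirect 301 /indonesia-vacation-2 /indonesia-vacation
Redirect 301 /indonesia-vacation-3 /indonesia-vacation
```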
Intermediate & Advanced SEO | MattAaron
-
Duplicate content ramifications for country TLDs
We have a .com site here in the US that is ranking well for targeted phrases. The client is expanding its sales force into India and South Africa, and wants to duplicate the site entirely, twice, once for each country. I'm not well-versed in international SEO. Will this trigger a duplicate content filter? Would google.co.in and google.co.za look at google.com's index for duplication? Thanks. Long-time lurker, first-time question poster.
Intermediate & Advanced SEO | Alter_Imaging