Should I noindex, nofollow a lot of child pages?
-
Some category pages on my website have A LOT of product pages in them. Many of these pages do not receive any sort of organic traffic. Is there a reason for Google to be crawling these pages? Should I noindex, nofollow these pages to make Google's life a little easier? Could I possibly see some benefits from this or should I leave it the way it is?
-
It all comes down to the amount of content on the pages. Hyper-targeted pages, like individual size pages, usually don't get enough editorial attention or have enough unique content written about them to make them valuable. In that case, a main page listing the sizes will suffice: because those queries are so extremely long tail, that page will rank for the sizes as long as the content is on it, and consolidating this way protects you from Panda. I just went through this exercise with a pool company that had several brands and multiple sizes, and we opted not to get so granular.
-
Having pages that no one goes to won't negatively affect the rankings of the rest of your site.
In your case, if you're really concerned about those obscure pages and they're thin or duplicate content, you could choose one really popular product in each category (such as "6x6x4 Red Foil Gift Box") and add canonical tags to the other similar products in that category (i.e., all the other red box pages) pointing to the popular one.
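Mechanically, a canonical tag is just a link element in the `<head>` of each near-duplicate page. A minimal sketch, with hypothetical URLs:

```html
<!-- On each near-duplicate product page, e.g. the "5 1/2 x 11 1/2 x 1
     Red Glossy Box" page (URL paths here are made up for illustration),
     point Google at the one popular product you chose: -->
<head>
  <link rel="canonical" href="https://www.example.com/red-boxes/6x6x4-red-foil-gift-box" />
</head>
```

Google treats this as a strong hint (not a command) to consolidate indexing and ranking signals onto the canonical URL, so the obscure variants stop competing with the popular page.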
-
I would say leave them in there as you want Google to see everything on your site that might be relevant to the user.
I think you are referring to what some call Crawl Optimization or Crawl Budget (great article here: http://www.blindfiveyearold.com/crawl-optimization), and yes, there is something to making sure that you do not waste Google's time crawling pages that do not matter.
I would still argue that product pages are worth Google's time if you have good content; after all, these are the things that you sell. It seems that Google would want to see not only your category pages, but also what items are in each category. One thing to note: all of those "red box" product pages link up to the "red box" category page. That is part of what makes the "red box" category authoritative within your site, because you are telling Google this with your internal link structure. You may find that if you noindex your product pages, your category pages go down.
The use of noindex/nofollow for crawl optimization is really meant for pages like internal search results, or pages that can be re-sorted 100 different ways under 100 different URLs. Those are all duplicates and waste Google's time. Your product pages are a different animal, so my vote would be to keep them in the crawl.
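For those sorted/filtered duplicates, the directive goes in the page's `<head>`. A sketch, assuming a hypothetical re-sorted listing URL:

```html
<!-- On an internal search result or re-sorted listing page, e.g.
     /red-boxes?sort=price_asc (hypothetical URL) -- not on the
     actual product pages: -->
<head>
  <meta name="robots" content="noindex, follow" />
</head>
```

Note the "noindex, follow" combination: the page stays out of the index, but Google can still follow its links to the real product pages, so you don't cut off internal link equity the way "noindex, nofollow" would.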
-
Thanks for the response. The thing is, these small product pages are too obscure for anyone to find. Many are differentiated only by product size.
For example:
If someone types in "red boxes" into google, we will rank for “red boxes”. On this category page, we have 24 different red box products such as "6x6x4 Red Foil Gift Box" and "5 1/2 x 11 1/2 x 1 Red Glossy Box." Sure, if somebody searched for a "6x6x4 Red Foil Gift Box," our page would show. But nobody is going to search that. There are ways to eliminate some of these extra pages, but it would be less user friendly for a customer clicking through the site.
My question is, if Google sees that these pages consistently get no organic traffic, even if they might rank for some obscure keyword, can keeping these pages have a negative effect on my rankings?
-
I would think you would want Google to find your product pages and then send you traffic for them. I don't think the solution is to use noindex, as that would take them out of the index for sure.
I am betting that your site architecture, the way your sitemap is set up, or possibly thin content across all the product pages is more of the issue.
If you don't want to work on any of those things, sure, you can noindex all of your product pages, but then it just seems like you are giving up and limiting your long-term outlook for ranking pages in Google.
The only reason I would use noindex in a case like yours would be to keep duplicate product or category pages out of the index. I would also use it to keep Google out of any of your search result pages, shopping cart, etc. Those are the pages that are wasting Google's time. That brings up another point: are you having Google crawl a bunch of duplicate content on your site, and is that why it never gets to the "good" content pages?
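One practical point worth adding: a noindex tag only takes effect after Googlebot crawls the page, so it doesn't save crawl budget by itself. To actually stop the crawl of search results and cart pages, robots.txt is the usual tool. A minimal sketch, assuming hypothetical /search and /cart paths (adjust to your site's real URL structure):

```text
# robots.txt at the site root -- paths are illustrative examples
User-agent: *
Disallow: /search
Disallow: /cart
```

One caveat: don't combine robots.txt blocking with a noindex tag on the same URL, since a blocked page can never be crawled and the noindex will never be seen.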
Good luck!