Should I prevent Google from indexing blog tag and category pages?
-
I am working on a website with a regularly updated WordPress blog, and I'm unsure whether the category and tag pages should be indexable.
The blog posts are often outranked by the tag and category pages, which is ultimately leaving me with a duplicate content issue. With this in mind, I assumed the best thing to do would be to remove the tag and category pages from the index, but after speaking to someone else about it, I am no longer sure. I have tried researching online, but I haven't found anything that provides further information.
Can anyone with experience of dealing with issues like this, or with any knowledge of the topic, help me resolve it?
Any input will be greatly appreciated.
Thanks
Paul
-
Of course, if you put the link juice you save by not indexing the tag pages towards your articles, this "could" bring in just as much traffic as you lost by blocking those pages, or more. I guess that is my question: does blocking certain pages on your site put your site's link juice towards other pages?
-
I have decided to add content to the tag and category pages to make them more valuable and reduce the risk of duplicate content issues. The pages also only feature short snippets from the posts, so with the addition of fresh content this hopefully shouldn't be too much of a problem.
Thanks
Paul
-
If most of your posts have the same tags and categories, your tag and category archives will look a lot like each other, and like your main page, which could lead to duplicate content issues. Many people choose to have either tags or categories indexed and block the other. If you can make your tag or category archive pages excerpt-only, they will be good for navigation and less likely to look like duplicates. Like EGOL said, category pages can be a pretty big draw - especially when they are full of keyword-rich excerpts.
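If you do decide to deindex one archive type, the usual mechanism is a robots meta tag in the head of those pages rather than a robots.txt block, since a noindexed page can still be crawled and pass link equity through its links. A minimal sketch (generic markup, not taken from any specific theme):

```
<!-- Output on tag archive pages only: keep them crawlable but out of the index -->
<meta name="robots" content="noindex, follow">
```

Many WordPress SEO plugins expose a per-archive-type toggle that outputs exactly this tag, so hand-editing templates is rarely necessary.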
-
I used my robots.txt file to block my tag pages, and they were removed from Google's index as they should have been; however, I didn't notice a ranking boost. I'm curious what others have to say on this topic too.
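For reference, a robots.txt rule for blocking WordPress tag archives (assuming the default /tag/ permalink base) is as simple as:

```
User-agent: *
Disallow: /tag/
```

One caveat: robots.txt only blocks crawling, so already-indexed URLs can linger in the index as URL-only listings; a noindex robots meta tag is the more reliable way to get pages dropped.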
-
Thanks!! Good answer! I'm going to leave the pages as they are.
Thanks
Paul
-
Find out how much traffic you would be walking away from if you block them from the index. My category pages bring in huge traffic - more than most websites receive. I sure would not block them. Those pages also attract a lot of links, likes, etc. If I blocked them from the SERPs, those benefits would be gone.
My advice is to understand the website better, then make a decision based on data.
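To put numbers behind that decision, you can tally organic landing-page traffic by archive type from an analytics export. A minimal sketch, assuming a hypothetical CSV export with `landing_page` and `sessions` columns and the default WordPress /tag/ and /category/ permalink bases:

```python
# Sketch: sum organic landing-page sessions by site section from a CSV
# export (assumed columns: landing_page,sessions) to see how much traffic
# the tag and category archives actually bring in before deindexing them.
import csv
from collections import defaultdict

def traffic_by_section(csv_path):
    totals = defaultdict(int)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            page = row["landing_page"]
            if page.startswith("/tag/"):
                section = "tag archives"
            elif page.startswith("/category/"):
                section = "category archives"
            else:
                section = "other pages"
            totals[section] += int(row["sessions"])
    return dict(totals)
```

If the archive sections turn out to be a meaningful share of organic sessions, blocking them is walking away from real traffic; if they are near zero, deindexing costs little.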