How would you handle 12,000 "tag" pages on Wordpress site?
-
We have a Wordpress site where /tag/ pages were not set to "noindex" and they are driving 25% of the site's traffic (roughly 100,000 visits year to date). We can't simply "noindex" them all now, or we'll lose a massive amount of traffic. We can't possibly write unique descriptions for all of them. And we can't just do nothing, or a Panda update will come by one day and ding us for duplicate content (surprised it hasn't already). What would you do?
-
Yep, already implemented. Good point though.
-
Definitely. I start with the 30-day view, then go to YTD, then push the start date back to 1/1/2011. That's my three-step process every time I'm investigating a situation.
I've seen at least 20 of our sites decline in traffic in the past few months due to the April & June Panda updates. The dates of decline in Webmaster Tools (Traffic > Search Queries) line up perfectly with the various recent Panda updates.
Fixing /tag/ issues is one thing...but we have a monumental task of rewriting massive amounts of product descriptions next. We also have a fair amount of "no-indexing" or canonicalizing to do with our syndicated content. We'll be better for it in the end. I only wish I knew about these situations much sooner.
As I tell everyone, protect your unique content with all you've got...and keep duplicate content nowhere near your site. It's just too risky.
-
Additionally, make sure your posts have rel=canonical.
-
Are you looking at your analytics as far back as early 2011?
I've come across people who were hit on a known Panda update day but weren't aware of it...as strange as it may sound.
-
Thank you both...and we're thinking alike. I recently went through our 60+ Wordpress sites addressing the issue of /tag/ pages that weren't set to "noindex", and also ensuring they weren't in the sitemap via our sitemap plugin.
For the sites that had hundreds or thousands of /tag/ pages but very little traffic in Google Analytics (Search > Organic, with Landing Page as the primary dimension), I just went ahead and set them all to "noindex".
For sites where the /tag/ pages were driving a fair amount of traffic (10% of site total or more), I had our editors write unique descriptions for the top 50-100 (like we do with category pages) and then we set the rest to "noindex,follow" via the meta robots tag.
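That 10% rule of thumb is easy to script against an organic landing-page export. A minimal sketch in Python (the URLs, visit counts, and the 10% threshold below are made-up examples, not data from the thread):

```python
def tag_traffic_share(landing_pages):
    """Given organic visits keyed by landing-page URL, return the
    fraction of total visits that landed on /tag/ pages."""
    total = sum(landing_pages.values())
    if total == 0:
        return 0.0
    tag_visits = sum(v for url, v in landing_pages.items() if "/tag/" in url)
    return tag_visits / total

# Hypothetical rows from an organic landing-page report:
report = {
    "/tag/widgets/": 1200,
    "/tag/gadgets/": 300,
    "/my-best-post/": 4500,
    "/category/news/": 1000,
}

share = tag_traffic_share(report)
print(f"Tag pages drive {share:.0%} of organic traffic")
if share >= 0.10:
    print("Worth writing unique descriptions for the top tags")
else:
    print("Safe to noindex them all")
```

In practice you'd build the `report` dict from the CSV you export out of Analytics rather than typing it in.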
For this one site...I just haven't found an easy solution that didn't leave an uneasy feeling in my stomach. It's tough to give up 25% of your traffic in hopes that Google will get it right and rank your real content higher in place of these /tag/ pages.
Uh oh...I just checked Analytics, and our organic traffic started creeping down around July 13th. When I look at just the /tag/ pages in the organic landing pages section, I see that they dropped in traffic by 50-60%. Something bad is happening. I am setting them to "noindex" immediately.
Definitely can't wait to read your post. I'll be writing my own on www.kernmedia.com in the near future as well.
-
Looking forward to that post, Dan.
-
Hi
I'm actually going to be addressing this exact question on a post for Moz in the coming weeks - so keep an eye out for that.
But in short, here's what I do:
Analytics
- run a landing-page report filtered to tag pages, covering the last three months
- apply an advanced segment to see Google-only traffic
- dump the report into a CSV
Webmaster Tools
- view an impressions/clicks report by top pages (not keywords) - also zoom the date range out as far as you can
- filter for web only (not images)
- dump the report into a CSV
VLookup in Excel
Using a VLOOKUP in Excel, combine the two reports by matching rows on URL (you'll end up discarding some non-tag pages from WMT). The end result will be a master spreadsheet with the following columns:
- URL
- Impressions
- Clicks
- Avg. position
- Visits
- Pages/visit
- Avg. visit duration
- % new visits
- Bounce rate
(These are all the default report metrics. I actually prefer a custom landing page report in analytics, but this works fine.)
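If you'd rather script the VLOOKUP step, here's a minimal sketch of the same URL join in Python (the column names and sample rows are hypothetical; adjust them to match your actual exports, which you'd load with `csv.DictReader`):

```python
def merge_reports(wmt_rows, ga_rows):
    """Join Webmaster Tools rows to Analytics rows on URL -- the
    VLOOKUP step. WMT rows with no Analytics match are dropped,
    which discards the non-tag pages you don't care about here."""
    ga_by_url = {row["URL"]: row for row in ga_rows}
    merged = []
    for row in wmt_rows:
        ga_row = ga_by_url.get(row["URL"])
        if ga_row is None:
            continue
        combined = dict(row)     # Impressions, Clicks, Avg. position...
        combined.update(ga_row)  # ...plus Visits, Bounce rate, etc.
        merged.append(combined)
    return merged

# Hypothetical export rows:
wmt = [
    {"URL": "/tag/seo/", "Impressions": 5400, "Clicks": 320, "Avg position": 7.8},
    {"URL": "/contact/", "Impressions": 900, "Clicks": 40, "Avg position": 2.1},
]
ga = [
    {"URL": "/tag/seo/", "Visits": 310, "Bounce rate": 0.81},
]

master = merge_reports(wmt, ga)
print(master)  # one row, carrying both the WMT and GA columns
```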
Analyze
Then you do your sorting, filtering, etc. to decide how valuable the tag traffic has been. In general, you're looking for an overwhelming reason that those pages add value. They might get visits, but what's the on-site behavior like? Maybe they get visits, but only from a small handful of tag pages?
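That sorting and filtering might look like this on the merged report. The thresholds (minimum visits, maximum bounce rate) and sample rows are arbitrary examples for illustration, not recommendations:

```python
def valuable_tag_pages(master, min_visits=50, max_bounce=0.75):
    """Keep tag pages whose on-site behavior suggests real value:
    enough visits and a tolerable bounce rate. Everything else is
    a candidate for noindex."""
    keep = [
        row for row in master
        if "/tag/" in row["URL"]
        and row["Visits"] >= min_visits
        and row["Bounce rate"] <= max_bounce
    ]
    # Most-visited first, so editors can work top-down.
    keep.sort(key=lambda row: row["Visits"], reverse=True)
    return keep

master = [
    {"URL": "/tag/seo/", "Visits": 310, "Bounce rate": 0.62},
    {"URL": "/tag/misc/", "Visits": 12, "Bounce rate": 0.91},
    {"URL": "/tag/wordpress/", "Visits": 480, "Bounce rate": 0.70},
]
for row in valuable_tag_pages(master):
    print(row["URL"], row["Visits"])
```

Anything that doesn't make the cut goes on the noindex list.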
In the post, I'll cover more about how to analyze this report.
As Klarke put so well, the actual posts should rank in their place. Those tend to perform better when people land on them.
Remove
If you decide to remove, do so carefully. Do it on a weekend or during a low-traffic period. If you use Yoast, simply select the option to noindex tag archives.
Also, remember to exclude tags from your XML sitemap.
Then watch Webmaster Tools and monitor the pages' removal from the index.
--- I followed this process on a site with 9,000 tag pages in the index, and the results were very good.
-Dan
-
I would "noindex,follow" them. Don't block them with robots.txt.
With that many pages, you're certainly running the risk of being hit by Panda. Those tag pages shouldn't be ranking; instead, the individual posts should be in those positions. If I were you, I would take the chance and do the noindex, with the expectation that Google will appropriately rank the posts in their place.
I'd say those are better odds than losing 50-80% of traffic in a Panda update.