How would you handle 12,000 "tag" pages on a WordPress site?
-
We have a WordPress site where the /tag/ pages were never set to "noindex", and they now drive 25% of the site's traffic (roughly 100,000 visits year to date). We can't simply "noindex" them all now, or we'll lose a massive amount of traffic. We can't possibly write unique descriptions for all of them. And we can't just do nothing, or a Panda update will come by and ding us for duplicate content one day (I'm surprised it hasn't already). What would you do?
-
Yep, already implemented. Good point though.
-
Definitely. I start with the 30-day view, then go to YTD, then push the start date back to 1/1/2011. That's my three-step process every time I'm investigating a situation.
I've seen at least 20 of our sites decline in traffic in the past few months due to the April & June Panda updates. The dates of decline in Webmaster Tools (Traffic > Search Queries) line up perfectly with the various recent Panda updates.
Fixing /tag/ issues is one thing...but we have a monumental task of rewriting massive amounts of product descriptions next. We also have a fair amount of "no-indexing" or canonicalizing to do with our syndicated content. We'll be better for it in the end. I only wish I knew about these situations much sooner.
As I tell everyone, protect your unique content with all you've got...and keep duplicate content nowhere near your site. It's just too risky.
-
Additionally, make sure your posts have rel=canonical.
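For reference, a canonical tag is just a single link element in each post's head; SEO plugins such as Yoast output it automatically once enabled. A minimal sketch (the URL below is a placeholder, not from the site in question):

```html
<!-- In the <head> of each post; Yoast and similar plugins emit this
     automatically, so you rarely hand-code it. URL is a placeholder. -->
<link rel="canonical" href="https://www.example.com/sample-post/">
```

This tells Google which URL is the preferred version of the page, which helps when tag and category archives surface the same content.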
-
Are you looking at your analytics as far back as early 2011?
I've come across people who were hit on a known Panda update day but weren't aware of it...as strange as it may sound.
-
Thank you both...and we're thinking alike. I recently went through our 60+ WordPress sites addressing the issue of indexable /tag/ pages, and also ensuring they weren't in the sitemap via our sitemap plugin.
For the sites that had hundreds or thousands of /tag/ pages but very little traffic to them in Google Analytics (Search > Organic, with Landing Page as the primary dimension), I just went ahead and set them to "noindex".
For sites where the /tag/ pages were driving a fair amount of traffic (10% of site total or more), I had our editors write unique descriptions for the top 50-100 (like we do with category pages) and then we set the rest to "noindex,follow" via the meta robots tag.
For this one site...I just haven't found an easy solution that didn't leave an uneasy feeling in my stomach. It's tough to give up 25% of your traffic in hopes that Google will get it right and rank your real content higher in place of these /tag/ pages.
Uh oh...I just checked Analytics, and our organic traffic started creeping down around July 13th. When I look at just the /tag/ pages in the organic landing pages report, I see that their traffic dropped 50-60%. Something bad is happening. I am setting them to "noindex" immediately.
Definitely can't wait to read your post. I'll be writing my own on www.kernmedia.com in the near future as well.
-
Looking forward to that post, Dan.
-
Hi
I'm actually going to be addressing this exact question on a post for Moz in the coming weeks - so keep an eye out for that.
But in short, here's what I do:
Analytics
- run a landing pages report filtered to tag pages, over the last three months
- apply an advanced segment to see Google-only traffic
- dump the report into a CSV
Webmaster Tools
- view an impressions/clicks report by top pages (not keywords) - also zoom the date range out as far as you can
- filter for web only (not images)
- dump the report into a CSV
VLookup in Excel
Using a VLOOKUP in Excel, combine the two reports by matching rows on URL (you'll end up discarding some non-tag pages from WMT). The end result will be a master spreadsheet with the following columns:
- URL
- Impressions
- Clicks
- Avg. position
- Visits
- Pages/visit
- Avg. visit duration
- % new visits
- Bounce rate
(These are all the default report metrics. I actually prefer a custom landing page report in analytics, but this works fine.)
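The VLOOKUP step above can also be sketched in plain Python if you'd rather script it than work in Excel. This is a minimal, hypothetical sketch: the column names follow the default GA and WMT exports, and the inline CSV data is a made-up stand-in for the two dumps.

```python
# Join the Webmaster Tools export to the Analytics export on URL,
# keeping only /tag/ pages that appear in both reports - the same
# matching a VLOOKUP would do. Data below is hypothetical.
import csv
from io import StringIO

wmt_csv = """URL,Impressions,Clicks,Avg position
/tag/widgets/,1200,85,8.2
/tag/gadgets/,300,4,31.5
/about/,900,120,3.1
"""

ga_csv = """URL,Visits,Pages/Visit,Bounce Rate
/tag/widgets/,70,1.4,0.81
/tag/gadgets/,3,1.0,0.95
"""

def rows(text):
    """Parse a CSV string into a list of dicts keyed by header."""
    return list(csv.DictReader(StringIO(text)))

# Index the Analytics rows by URL, then walk the WMT rows and merge
# the metrics for each tag page found in both exports.
ga_by_url = {r["URL"]: r for r in rows(ga_csv)}
master = [
    {**wmt, **ga_by_url[wmt["URL"]]}
    for wmt in rows(wmt_csv)
    if wmt["URL"].startswith("/tag/") and wmt["URL"] in ga_by_url
]

for row in master:
    print(row["URL"], row["Clicks"], row["Visits"])
```

In practice you'd read the two CSV files exported from GA and WMT instead of inline strings, then write `master` back out as the spreadsheet to sort and filter.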
Analyze
Then you do your sorting, filtering, etc., to decide how valuable the tag traffic has been. In general, you're looking for an overwhelming case that those pages add value by being in the index. They might get visits, but what's the on-site behavior like? Maybe they get visits, but only from a small handful of tag pages?
In my upcoming post, I'll cover more about how to analyze this report.
As Klarke put so well, the actual posts should rank in their place, and those tend to get better results when people land on them.
Remove
If you decide to remove them, do so carefully. Do it on a weekend or just before a low-traffic period. If you use Yoast, simply select the option to noindex tag archives.
Also, remember to exclude tags from your XML sitemap.
Then monitor Webmaster Tools and watch for the pages' removal from the index.
--- I did this process on a site with 9,000 tag pages in the index, and the results were very good.
-Dan
-
I would "noindex,follow" them. Don't block them with robots.txt.
With that many pages, you're certainly running the risk of being hit by Panda. Those tag pages shouldn't be ranking; the individual posts should be in those positions instead. If I were you, I would take the chance and do the noindex, with the expectation that Google will appropriately rank the posts in their place.
I'd say those are better odds than losing 50-80% of traffic in a Panda update.
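The reason "noindex, not robots.txt" matters: Google has to be able to crawl a page to see its noindex directive. A minimal sketch of what should (and shouldn't) be in place, assuming a standard WordPress tag archive:

```html
<!-- In each /tag/ page's <head>. Yoast's "noindex tag archives"
     setting emits this tag automatically: -->
<meta name="robots" content="noindex,follow">
```

If you instead added `Disallow: /tag/` to robots.txt, Googlebot could never fetch those pages, would never see the meta tag above, and the already-indexed URLs could linger in the index indefinitely.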