Should we deindex duplicate pages?
-
I work on an education website. We offer programs that run up to 6 times per year. At the moment we have a separate webpage for each instance of a program, which is causing duplicate content issues. We're reworking the pages so that the majority of the content will live on one page, but we'll still have to keep the application details as separate pages. 90% of the time, the application details are going to be nearly identical, so I'm worried that these pages will still be seen as duplicate content.
My question is: should we deindex these pages? We don't particularly want people landing on an application page without seeing the other details of the program anyway. But is there a problem with deindexing such a large chunk of our site that I'm not thinking of?
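(For clarity, by "deindex" I mean adding a robots meta tag like the following to the head of each application page — assuming that's the right mechanism here:)

```html
<!-- Keeps the page out of the index but still lets crawlers follow its links -->
<meta name="robots" content="noindex, follow" />
```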
Thanks, everyone!
-
I am not sure there is an easy answer.
I would start by calculating what you stand to lose from de-indexation.
I would run the URLs in question through Open Site Explorer (and verify through Webmaster Tools) to see how many external sites point to them.
If you do not have many incoming links, then de-indexation may not be so bad compared to the man-hours of work needed to maintain these pages now and in the future.
It almost becomes a profit-and-loss (P&L) scenario.
Also, if you are already seeing duplicate content penalties, getting rid of those pages may help in the grand scheme of things; post-Panda, they might be dragging you down.
But really this is all speculation, as I am not fully "up to speed" with your exact situation, and in my opinion the answer would depend on a lot of factors, not just SEO.
Hope this helps!
Shane
-
I definitely looked into using the canonical tag, but the question is: which version of the application page would they all point to? The application pages are going to be constantly rotated out, so there isn't one version that will always be live, and asking our web editors to change the canonical tag once or twice a year for 300+ programs wouldn't go over well.
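(One possibility, if your URL scheme allows it: point every rotating instance at a single stable, dateless application URL rather than at a specific instance, so the canonical never needs manual updating. The URLs below are hypothetical:)

```html
<!-- On every rotating instance, e.g. /programs/nursing/apply-spring-2012/, -->
<!-- the canonical points to one evergreen application URL that never expires: -->
<link rel="canonical" href="http://www.example.edu/programs/nursing/apply/" />
```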
-
In my personal opinion I would not de-index them; use a canonical tag instead:
http://googlewebmastercentral.blogspot.com/2009/02/specify-your-canonical.html
Here is a good take on it from a past Q&A post:
http://www.seomoz.org/q/best-way-to-handle-different-views-of-the-same-page
If you are not aware what a canonical is:
It is a tag you place in the head of your page that tells search engines that the page is a duplicate and points them to another page, the one you want them to index as the "master" version.
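(For illustration, the tag looks like this; the URL is a placeholder for whatever page you choose as the master:)

```html
<head>
  <!-- Tells search engines to index the "master" page instead of this duplicate -->
  <link rel="canonical" href="http://www.example.com/master-page/" />
</head>
```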
Hope this helps
w00t!