Glossary index and individual pages create duplicate content. How much might this hurt me?
-
I've got a glossary on my site with an index page for each letter of the alphabet that has at least one definition. So the M page, for example, lists every M definition in full.
But each definition also has its own individual page (and we link to those pages internally so the user doesn't have to hunt down the entire M page).
So I definitely have duplicate content ... 112 instances (112 terms). Maybe it's not so bad because each definition is just a short paragraph(?)
How much does this hurt my potential ranking for each definition? How much does it hurt my site overall?
Am I better off making the individual pages noindex, or canonicalizing them?
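(For reference, here's roughly what the two options I'm weighing would look like in the head of an individual definition page; the canonical URL below is just a placeholder:)

```html
<!-- Option 1: noindex the definition page, but still let crawlers follow its links -->
<meta name="robots" content="noindex, follow">

<!-- Option 2: canonicalize the definition page to its letter-index page (placeholder URL) -->
<link rel="canonical" href="https://www.example.com/glossary/m/">
```

(It would be one or the other, not both.)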
-
Thanks, Ryan!
-
From here: http://moz.com/messages/write to Dirk's username: DC1611. There used to be a button in profiles, but it looks like it got shuffled in the redesign.
-
PM? Does Moz offer that function?
-
It's a bit difficult to assess which of the pages is more important without knowing the site. Having a lot of content is good, but if the only thing linking the content together is that it all starts with the same letter, the index page could be pretty weak or pretty strong depending on the situation.
I'll give two examples:
Suppose the index is of first names starting with S. In this case the page is a valuable one, because a lot of people are searching for exactly that, and the search volume is potentially bigger than the number of people looking for the first name Steve (= one specific item).
Suppose the index is of illnesses starting with S. In this case the index page has very little value for a searcher, because people search for illnesses based on their symptoms; the fact that the illnesses all start with S doesn't link them together.
It could be helpful if you send me the actual URLs via PM if you don't want to disclose them here.
rgds
Dirk
-
Oops. Sorry. Poor wording there. Meant to say ...
Definitely not concerned that the M index page and the M definition page BOTH show up in the search results.
We definitely do want at least one of the pages not only to show up in the results, but to rank highly. I'm guessing the M index page would actually have a chance of ranking high, because it will have so many long-tail terms related to our short-tail term.
But it would seem weird to put a noindex on the M definition pages, since we have multiple internal links to those pages.
Thanks again for your patience. Really appreciate the feedback.
Steve
-
That's exactly what I am saying: from Google's perspective, your index page with all the definitions is completely different from the detailed definition page (the first being much richer in content than the second). If getting these pages ranked is the least of your concerns, you can keep it as it is. If you want to play it safe, you can put a noindex on the index page.
rgds,
Dirk
-
Just having a bit of a dilemma. We're trying to make it easier for people who come to the glossary and then go to, say, the M page: they don't have to keep clicking away to see the definitions. Result: more user-friendly.
But we also want a very specific definition page, so that when we link from an article to a definition, the user doesn't have to see all of the M definitions. Result: more user-friendly.
Definitely not concerned that both the M index page and the M definition page show up in the search results. That would actually be swell. I'm just more concerned that our overall site ranking or domain authority will somehow suffer.
If you're saying that the M index page and the M definition page are dramatically different (because the M index page is much, much longer) and so I shouldn't worry, that's great. (Hope that's what you're saying.)
Thanks!
-
Hi,
As far as I understand it, this is not really a question of duplicate content in the SEO sense. Although all the definitions starting with M appear on the M index page, that page is quite different from the pages that contain the individual definitions of the terms starting with M.
A problem on many sites is that the pages that contain only the explanation of one term are very light in terms of content, and that the page listing all these terms is generally not very interesting from a user (and search) perspective. I don't know your site, so it's difficult to assess whether this is the case.
You could make the index page noindex/follow and just list the terms, linking to the explanation pages. For the explanation pages, which are probably the most interesting for users and search engines, try to enrich them by adding more content, such as links to articles on your site that use the term, or more information about the term.
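As a sketch, the noindex/follow setup on the index page would just be a standard robots meta tag in its head:

```html
<!-- On the letter-index page: keep it out of the search index, but let its
     links to the individual explanation pages still be crawled -->
<meta name="robots" content="noindex, follow">
```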
Hope this helps,
Dirk