Manage category pages and duplicate content issues
-
Hi everybody,
I am currently auditing the website www.disfracessimon.com. It has some issues with canonicals and other things, but right now I have found something I would like your opinion on.
While checking parts of the content in Google to look for duplicate content issues, I searched for "Chaleco de streck decorado con botones" and found:
First result: "Hombre trovador", the product page I was checking -> correct.
The following results were category pages the product is listed in.
I was wondering whether this could cause any duplicate content problems. Should I noindex the category pages, or should I keep them indexed? The first result in Google was the product page, and I think category pages are good for link juice transfer and for capturing some searches from Google.
Any advice?
Thank you
-
Thank you Gianluca for your time and answer!
-
Didn't you notice that the paginated categories do not have rel="prev"/"next" implemented? That could explain why page 2 and similar pages are indexed.
In the specific case of a site like yours, I would recommend implementing rel="prev"/"next" (http://googlewebmastercentral.blogspot.com.es/2011/09/pagination-with-relnext-and-relprev.html), because the only indexable paginated lists are the main categories (e.g. Adultos, Infantiles, Halloween...), while the subcategories and facets cannot be indexed because of the hashbang ("#") in their URLs.
In general, though, if a site presents hundreds if not thousands of URLs as a consequence of pagination (category > subcategory > sub-subcategory / indexable facets), then I would first look at the server logs to see whether the bots are spending a big part of their time crawling those paginated URLs rather than the product pages.
If that is the case, I usually suggest a mixed strategy:
- Let the main categories be indexed, implementing the rel="prev"/"next" tags;
- For all the other paginated lists, let only the first page (the organic landing page) be indexed, not using rel="prev"/"next" and applying a "noindex,nofollow" meta robots tag from page 2 onwards.
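As a rough sketch (the URLs below are hypothetical, not taken from the site), the mixed strategy would look like this in the <head> of the paginated pages:

```html
<!-- Page 2 of an indexable main category: stays indexable,
     with the pagination sequence declared -->
<link rel="prev" href="http://www.example.com/adultos/" />
<link rel="next" href="http://www.example.com/adultos/page/3/" />

<!-- Page 2+ of any other paginated list: kept out of the index -->
<meta name="robots" content="noindex,nofollow" />
```

The two snippets are alternatives: a given paginated page gets one or the other depending on whether its list should rank organically.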
Related Questions
-
Duplicating relevant category content in subcategories. Good or bad for Google ranking?
In a travel-related page I have city categories with city-related information.
Intermediate & Advanced SEO | | lcourse
Would you recommend for or against duplicating some relevant city-related content in subcategory pages? For visitors it would be useful, and Google would have more context about the topic of our page.
But my main concern is how this may be perceived by Google, and especially whether it may make it more likely that we get penalized for thin content. We were already hit at the end of June by Panda/Phantom, and we are working on adding more unique content, but this is something we could do additionally and almost instantaneously. I just don't want to make things worse.
Duplicate content on subdomains
Hi All, The structure of the main website goes by http://abc.com/state/city/publication - We have a partnership with public libraries to give local users access to the publication content for free. We have over 100 subdomains (one for each specific library) that have duplicate content issues with the root domain; most subdomains have very high page authority (the main public library and other local .gov websites link to these subdomains). Currently these subdomains are not indexed because the robots.txt file excludes bots from crawling. I am in the process of setting canonical tags on each subdomain and opening up the robots.txt file. Should I set the canonical tag on each subdomain (homepage) to the root domain version or to the specific city within the root domain? Example 1:
Intermediate & Advanced SEO | | NewspaperArchive
Option 1: http://covina.abc.com/ = Canonical Tag = http://abc.com/us/california/covina/
Option 2: http://covina.abc.com/ = Canonical Tag = http://abc.com/
Example 2:
Option 1: http://galveston.abc.com/ = Canonical Tag = http://abc.com/us/texas/galveston/
Option 2: http://galveston.abc.com/ = Canonical Tag = http://abc.com/
Example 3:
Option 1: http://hutchnews.abc.com/ = Canonical Tag = http://abc.com/us/kansas/hutchinson/
Option 2: http://hutchnews.abc.com/ = Canonical Tag = http://abc.com/
I believe it makes more sense to set the canonical tag to the corresponding city (option 1), but I am wondering whether setting the canonical tag to the root domain would pass "some link juice" to the root domain and be more beneficial. Thanks!
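For what it's worth, option 1 boils down to a single tag in the <head> of each subdomain's homepage, pointing at the matching city page on the root domain (a sketch only, using the example hostnames from the question):

```html
<!-- On http://covina.abc.com/ -->
<link rel="canonical" href="http://abc.com/us/california/covina/" />
```

Each subdomain would get its own tag with its own city URL; the root-domain option would instead point every subdomain at http://abc.com/.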
HELP! How does one prevent regional pages from being counted as "duplicate content," "duplicate meta descriptions," et cetera?
The organization I am working with has multiple versions of its website geared towards different regions:
US - http://www.orionhealth.com/
CA - http://www.orionhealth.com/ca/
DE - http://www.orionhealth.com/de/
UK - http://www.orionhealth.com/uk/
AU - http://www.orionhealth.com/au/
NZ - http://www.orionhealth.com/nz/
Some of these sites have very similar pages, which are registering as duplicate content, meta descriptions and titles. Two examples are:
http://www.orionhealth.com/terms-and-conditions
http://www.orionhealth.com/uk/terms-and-conditions
Now even though the content is the same, the navigation is different, since each region has different product options/services, so a redirect won't work: the navigation on the main US site is different from the navigation for the UK site. A rel=canonical seems like a viable option, but (correct me if I'm wrong) it tells search engines to index only the main page, in this case the US version, and I still want the UK site to appear to search engines. So what is the proper way of treating similar pages across different regional directories? Any insight would be GREATLY appreciated! Thank you!
Intermediate & Advanced SEO | | Scratch_MM
SEO structure question: Better to add similar (but distinct) content to multiple unique pages or make one unique page?
Not sure which approach would be more SEO-ranking friendly. As we are a music store, we do instrument repairs on all instruments. Currently I don't have much content about our repairs on our website, so I'm considering a couple of different approaches to adding this content. Let's take trumpet repair as an example:
1. I can auto-write to the HTML body (say, at the end of the body) of each of the 20 trumpet pages we have for sale on our site the verbiage of all repairs, services, rates, and other repair-related detail. In my mind, the effect of this may be that the added information uniquely pertains to trumpets only (it excludes all other instrument repair info), which Google likes... but it would duplicate the trumpet repair information over 20 pages, which Google may not like?
2. Or I could auto-write the repair details to the trumpet category page - either in the body, header, or footer. This definitely reduces the redundancy of repeating the trumpet repair info on each trumpet page, but it also reduces each trumpet page's content depth... so I'm not sure which outweighs the other.
3. Write it to both the category page and the individual pages? Possibly valuable because the information anchors around and supports itself... or is that super-duplication?
4. Or, of course, create a category dedicated to repairs, add a subcategory for each instrument, and have the repair info there be completely unique to that page - then, in the body of each of the 20 trumpet pages, add an internal link to trumpet repair.
Any suggestions greatly appreciated. Thanks, Kevin
Intermediate & Advanced SEO | | Kevin_McLeish
Duplicate content reported on WMT for 301 redirected content
We had to 301 redirect a large number of URLs. Now Google WMT is telling me that we have tons of duplicate page titles. When I looked into the specific URLs, I realized that Google is listing an old URL and the 301-redirected new URL as the sources of the duplicate content. I confirmed the 301 redirect by using a server header tool to check the correct implementation of the redirect from the old to the new URL. Question: Why is Google Webmaster Tools reporting duplicate content for these pages?
Intermediate & Advanced SEO | | SEOAccount32
How can we improve rankings for category pages
Hi Everyone, I have a dog breeder site I'm working on, and I was wondering if I could get some tips and ideas on how to help the "category" pages rank better in search engines. Let's say I have an "xyz" breed category page, which lists all dog breeders who offer that particular breed, in this case "xyz". Certain breeder profile listings rank higher for the terms that the category page should be ranking for, so I'm guessing Google thinks those breeder profile pages are more relevant for those terms, especially if well optimized. I know thin content may be my problem here, but one of our competitors dominates the rankings for relevant keywords with no content on their category pages. What do you all suggest?
Intermediate & Advanced SEO | | rsanchez
Linking to local pages on main page - keyword self-cannibalization issue?
Hi guys, Our website has this landing page: www.example.com/service1/
Is this considered keyword self-cannibalization if on the above page we link to local pages such as:
www.example.com/service1-in-chicago/
www.example.com/service1-in-newyork/
www.example.com/service1-in-texas/
Many thanks, David
Intermediate & Advanced SEO | | sssrpm
Duplicate URL home page
I just got a duplicate URL error in my SEOmoz report, and I wonder if I should worry about it. Assume my site is named www.widgets.com. I'm getting duplicate URLs from http://www.widgets.com & http://www.widgets.com/. Do the search engines really see these as different on the home page? The general drift on the web is that your site should look like: home page = http://www.widgets.com, and subpages http://www.widgets.com/widget1/. Of course, it seems as though the IIS7 slash tool will rewrite everything, including the home page, to a slash.
Intermediate & Advanced SEO | | ThomasErb
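For reference, the trailing-slash rewrite the question mentions is usually configured in IIS7's URL Rewrite module along these lines (a sketch only; the rule name is arbitrary and the conditions may need adjusting for a specific site - this goes inside <system.webServer> in web.config):

```xml
<rewrite>
  <rules>
    <!-- 301-redirect URLs that lack a trailing slash to the slashed version -->
    <rule name="Add trailing slash" stopProcessing="true">
      <match url="(.*[^/])$" />
      <conditions>
        <!-- Skip requests for real files such as /style.css -->
        <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
      </conditions>
      <action type="Redirect" redirectType="Permanent" url="{R:1}/" />
    </rule>
  </rules>
</rewrite>
```

Because the rule issues a permanent (301) redirect, search engines should consolidate the two variants onto the trailing-slash URL rather than treating them as duplicates.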