Duplicate Content Dilemma for Category and Brand Pages
-
Hi,
I have an online shop with categories such as:
- Trousers
- Shirts
- Shoes
- etc.
But now I'm having a problem with further development.
I'd like to introduce brand pages. In this case I would create new categories for Brand 1, Brand 2, etc. The text on category and brand pages would be unique, but there will be an overlap in products. How do I deal with this from a duplicate content perspective?
I appreciate your suggestions.
Best, Robin
-
Wow. I did some research. I stand corrected. Thanks, Linda.
As far as your categories go, you could have:
www.domain.com/computers/notebooks/apple-notebooks/
and
www.domain.com/apple-products/
On your category pages, I'd suggest adding unique content at the bottom of the category pages. A paragraph above the fold would help for ranking purposes, but may detract from usability and conversions.
-
Thank you for the time you've invested in this answer. This gives me a good sense of what to do.
I like option 2 best, but I'm not sure whether I got it right. What do you think of the following scenario (taking the Apple example you provided):
- I have a category page with notebooks. The title, description and text on this category focus on notebooks in general. The products include Dell, HP and Apple.
=> This is basically the setup I have in my shop right now.
- Now I want to create a brand page for Apple. There the title, description and text are focused on Apple in general. The products include Apple notebooks, iPhones, iPads etc.
Now here's the point: the title, description and text for the notebook category and the Apple brand page will be different (unique content). But products are part of the content too, aren't they? And since there will be an overlap in products, this would result in duplicate content for the featured products.
But I want both pages to rank. One for 'notebook' and the other one for 'apple'.
Is that possible, or are partially overlapping products in those 2 categories a dealbreaker for my SEO?
-
According to Moz: "Another option for dealing with duplicate content is to utilize the rel=canonical tag. The rel=canonical tag passes the same amount of link juice (ranking power) as a 301 redirect, and often takes much less development time to implement." http://moz.com/learn/seo/canonicalization
Why do you think it does not pass ranking power?
-
This is a difficult question. I would agree with patrick_g that canonicals are one way to handle duplicate content, but canonicals don't pass link juice to the parent unless it's through a link.
The canonical tag only tells Google which page to index. It does not transfer link juice the way a 301 redirect does. Read up on this.
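For reference, the canonical tag is just a link element in the page's head; a minimal sketch, using a hypothetical URL:

```html
<!-- In the <head> of the duplicate page (e.g. the brand page),
     pointing search engines at the preferred URL -->
<link rel="canonical" href="https://www.domain.com/computers/notebooks/apple-notebooks/" />
```

A 301, by contrast, is a server-side redirect: visitors and bots are actually sent to the other URL, rather than the duplicate page staying live with a hint attached.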
Here are some good choices:
1. If the brand pages exist only for user experience purposes, you could make them noindex, follow. This would eliminate the duplicate content issue, and the brand pages could serve as a link juice hub: they would be kept out of Google's index, but would still pass link juice.
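A minimal sketch of what option 1 looks like in the brand page's head (noindex keeps the page out of the index, while follow lets crawlers still follow its links):

```html
<meta name="robots" content="noindex, follow" />
```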
2. Create unique content for the brand pages, and give them a title tag and content that differs from the competing page. For example if you already have an "Apple" page, make the new page "Certified Apple Products" (or some other KW).
3. This one requires some programming skill, and is a little controversial. Put all the new pages in a parent folder, e.g. "/hide/" (don't actually use the word "hide"), and disallow that parent folder in robots.txt. Any links on your site to these pages would pass link juice to them, where it would be lost, and that could be a significant link juice drain. Here's the controversial part - put the links to those pages in an iframe, and disallow the iframe folder in robots.txt. This would prevent bots from crawling those links and passing link juice.
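A minimal robots.txt sketch for option 3, using the placeholder folder name from above:

```txt
User-agent: *
Disallow: /hide/
```

Keep in mind robots.txt blocks crawling, not indexing as such - a disallowed URL that picks up external links can still show up in results, which is one reason this approach is controversial.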
-
Hello soralsokal,
I don't have a bunch of products like you, but generally I prefer to use the rel=canonical tag to push the link juice to the one category page I'm trying to get ranked well. So I still build out the various pages I want, but don't expect the duplicate content to be ranked.
I suppose you could also block them from search engines in your robots.txt file. I've done that with a blog that packed the blog posts into all sorts of different categories, thus creating duplicate content.