Will duplicate product information paragraphs negatively impact our site?
-
We are selling paint and have separate pages for different colour cans, each with their own unique description.
We would like to include a few additional paragraphs of product information below each description, but this content will be identical across all of the products. Do you think this will be a problem as duplicate content?
-
I wouldn't say there's a massive chance of a penalty here. That said, it's an area where you could be 'adding value' and uniqueness to your pages and you're not doing it, so your pages may be 'less competitive' and you may be missing out on an opportunity. It's more of a competitive missed opportunity than an 'error' per se.
In reality you should have one product page for each product and then 'product variants' for things like quantity, size, colour, etc. On the modern web people find this easier to navigate, and since many sites do offer it, they might seem like more competitive places to shop for paint than your site. Price does matter, but it's not the sole arbiter of how products rank in Google's results; other things matter too. Unless you have a virtual monopoly on the product (only you can sell it, or only you can sell it at a great discount due to a special relationship with the supplier), I would look hard at the UX and design of your site. No one wants an 'arse-ache' of a browsing experience.
Many tools will flag what you're about to do as duplicate content, and they're technically right. But instead of going on some crazy copywriting crusade, think about the architecture of your site. You can still have separate URLs for different product variations if you want, even via URL parameters (though that's a fairly 'basic' implementation). If you make it clear to Google, through new, more streamlined architecture, that they're all actually the same product, the duplicate description(s) won't matter 'as much' (though they'll still be a missed opportunity for more diverse rankings, IMO).
You can make it even more apparent to Google that all the different variations are actually the 'same product' by utilising Product schema, and some of the deeper stuff like ProductModel, which binds it all together. Whatever you implement, test it with Google's structured-data testing tool; if it throws errors and warnings, keep working away until they're all fixed.
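To make that concrete, here's a minimal sketch of what Product/ProductModel mark-up could look like for a paint product with colour variants. All names, SKUs and URLs below are made up for illustration; swap in your own:

```html
<!-- Hypothetical example: one parent Product whose colour variants
     are expressed as ProductModel entries under the "model" property -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Emulsion Paint",
  "description": "The shared product-information paragraphs go here once.",
  "brand": { "@type": "Brand", "name": "Example Paints" },
  "model": [
    {
      "@type": "ProductModel",
      "name": "Example Emulsion Paint - Sky Blue",
      "sku": "PAINT-BLU-01",
      "url": "https://www.example.com/paint/emulsion?colour=sky-blue"
    },
    {
      "@type": "ProductModel",
      "name": "Example Emulsion Paint - Forest Green",
      "sku": "PAINT-GRN-01",
      "url": "https://www.example.com/paint/emulsion?colour=forest-green"
    }
  ]
}
</script>
```

The exact shape will depend on how your variant URLs are set up, so treat this as a starting point and validate it before going live.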
Canonical tags are another option, but they would shrink your ranking 'footprint', so in this case I wouldn't recommend them, despite the 'slight' content-duplication risks (which in reality are mostly negligible).
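For completeness, if you did go the canonical route, each colour-variant URL would carry a tag like this in its `<head>` (URLs are placeholders), pointing at whichever URL you pick as the 'main' product page:

```html
<!-- On https://www.example.com/paint/emulsion?colour=sky-blue -->
<link rel="canonical" href="https://www.example.com/paint/emulsion" />
```

This tells Google to consolidate signals onto the main URL, which is exactly why the variant pages then stop ranking in their own right.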
Final note: you say you have 'unique' descriptions, but remember that if they're used elsewhere online, they're not unique. If they're unique internally, that's great; but if you got them all from a supplier then loads of other sites are probably using them too, which could easily be a big issue for you.
-
Hi Justin,
Great question. To help answer it, I'll quote Google's support document on duplicate content:
https://support.google.com/webmasters/answer/66359?hl=en
"Examples of non-malicious duplicate content could include:
- Discussion forums that can generate both regular and stripped-down pages targeted at mobile devices
- Store items shown or linked via multiple distinct URLs
- Printer-only versions of web pages"
I think your situation would likely fall under the same "acceptable" category as the store items example I highlighted. Keep in mind that although duplicate content should be avoided where possible, Google does NOT actually penalize sites for having it.
I would still try to keep the overall amount of duplicate content to a minimum, but it shouldn't be too big of an issue. Keep utilizing the unique descriptions and, in this case, you likely won't have to worry too much about the duplicate content.
I hope that helps!
Best,
Alex Ratynski
-
Hi Joe,
Thanks for your help. It would probably be about 50% at the moment, but we could look to make this more like 80% unique content if you think that will help.
-
Hello,
How much of the copy is unique per page?
WRT content originality, the general rule I've worked to is 80% unique content per page.