Duplicate Content Product Descriptions - Technical List Supplier Gave Us
-
Hello,
Our supplier gives us a small paragraph and a list of technical features for our product descriptions.
My concern is duplicate content.
Here's what my current plan is:
1. Write as much unique content (rewriting the supplier's paragraph and expanding on it) as there are words in the technical feature list, so the page ends up half unique content and half duplicated content.
2. Reword the technical descriptions (though this is not always possible).
3. Write a custom H1, title tag, and meta description.
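For point 3, the idea is that everything around the supplier's list is unique while the list itself stays verbatim. A minimal sketch (the product, store name, copy, and spec values here are all hypothetical):

```html
<!-- Hypothetical product page: unique title, meta description, H1, and
     intro copy wrap the supplier's spec list, which is left unchanged. -->
<head>
  <title>Acme X200 Cordless Drill: Specs and Our Review | OurStore</title>
  <meta name="description"
        content="Hands-on notes on the Acme X200: battery life, torque settings, and who it suits best.">
</head>
<body>
  <h1>Acme X200 Cordless Drill: Our Take</h1>
  <p><!-- unique, rewritten and expanded version of the supplier paragraph --></p>
  <ul><!-- supplier's technical list, verbatim -->
    <li>Voltage: 18 V</li>
    <li>Max torque: 60 Nm</li>
  </ul>
</body>
```

The more of the page that lives outside that list, the less the shared block dominates it.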
My question is: will the list of technical specifications create a duplicate content issue? In other words, how much unique content does a page need so that a list that is identical across the internet does not hurt us?
Or do we need to rewrite every technical list?
Thanks.
-
Andy, I think this is spot on. I will focus on the value I can add. Thank you.
-
Hi Bob,
There are some things Google understands can't be changed. If a product has a technical spec, I wouldn't try to change it, because doing so would probably change the product as well.
With any ecommerce site, try to focus on the value-add: what can you bring to a page that offers something more than the stock information you are given?
-Andy
Related Questions
-
How to increase US traffic for a website based in another GEO
Can you help me understand why my US traffic is not increasing for landing pages/product pages, whereas our blog's traffic has grown 2x to 3x over the past two quarters? I haven't been able to find a good answer to this. Also, what should I do to rank in the US for a particular keyword? For the keyword "Hackathon" we rank in position 2 in India, but in the US we sit at position 56. I don't understand what to do, or the best way to rank in the US.
White Hat / Black Hat SEO | Rajnish_HE
-
Social engineering content detected
Hello, I have received the "Social engineering content detected" message in Webmaster Tools on around 20 of my sites. I have checked the servers and cleared all unnecessary folders, but the issue still hasn't been fixed. I also got the error "Remove the deceptive content", but there is no content on the websites that could harm my sites. Could you tell me the steps we need to take to resolve this? I have been facing it for 10 days and haven't been able to resolve it. Thanks in advance.
White Hat / Black Hat SEO | rohitiepl
-
Ecommerce sites we own have similar products, is this OK?
Hello, in one of our niches we have a big site with all products, plus a couple of smaller sites covering sub-niches of the same niche that cross over with the big site. The product descriptions are different and use different product names, so there aren't really any duplicate content issues. Each site will have its own blog with completely unique content (right now only the big site has a blog, but the smaller ones will eventually get their own). Is Google OK with this setup, now and in the future, and what can we do to make sure we stay OK? Thank you.
White Hat / Black Hat SEO | BobGW
-
Is Syndicated (Duplicate) Content considered Fresh Content?
Hi all, I've been asking quite a few questions lately and sincerely appreciate your feedback. My co-workers and I have been discussing content as an avenue outside of SEO. There are a lot of syndicated content programs/plugins out there (in a lot of cases duplicate content): would this be considered fresh content on an individual domain? An example may clearly show what I'm after: domain1.com is a lawyer in Seattle and domain2.com is a lawyer in New York. Both need content on their websites relating to being a lawyer for Google to understand what each domain is about. Fresh content is also a factor within Google's algorithm (source: http://moz.com/blog/google-fresh-factor), so fresh content is needed on each domain. But if that content is duplicated, does it still hold the same value? Question: is fresh content (adding new or updating existing content) still considered "fresh" even if it is duplicated across multiple domains? Purpose: domain1.com may benefit from a resource for its local clientele, as would domain2.com, and both sets of customers would be reading the "duplicate content" for the first time; therefore both lawyers would be seen as authorities and their websites could rank well. We aren't interested in ranking the individual article, and we are aware of canonical URLs. We aren't implementing this as a strategy, just trying to really understand content marketing outside of SEO. Conclusion: if duplicate content is still considered fresh content on an individual domain, then couldn't duplicate content (that obviously won't rank) still help SEO across a domain? This may sound controversial, and I'd like an open-ended discussion with linked sources and case studies. This conversation may tie into another Q&A I posted: http://moz.com/community/q/does-duplicate-content-actually-penalize-a-domain. TL;DR: is duplicate content (the same article across multiple domains) considered fresh content on an individual domain? Thanks so much, Cole
White Hat / Black Hat SEO | ColeLusby
-
Dynamic Content Boxes: how to use them without get Duplicate Content Penalty?
Hi everybody, I am starting a project for a travel website that has some standard category pages such as Last Minute, Offers, Destinations, Vacations, and Fly + Hotel. Every category contains many destinations, each with its own landing page: Last Minute New York, Last Minute Paris, Offers New York, Offers Paris, etc. My question: to simplify my work, I am thinking about writing dynamic content boxes for Last Minute, Offers, and the other categories, changing only the destination city (Rome, Paris, New York, etc.), repeated in different combinations inside the content box. This would greatly simplify the content writing for the main generic landing pages of each category, but I'm worried about being penalized for duplicate content. Do you think this solution could work? If not, what do you suggest? Is there a rule for classifying content as duplicate (for example, the number of identical words in a row)? Thanks in advance for your help! A.
White Hat / Black Hat SEO | OptimizedGroup
-
Have just submitted Disavow file to Google: Shall I wait until after they have removed bad links to start new content lead SEO campaign?
Hi guys, I am currently conducting some SEO work for a client. Their previous SEO company built a lot of low-quality/spam links to their site, and as a result their rankings and traffic have dropped dramatically. I have analysed their current link profile and submitted the spammiest domains to Google via the Disavow tool. My question: should I wait until Google has processed the disavow file before starting the new content-based SEO campaign, or would it be okay to start the campaign now, even though the spam links haven't been dealt with yet? I look forward to your replies.
White Hat / Black Hat SEO | sanj5050
-
Noindexing Thin Content Pages: Good or Bad?
If you have a massive number of pages with super-thin content (such as pagination pages) and you noindex them, then once they are removed from Google's index (and assuming these pages aren't viewable to users and don't get any traffic), is it smart to remove them completely (404), or is there a valid reason to keep them? If you noindex them, should you keep all the URLs in the sitemap so that Google will recrawl them and notice the noindex tag? And if you noindex them and then remove them from the sitemap, can Google still recrawl the pages and recognize the noindex tag on its own?
White Hat / Black Hat SEO | WebServiceConsulting.com
-
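For reference, the noindex being discussed is a robots meta tag; a minimal sketch on a hypothetical pagination URL (note that Google can only act on the tag if it can still crawl the page, which is why leaving the URLs crawlable until they drop out of the index is commonly suggested):

```html
<!-- Served on a thin pagination page you want dropped from the index,
     e.g. /products?page=37 (hypothetical URL). -->
<head>
  <meta name="robots" content="noindex, follow">
  <!-- "follow" lets crawlers keep following links on the page.
       The page must NOT be blocked in robots.txt, or the crawler
       will never fetch the HTML and see this tag. -->
</head>
```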
Secretly back-linking from a whitelabel product
Let's say a company (provider.com) offers a whitelabel solution that lets each client host all of the content on their own domain (product.client.com), with no branding from the content provider. Now let's say that client.com is a site with a lot of authority, and to promote the launch of product.client.com they put a lot of links from their main site to the subdomain. This link juice can be very valuable, and provider.com would like to take advantage of it. The problem is that client.com wouldn't like it if provider.com put links on the whitelabel site. Suppose the following: all pages on product.client.com start to include a rel="canonical" link to themselves with a GET variable (e.g. product.client.com/page.htm -> product.client.com/page.htm?show_extra_link=true), and when a page is visited with the extra "show_extra_link" parameter, a link to provider.com appears in the footer. My question is: would this have the same effect for provider.com as placing the link on the non-canonical version of the pages on the whitelabel site?
White Hat / Black Hat SEO | seoczar
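For what it's worth, the setup described would look something like this (URLs are hypothetical). Keep in mind that Google generally treats rel=canonical as a hint and consolidates indexing signals onto the canonical URL, so a link that only renders on the non-canonical variant may simply be folded into, or ignored along with, the duplicate:

```html
<!-- Served at product.client.com/page.htm?show_extra_link=true -->
<head>
  <!-- Canonical points at the clean URL, telling Google this
       parameterized variant is a duplicate of page.htm. -->
  <link rel="canonical" href="https://product.client.com/page.htm">
</head>
<body>
  <!-- Footer link that the template only renders when the
       show_extra_link query parameter is present. -->
  <footer><a href="https://provider.com">provider.com</a></footer>
</body>
```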