Optimizing unique pages with almost identical products?
-
Hi all,
I work in a space where we have many products (specifically components and transistors) that are almost identical minus one or two changes in voltage, die size, etc.
I'm curious if others in the tech / manufacturing space have had to tackle this issue? There are only so many ways to describe 5-6 products that are identical save one feature.
My gut tells me to offer up more head terms in the title tags to draw searchers and get specific in the meta descriptions, but I'm open to ideas.
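One way to act on that instinct is to generate each title tag and meta description from the product's structured attributes, leading with whatever spec actually differs, so near-identical parts still produce distinct snippets. A rough Python sketch; the part numbers, specs, and brand name below are made up for illustration:

```python
# Sketch: build distinct title tags and meta descriptions for
# near-identical products by leading with the differentiating specs.
# All product data here is hypothetical.

products = [
    {"part": "TX-100A", "type": "NPN transistor", "voltage": "40V", "die": "2mm"},
    {"part": "TX-100B", "type": "NPN transistor", "voltage": "60V", "die": "2mm"},
    {"part": "TX-100C", "type": "NPN transistor", "voltage": "60V", "die": "3mm"},
]

def title_tag(p):
    # Put the differentiating specs up front so every title is unique.
    return f'{p["part"]} {p["voltage"]} {p["die"]}-die {p["type"]} | Example Components'

def meta_description(p):
    # Get specific in the description: name the spec that sets this part apart.
    return (f'The {p["part"]} is a {p["voltage"]} {p["type"]} with a '
            f'{p["die"]} die, for designs where {p["voltage"]} headroom matters.')

titles = [title_tag(p) for p in products]
# Every page gets a unique title even though the products barely differ.
assert len(set(titles)) == len(titles)

for p in products:
    print(title_tag(p))
```

The same attribute dictionary can feed the on-page spec table, so titles, descriptions, and body copy stay consistent without hand-writing six near-identical variants.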
-
Hi Jamie,
Thanks for the link. We're not on an ecommerce platform. Our product descriptions are written by our own team and we've made a lot of effort to make them unique, since even a small change in our products makes a big difference in the use case. The on-page content is less of an issue for us, I'm more worried about splitting our search influence by trying to target the same keywords across 5-6 pages. It helps to read that product descriptions are key though!
Thanks again for your help.
-
With all things Google, it comes down to one factor: value. Duplicate content of the product-description and printer-friendly-page variety is valuable. If you sell six products, all six need their own pages, and forcing you to remove five of them would be unreasonable. These pages aren't scraped or copied for SEO purposes; they're legitimate parts of your business.
The sites that actually receive duplicate content penalties are those that scrape content from other sites and pass it off as their own, using duplication to game search results. Google actively penalizes sites attempting to circumvent its rules.
More here: http://www.seoblog.com/2014/05/duplicate-content-harmful-commerce-product-pages/
-
Hi Evan,
There is a good blog post on exactly this subject here:
http://www.seoblog.com/2014/07/product-descriptions-unique-duplicate-content/
Are you using an E-commerce platform?