Product Variations (rel=canonical or 301) & Duplicate Product Descriptions
-
Hi All,
Hoping for a bit of advice here, please. I’ve been tasked with building an e-commerce store and all is going well so far.
We decided to use WordPress with WooCommerce as our shop plugin. I’ve been testing the CSV import option for uploading all our products and I’m a little concerned on two fronts:
- Product Variations
- Duplicate content within the product descriptions
**Product Variations:**
We are selling furniture that has multiple variations (see list below), and as a result WooCommerce creates c.50 product variations, each with its own URL:
Facing = Left, Right
Leg style = Round, Straight, Queen Ann
Leg colour = Black, White, Brown, Wood
Matching cushion = Yes, No
So my question is: should I 301 redirect the variation URLs to the main product URL, since from a user perspective they aren't used (we don't have images for each variation that would trigger a URL change; users simply select the variation options from drop-downs)? Or should I add a rel=canonical tag to each variation pointing back to the main product URL?
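For context, if I went the redirect route, I assume the .htaccess rule would look something like this (the product slug is a made-up placeholder; WooCommerce variation URLs use `attribute_`-prefixed query parameters, and the `QSD` flag needs Apache 2.4+):

```apache
# Hypothetical sketch: send any variation query string for a product
# back to the clean product URL with a permanent (301) redirect.
<IfModule mod_rewrite.c>
  RewriteEngine On
  # Only fire when the request carries variation parameters
  RewriteCond %{QUERY_STRING} (^|&)attribute_ [NC]
  # Redirect to the bare product URL, discarding the query string (QSD)
  RewriteRule ^product/example-chair/?$ /product/example-chair/ [R=301,L,QSD]
</IfModule>
```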
**Duplicate Content:**
We will be selling similar products, e.g. a chair which comes in different fabrics and finishes but is basically the same product. Most, if not all, of the ‘long’ product descriptions are identical, with only the ‘short’ product descriptions being unique.
The ‘long’ product descriptions contain all the manufacturing information, leg option/colour information, graphics, dimensions, weight, etc.
I’m concerned that having 300+ products all with identical ‘long’ descriptions is going to be seen negatively by Google and affect the site’s SEO.
My question is: will this be viewed as duplicate content? If so, are there any best practices I should be following to handle it, other than writing completely unique descriptions for each product, which would be extremely difficult given they’re basically the same product rehashed.
Many thanks in advance for any advice.
-
Thanks Matt
-
Well, having the canonical can help you with other situations (people taking your content, deciding to do translations later, etc.), so I would go with canonicals first as they're the more robust solution. Parameter settings in Search Console only affect Google itself (not Bing, nor any other search engine that comes along). Canonicals help all of them at once, so definitely the better choice if possible.
-
Thanks Matt, I really appreciate you taking the time out to reply. I will implement the canonical tag for the variation pages.
Our URLs would be parameter based, so I could look at the Search Console solution. Quick question: if I were to de-index the variation pages, would adding the canonical tag be a waste of effort / the same thing?
-
Yes, you should be implementing canonical tags back to the main product page.
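In case it helps, on each variation URL the tag sits in the `<head>` and points at the main product page, something like this (the URLs here are hypothetical placeholders; an SEO plugin such as Yoast can output the tag for you in WooCommerce):

```html
<!-- On a variation URL such as
     /product/example-chair/?attribute_pa_leg-colour=black
     the head would carry (placeholder URL): -->
<link rel="canonical" href="https://www.example.com/product/example-chair/" />
```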
Also, if your c.50 URLs are parameter based (i.e. /product?color=red), then you can also deal with the indexation of those in Search Console. Google lets you set how each parameter should be handled. (You can also deal with parameters in robots.txt, but unless you have to, I would do it through Search Console instead.)
To set them, go to the Parameters page.
For more information, see Google's help page.
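If you did go the robots.txt route instead, a minimal sketch would be along these lines (parameter names are hypothetical; the `*` wildcard is honoured by Google and Bing but not guaranteed for every crawler, and note this blocks crawling, which is not the same as de-indexing):

```text
# Hypothetical robots.txt sketch: stop crawlers fetching
# parameterised variation URLs (blocks crawling, not indexing).
User-agent: *
Disallow: /*?color=
Disallow: /*?leg-style=
```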