WooCommerce SEO & duplicate content?
-
Hi Moz fellows,
I'm new to WooCommerce and couldn't find help on Google about certain SEO-related things.
All my past projects were simple five-page websites plus a blog, so I would just noindex categories, tags, and archives to eliminate duplicate content errors.
But with WooCommerce product categories and tags, I've noticed that many e-commerce websites with high domain authority actually rank for certain keywords just by having their category/tag pages indexed.
For example, keyword 'hippie clothes' = etsy.com/category/hippie-clothes (fictional example).
The problem is that if I have 100 products and 10 categories and tags on my site, it creates THOUSANDS of duplicate content errors; but if I noindex categories and tags, they will never rank well once my domain authority rises...
Does anyone have experience/comments about this? I use the SEO by Yoast plugin.
Your help is greatly appreciated! Thank you in advance.
-Marc
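To put rough numbers on the "thousands of errors" above, here is a quick back-of-envelope sketch; all the figures are hypothetical, chosen only to show how fast archive pages multiply:

```python
products = 100
archives_per_product = 5      # e.g. a product listed in 2 categories and 3 tags
url_variants_per_archive = 4  # hypothetical pagination/sort query-string versions

# Each archive page repeats the product's title, image, and excerpt, and every
# URL variant of that archive is one more near-duplicate copy for a crawler.
near_duplicate_listings = products * archives_per_product * url_variants_per_archive
print(near_duplicate_listings)  # 2000 -- "thousands" even with modest numbers
```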
-
Same thing is happening for me. Interested in this as well.
-
Does anyone have any thoughts on this? I have received a number of errors from Moz on this and am not sure what to do. Moz also recommends using canonical tags, but I'm already doing that, so this is a tad confusing.
-
I am keenly interested in your question, as I've been working with WooCommerce a LOT lately. I'm an avid WordPress SEO by Yoast user as well.
Have you considered using Yoast's SEO for WooCommerce plugin? They have a paid version specifically for WooCommerce. One route is to buy the plugin and get support from them on this.
Also, Joost de Valk from Yoast has an extensive guide on duplicate content on his site. I wasn't able to go through the whole thing, but will definitely be coming back to it. He may well have addressed this question in the guide. Either way, I'd go that route. Yoast really knows this space.
Looking forward to what others say about this.
-
Canonical links are what you need, so that duplicate pages point to the most relevant page.
I'm afraid I'm no WooCommerce expert, but if it's worth its salt it will have canonical link capability.
A less important duplicate page should have the following in its head to promote the more relevant page:
<link rel="canonical" href="http://www.website.com/most-relevant-product" />
What this will do is promote that one page in Google and the other engines, while the duplicates may not even get listed.
It also means your page has a better chance of being seen above other sites: the canonical tags consolidate the signals from all those duplicate URLs onto the one preferred page, effectively telling Google "I have this great page and lots of others that point to it, but I only want you to spend your resources listing one."
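A quick way to sanity-check what a page actually serves is to parse its HTML for the canonical link and robots meta tags. A minimal sketch using only Python's standard library; the sample HTML below is hypothetical:

```python
from html.parser import HTMLParser

class HeadTagParser(HTMLParser):
    """Collects the rel=canonical href and robots meta content from a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None
        self.robots = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")
        elif tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots = attrs.get("content")

# Hypothetical duplicate-page HTML: the canonical points at the preferred URL,
# and the page is noindexed.
sample = """
<html><head>
<link rel="canonical" href="http://www.website.com/most-relevant-product" />
<meta name="robots" content="noindex, follow" />
</head><body>...</body></html>
"""

parser = HeadTagParser()
parser.feed(sample)
print(parser.canonical)  # http://www.website.com/most-relevant-product
print(parser.robots)     # noindex, follow
```

In practice you would feed this the fetched HTML of a live page rather than a hard-coded string, but the parsing logic is the same.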