Handling "legitimate" duplicate content in an online shop.
-
The scenario:
Online shop selling consumables for machinery.
Consumable range A (CA) contains consumables w, x, y, z. The individual consumables are not a problem; it's the consumable groups that I'm having trouble with.
The Problem:
Several machines use the same range of consumables. For example, the Machine A (MA) consumables page contains the list (CA) with the contents w, x, y, z, and the Machine B (MB) consumables page contains exactly the same list (CA) with the contents w, x, y, z.
Machine A page = Machine B page = Consumables range A page
Some people will search Google for the consumables by the range name (CA). Most people will search by individual machine (MA Consumables, MB Consumables etc).
If I use canonical tags on the Machine consumable pages (MA + MB) pointing to the consumables range page (CA) then I'm never going to rank for the Machine pages which would represent a huge potential loss of search traffic.
However, if I don't use canonical tags then all the pages get slammed as duplicate content.
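For reference, this is the canonical setup I'm describing (all URLs here are hypothetical placeholders, not our real paths):

```html
<!-- In the <head> of /machine-a-consumables (hypothetical URL) -->
<link rel="canonical" href="https://www.example.com/consumables-range-a" />

<!-- In the <head> of /machine-b-consumables (hypothetical URL) -->
<link rel="canonical" href="https://www.example.com/consumables-range-a" />
```

Both machine pages would consolidate their signals to the range page, which is exactly why they would stop ranking in their own right.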
For somebody who owns machine A, a page titled "Machine A consumables" with the list of consumables is exactly what they are looking for, and it makes sense to serve it to them in that format.
However, for somebody who owns machine B, it only makes sense for the page to be titled "Machine B consumables", even though the content is exactly the same.
The Question:
What is the best way to handle this from both a user and search engine perspective?
-
That's good solid advice. Thank you. Other ecommerce sites in the niche are nothing to write home about. Where they win is in areas where this site has some major issues (larger than the one I'm asking about) that need fixing too.
I'm just trying to come up with a cohesive plan for a site that will blow the competition out of the water on Google (achievable) and increase sales per visitor. This is part of that.
Much as I don't like your suggestion due to the amount of work it is going to take to implement, I do think you are right and it's a better solution than the canonical tags.
That said, I suspect the canonical tags will be tried first, and then we will end up going with the content writing.
-
I have spent an inordinate amount of time cleaning up sites with templated pages and duplicate content. I can tell you that the potential gains are real, and the potential risks of inaction are often large.
Some text is better than no text. Google prefers a solid base of text-based content, period. It's their bread and butter, and it helps them figure out what your page is about. Some time spent with your team/writers discussing how best to differentiate each page could be time very well spent. I don't know that it needs to be a solid block of prose; categories like manufacturer, machine type, year, etc. could be used in list or paragraph form (perhaps you already do this?)
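As a sketch of that idea (the machine name, manufacturer, and field values below are all hypothetical), even a short intro assembled from each machine's own attributes gives the page some unique text without a big writing effort:

```html
<!-- Hypothetical intro block for one machine's consumables page -->
<h1>Machine A Consumables</h1>
<p>Genuine consumables for the Machine A, an Acme-built CNC lathe
   produced from 2015 onward. Everything listed below is part of
   consumable range CA and fits this machine.</p>
<ul>
  <li>Manufacturer: Acme</li>
  <li>Machine type: CNC lathe</li>
  <li>Years produced: 2015–present</li>
</ul>
```

Swapping in each machine's manufacturer, type, and years keeps the per-page effort small while still differentiating the pages.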
You could look at other ecommerce sites that are ranking in your niche, and in others, to see what they do.
I'm not sure exactly what you mean by "The text would have to come before the products for SEO." What I will say is that the position of the text on the page should probably be dictated by whatever is best for the user. Test it in different positions on the page (even the left or right sidebar) and see what converts better. I doubt the text's position on the page will affect your rankings a great deal. As for the "boilerplate-ness," the crawlers can see whether it's unique or not.
-
I did consider that. It's in the hundreds and it could be done, but I'm not sure that's the way to go for the following reasons:
1. The number of consumables in the list is going to be at least 8 per page (each with snippet information and an order box). That means it would take a significant amount of text to make the content meaningfully different.
2. There is not a lot of difference between many of the machines, so writing a decent amount of text per item would be a major task. The more text written the more it affects point 3.
3. The text would have to come before the products for SEO (after the products just looks like the boilerplate-esque text that it would actually be) and that's not good for the consumer who just wants to see the consumables.
Also, we are finding more problems with the site every hour, and we may not have the resources to get the text written in a reasonable time-frame. I'd certainly have to be more confident of getting a "win" from it than I currently am before I suggest spending on it over other issues.
-
Ian,
Is it feasible to write unique text for the machine pages? That is, are they in the hundreds or the thousands? Do you have the budget to hire a writer or writers?
-
I may be missing something, but wouldn't canonical tags sort out your sort orders at least?
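To illustrate what I mean (URLs hypothetical): each sorted or filtered variant of a category page would carry a canonical pointing back at the base category URL, so only the base version is treated as the indexable one:

```html
<!-- In the <head> of /category/widgets?sort=price-asc (hypothetical URL) -->
<link rel="canonical" href="https://www.example.com/category/widgets" />

<!-- In the <head> of /category/widgets?sort=newest (hypothetical URL) -->
<link rel="canonical" href="https://www.example.com/category/widgets" />
```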
-
I have the same problem, but in my case it is flagged as duplicate content within my site as a result of sort mechanisms and category pages. The consumer wants the sort mechanisms and category pages in order to find the products they are looking for quickly. I've tried everything and still have "duplicate content" listed in Moz crawls, Google Webmaster Tools, and just about everything else. The image approach won't work for me, since the "duplicates" are produced by the database search mechanisms. I also have canonical URLs on the pages, but that doesn't solve the problem either. I think we are damned if we do and damned if we don't.
-
Hi Ian,
There is a way around it, but first an opinion on duplicate content. I think duplicate content issues are really about duplication across websites, not duplication within websites. Store ABC is expected to have a fair amount of text that repeats across its own pages. The problem arises when Store ABC, Store LMN, and Store TUV all have the same bits of content (like product descriptions).
Anyway, if you really do not want to have your lists of consumables repeated on multiple pages, just turn the lists into images...
Then on the Machine A page display the image of the list and give it a file name and alt tag like "Machine A Consumables" and "Consumables for Machine A".
And on the Machine B page display a COPY of the image of the list and give it a NEW file name and alt tag like "Machine B Consumables" and "Consumables for Machine B". Etc, etc...
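As a sketch (file names and paths hypothetical), the two pages would each reference their own copy of the same image:

```html
<!-- Machine A page: the list rendered as an image (hypothetical file name) -->
<img src="/images/machine-a-consumables.png"
     alt="Consumables for Machine A" />

<!-- Machine B page: a copy of the identical image under a new file name -->
<img src="/images/machine-b-consumables.png"
     alt="Consumables for Machine B" />
```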
For the visitor, there is no difference between reading the words from text or from an image (unless they have sight issues and are using a screen reader).
Does this solve the problem?
Related Questions
-
Does this type of writing follow the "original content" criterion of structured data?
Hi! So, in Google's general guidelines for structured data, it's stated that webmasters must "provide original content that you or your users have generated." Suppose I were to write posts similar to "how to get a driver's license" or "how to apply for an accounting license", which requires looking up information from official and non-official sources. After researching, I compiled the information I found and wrote a few blog posts. Are these considered original content? Can I apply structured data to these posts without Google penalizing them? Thanks!
Technical SEO | EverettChen0
-
Duplicate content w/ same URLs
I am getting high priority issues for our privacy & terms pages that have the same URL. Why would this show up as duplicate content? Thanks!
Technical SEO | RanvirGujral0
-
Query Strings causing Duplicate Content
I am working with a client that has multiple locations across the nation, and they recently merged all of the location sites into one site. To allow the lead capture forms to pre-populate the locations, they are using the query string /?location=cityname on every page.
EXAMPLE:
www.example.com/product
www.example.com/product/?location=nashville
www.example.com/product/?location=chicago
There are thirty locations across the nation, so every page x 30 is being flagged as duplicate content... at least in the Moz crawl. Does using that query string actually cause a duplicate content problem?
Technical SEO | Rooted1
-
Handling of Duplicate Content
I just recently signed up and joined the moz.com system. The initial report for our web site shows we have lots of duplicate content. The web site is real estate based and we are loading IDX listings from other brokerages into our site. Even though these listings look alike, they are not. Each has its own photos, description, and address. So why do they appear as duplicates? I would assume that they are all too closely related. Lots for sale, primarily – and it looks like lazy agents have 4 or 5 lots and input the same description for each. Unfortunately for us, part of the IDX agreement is that you cannot pick and choose which listings to load and you cannot change the content. You are either all in or you cannot use the system. How should one manage duplicate content like this? Or should we ignore it? Out of 1500+ listings on our web site, it shows 40 of them are duplicates.
Technical SEO | TIM_DOTCOM0
-
Duplicate content problem
Hi there, I have a couple of related questions about the crawl report finding duplicate content: We have a number of pages that feature mostly media - just a picture or just a slideshow - with very little text. These pages are rarely viewed, and they are identified as duplicate content even though the pages are indeed unique to the user. Does anyone have an opinion about whether or not we'd be better off just removing them, since we do not have the time to add enough text at this point to make them unique to the bots? The other question: we have a redirect for any 404 on our site that follows the pattern immigroup.com/news/* - the redirect merely sends the user back to immigroup.com/news. However, Moz's crawl seems to be reading this as duplicate content as well. I'm not sure why that is, but is there anything we can do about it? These pages do not exist; they just come from someone typing in the wrong URL or clicking on a bad link. But we want the traffic - after all, the users are landing on a page that has a lot of content. Any help would be great! Thanks very much! George
Technical SEO | canadageorge0
-
How do I get my pages to go from "Submitted" to "Indexed" in Google Webmaster Tools?
Background: I recently launched a new site and it's performing much better than the old site in terms of bounce rate, page views, pages per session, session duration, and conversions. As suspected, sessions, users, and % new sessions are all down. I'm okay with that, because the old site had a lot of low-quality traffic going to it. The traffic we have now is much more engaged and targeted. Lastly, the site was built with Squarespace and was launched in the middle of August. **Question: **When reviewing Google Webmaster Tools' Sitemaps section, I noticed it says 57 web pages Submitted, but only 5 Indexed! The sitemap that's submitted seems to be all there. I'm not sure if this is a Squarespace thing or what. Anyone have any ideas? Thanks!!
Technical SEO | Nate_D0
-
How unique does a page need to be to avoid "duplicate content" issues?
We sell products that can be very similar to one another. Product example: Power Drill A and Power Drill A1. With these two hypothetical products, the only real differences between the two pages would be a slight change in the URL and a slight modification to the H1/title tag. Are these two slight modifications significant enough to avoid a "duplicate content" flagging? Please advise, and thanks in advance!
Technical SEO | WhiteCap0
-
What to do about similar content getting penalized as duplicate?
We have hundreds of pages that are getting categorized as duplicate content because they are so similar. However, the content is different. Background: these are names, and when you click on each name it has its own URL. We can't canonical any of the pages because they are different names. What should we do? Thank you!
Technical SEO | bonnierSEO0