SEOmoz and non-duplicate duplicate content
-
Hi all,
Looking through the lovely SEOmoz report, by far its biggest complaint is perceived duplicate content. It's hard to avoid given the nature of eCommerce sites, which ostensibly list products in a consistent framework.
Most advice about duplicate content is about canonicalisation, but that's not really relevant when you have two different products being perceived as the same.
The thing is, I might have ignored it, but Google ignores about 40% of our sitemap for, I suspect, the same reason. Basically, I don't want us to appear "spammy". We do actually put a lot of time into photographing each product and writing a little flavour text for it (a work in progress).
I guess my question is: given over 700 products, why would 300-ish of them be considered duplicates and the remaining ones not?
Here is a URL and one of its "duplicates" according to the SEOmoz report:
http://www.1010direct.com/DGV-DD1165-970-53/details.aspx
http://www.1010direct.com/TDV-019-GOLD-50/details.aspx
Thanks for any help, people.
-
The point I'm trying to get across is this:
"I asked the question of why these pages are considered duplicate, the answer appears to be : because textually they are even if visually they are not."
I don't think that's the complete answer, or even the most important part of the answer. Certainly, having mostly similar content across pages won't help, but as I've tried to point out, there are other factors that come into play here. It's not just about the content, but about putting the content into context for the search engines. In order for them to understand what it is they're looking at, there's more that matters than just the content.
Michel
-
I think this highlights the fundamental problem with SEO and eCommerce sites. We are all aware that the ultimate aim for search engines, and therefore ultimately for SEO, is to add value to users. But is "value" the same for an eCommerce site as it is for a blog, a travel information site, or a site offering health information and advice?
In my opinion, it is not. If I am looking to make a purchase, I am looking for a site that is responsive, easy to navigate, has good imagery to help me visualise the product, is secure, doesn't clutter the page with in-your-face promotional info, and of course offers value for money. Unique content therefore doesn't really factor into it too much. It's hard enough for us, and I can only imagine how difficult it is for a company selling screws or rope; just how much creativity does it take to write unique content for 3.5-inch brass screws versus 2.5-inch steel ones?
The current mantra is to stop worrying about SEO tricks and focus on building a site with value. But this particular issue is an indication that we are still not at that utopia yet. For example, as pointed out in the posts above, these pages are considered duplicate because, by percentage, the variable information is minimal. If you look at our product page, we put the functionality for filling in your prescription below the product to make it easier for the customer, but in order to solve the "percentage unique" issue we would need to move that onto another page. Basically, we need to reduce value (convenience) in order to appear to add value (uniqueness).
Anyway, there is little point complaining. I asked why these pages are considered duplicate, and the answer appears to be: because textually they are, even if visually they are not.
I could be worrying about nothing. I believe all these pages are indexed (through crawling); it's just that a good proportion of our sitemap is being overlooked, and I am assuming the perceived duplication flagged by SEOmoz is the reason. That in turn makes me concerned that Google is marking us down as spammy.
I appreciate all your comments.
Thanks
Paul
-
I do not agree. I see these kinds of pages on e-commerce websites on a daily basis. For webshops that sell only a certain kind of product, almost all product pages will look alike.
In this case, the H1 is different, the page title is different, and the description is different. This is only a small portion of the page but that's not uncommon, so I would argue that it cannot be just that.
I would look into the URLs, and into marking up your data using http://schema.org/Product, possibly making small changes to your templates to accommodate the tags. For instance, split out brand, colour, etc. into their own elements so that you can mark them up accordingly; a rough sketch follows below.
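To illustrate the idea, here is a minimal microdata sketch. It is only an assumption of how a product block could be tagged: the name and brand are guessed from the example URLs in this thread, and the price and description are invented placeholders, so map the properties onto whatever your template actually outputs.

```html
<!-- Hypothetical product block marked up with schema.org/Product microdata.
     Brand, colour and price sit in their own elements so each property
     can be tagged individually. -->
<div itemscope itemtype="http://schema.org/Product">
  <h1 itemprop="name">DD1165 970 53 Glasses</h1>
  <span itemprop="brand" itemscope itemtype="http://schema.org/Brand">
    <span itemprop="name">Dolce &amp; Gabbana</span>
  </span>
  <span itemprop="color">Gold</span>
  <p itemprop="description">Your unique flavour text for this frame goes here.</p>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="priceCurrency" content="GBP">£</span>
    <span itemprop="price">99.00</span>
    <link itemprop="availability" href="http://schema.org/InStock" /> In stock
  </div>
</div>
```

Structured data like this won't make the pages any less similar by itself, but it does give the engines explicit context for the parts that genuinely differ from product to product.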
-
Tom has this spot on. Google doesn't only look for direct duplication but also for very similar pages, and I'm afraid these really are very similar.
You need to find ways to make each page unique in its own right - let Google see that no two pages are the same and there is a real reason to rank them.
-
I wonder if the details.aspx naming has something to do with it?
www.1010direct.com/TDV-019-GOLD-50/details.aspx
www.1010direct.com/DGV-DD1165-970-53/details.aspx
Basically, both pages are called details.aspx. Depending on how you look at it, you have two pages with the same name (and mostly similar content, though that's not unusual for e-commerce websites) sitting in different subfolders. I'm not sure exactly how Moz handles this, but it may be part of why Moz marks these as duplicate content.
Are you unable to create 'prettier' URLs? Such as:
www.1010direct.com/tim-dilsen-019-gold-50-glasses.aspx
www.1010direct.com/dolce-gabbana-dd1165-970-53-glasses.aspx
With or without the .aspx, of course.
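If the site runs on IIS with the URL Rewrite module installed (an assumption based on the .aspx pages; I don't know your actual stack), a rewrite map is one rough way to serve the existing pages at friendlier addresses without changing the application code. The slugs and targets below simply reuse the example URLs from this thread.

```xml
<!-- Hypothetical web.config fragment: maps friendly URLs onto the existing
     details.aspx paths. Requires the IIS URL Rewrite module. -->
<configuration>
  <system.webServer>
    <rewrite>
      <rewriteMaps>
        <rewriteMap name="ProductSlugs">
          <add key="/tim-dilsen-019-gold-50-glasses"
               value="/TDV-019-GOLD-50/details.aspx" />
          <add key="/dolce-gabbana-dd1165-970-53-glasses"
               value="/DGV-DD1165-970-53/details.aspx" />
        </rewriteMap>
      </rewriteMaps>
      <rules>
        <rule name="Friendly product URLs" stopProcessing="true">
          <match url=".*" />
          <conditions>
            <!-- Look the requested path up in the map; rewrite only if a match exists. -->
            <add input="{ProductSlugs:{REQUEST_URI}}" pattern="(.+)" />
          </conditions>
          <action type="Rewrite" url="{C:1}" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

If you go this route, you'd also want to 301 the old details.aspx addresses to the new ones (a second rule with the mapping inverted and a Redirect action) so the two versions of each page don't themselves become duplicates.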
-
I'm not surprised Moz is flagging those pages as duplicate content, and I wouldn't be totally surprised if Google did the same in the future.
Put it this way: the pages are identical except for a single-sentence title description, a price, and a roughly 20-word section describing the product. Everything else is identical. It's duplicate.
Look at it another way, through Google's eyes. Here's how the two pages look when crawled by Google:
(If that doesn't work, try it yourself at http://www.seo-browser.com/.)
Just look at how much text and HTML is shared between the two pages. Yes, there are key differences between them (namely the product), but neither Googlebot nor the Moz crawler is going to recognise those elements when it crawls the page.
Even presuming Google ignores the site nav, there is still a bunch of shared text and crawlable elements - pretty much everything under the product description. It doesn't see the individual images, and the flavour text is frankly too short to make any sort of dent in the duplicate-content percentage.
I'd seriously recommend revising how your product pages look - there's far too much repeated content per page (you can still promote these things on each page, but in a much, much smaller way), and the individual descriptions for the products, in my eyes, are not substantial enough.