What's the best way to solve this site's duplicate content issues?
-
Hi,
The site is www.expressgolf.co.uk and is an e-commerce website with lots of categories and brands.
I'm trying to achieve one single unique URL for each category / brand page to avoid duplicate content and to get the correct URLs indexed.
Currently it looks like this...
Main URL
http://www.expressgolf.co.uk/shop/clothing/galvin-green
Different Versions
http://www.expressgolf.co.uk/shop/clothing/galvin-green/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/1
http://www.expressgolf.co.uk/shop/clothing/galvin-green/2
http://www.expressgolf.co.uk/shop/clothing/galvin-green/3
http://www.expressgolf.co.uk/shop/clothing/galvin-green/4
http://www.expressgolf.co.uk/shop/clothing/galvin-green/all
http://www.expressgolf.co.uk/shop/clothing/galvin-green/1/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/2/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/3/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/4/
http://www.expressgolf.co.uk/shop/clothing/galvin-green/all/
Firstly, what is the best course of action to make all versions point to the main URL and keep them from being indexed: a canonical tag, NOINDEX, or blocking them in robots.txt?
Secondly, do I just need to 301 the trailing-slash (/) versions of all URLs to the non-slash URLs?
I'm sure this question has been answered but I was having trouble coming to a solution for this one site.
Cheers,
Paul
-
Greetings Champion!
Canonical linking is the best way to go!
For your conundrum with the Galvin Green pages, I would find out which URL version is dominant or has the most link juice and point the redirects at that one. For instance, let us say http://www.expressgolf.co.uk/shop/clothing/galvin-green/ is the dominant link; I would have the copy URLs pointing back to that one.
I manage an e-commerce site as well and had the same issue with categories. I basically did the same thing for each level of my site: categories whose URLs end in /1, /2, /3 and so on I redirect to the first page to make that URL stronger.
Practically, think about what noindexing or nofollowing would do: you would lose so much link power that you could otherwise harness. With redirects you can focus this power in a more effective manner.
Also, I was snooping and clicked Golf Clubs, which took me to this URL:
expressgolf.co.uk/shop/clubs
It made me realize that you have "clubs" at the end. I would use "golf-clubs" because when you do a keyword search for "clubs", bars come up. Putting a keyword in the URL can greatly enhance that URL. Just a tip if you want to utilize it or not, friend ^.^
Good luck on your quest for Page 1!
Justin Smith
-
It really sounds like canonical is what you need here. Matt Cutts has a video explaining how rel=canonical works.
-
Hi,
I would say canonical, or rel="next" and rel="prev":
http://googlewebmastercentral.blogspot.com/2011/09/pagination-with-relnext-and-relprev.html
Those would probably be your best bets, as a 301 would solve it from a search-engine perspective, but it is definitely not good for users wanting to see the different pages of the category in question.
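As a sketch of what that pagination markup could look like on page 2 of the category from the question (a hedged example only; whether you pair rel=next/prev with a self-referencing canonical, as shown here, or with a canonical to a view-all page depends on your setup, as the linked post explains):

```html
<!-- In the <head> of http://www.expressgolf.co.uk/shop/clothing/galvin-green/2 -->
<link rel="canonical" href="http://www.expressgolf.co.uk/shop/clothing/galvin-green/2" />
<link rel="prev" href="http://www.expressgolf.co.uk/shop/clothing/galvin-green" />
<link rel="next" href="http://www.expressgolf.co.uk/shop/clothing/galvin-green/3" />
```

Page 1 would omit rel="prev", and the last page would omit rel="next".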
Hope this helps!
w00t!
Related Questions
-
Duplicate content issue
Hi, A client of ours has one URL for the moment (https://aalst.mobilepoint.be/) and wants to create a second one with exactly the same content (https://deinze.mobilepoint.be/). Will that mean Google punishes the second one because of duplicate content? What are the recommendations?
Technical SEO | conversal
-
Duplicate Content
I am trying to get a handle on how to fix and control a large amount of duplicate content I keep getting on my Moz reports. The main area where this comes up is duplicate page content and duplicate title tags ... thousands of them. I partially understand the source of the problem. My site mixes free content with content that requires a login. I think if I were to change my crawl settings to eliminate the login and index the paid content, it would lower the quantity of duplicate pages and help me identify the true duplicates, because a large number of them occur at the site login. Unfortunately, it's not simple in my case, because last year I encountered a problem when migrating my archives into a new CMS. The app in the CMS that migrated the data caused a large amount of data truncation, which means that I am piecing together my archives of approximately 5,000 articles. It also means that much of the piecing-together process requires me to keep the former app that manages the articles, to find where certain articles were truncated, copy the text that followed the truncation, and complete the articles. So far, I have restored about half of the archives, which is time-consuming, tedious work. My question is whether anyone knows a more efficient way of identifying and editing duplicate pages and title tags?
Technical SEO | Prop65
-
Our client's site was owned by former employee who took over the site. What should be done? Is there a way to preserve all the SEO work?
A client had a member of the team leave on bad terms. This wasn't something that was conveyed to us at all, but recently it came up when the distraught former employee took control of the domain and locked everyone out. At first, this was assumed to be a hack, but eventually it was revealed that one of the company starters who unhappily left the team owned the domain all along and is now holding it hostage. Here's the breakdown:
- Every page aside from the homepage is now gone and serving a 404 response code
- The site is out of our control
- The former employee is asking for a $1 million ransom to sell the domain back
- The homepage is a "countdown clock" that isn't actively counting down, but claims that something exciting is happening in 3 days and lists a contact email.
The question is how we can save the client's traffic through all this turmoil. Whether buying a similar domain and starting from square one and hoping we can later redirect the old site's pages after getting it back. Or maybe we have a legal claim here that we do not see even though the individual is now the owner of the site. Perhaps there's a way to redirect the now defunct pages to a new site somehow? Any ideas are greatly appreciated.
Technical SEO | FPD_NYC
-
Duplicate Content - What's the best bad idea?
Hi all, I have 1000s of products where the product description is very technical and extremely hard to rewrite or create a unique version of. I'll probably have to use the content provided by the brands, which can already be found in dozens of other sites. My options are: use the Google on/off tags ("don't index"), or put the content in an image. Are there any other options? We'd always write our own unique copy to go with the technical bit. Cheers
Technical SEO | Carlos-R
-
Best practice: unique meta descriptions on blog 'tag' pages
Hi everyone, I'm curious, are there best practices for introducing unique meta descriptions on blog tag pages (I'm using wordpress)? For instance, using platinum seo, on an original post, the meta description is either the excerpt or a specified custom sentence. It doesn't appear that platinum seo allows for custom descriptions on tag pages. Love to hear your thoughts. Thanks! Peter
Technical SEO | peterdbaron
-
Is 100% duplicate content always duplicate?
Bit of a strange question here that I would be keen on getting the opinions of others on. Let's say we have a web page which is 1000 lines long, pulling content from 5 websites (the content itself is duplicate, say RSS headlines, for example). Obviously any of that content on its own will be viewed by Google as duplicate and so will suffer for it. However, one of the ways duplicate content is considered is a page being x% the same as another page, be it on your own site or someone else's. In the case of our page, while 100% of the content is duplicate, the page is no more than 20% identical to any other single page, so would it technically be picked up as duplicate? Hope that makes sense? My reason for asking is I want to pull the latest tweets, news and RSS from leading sites onto a site I am developing. Obviously the site will have its own content too, but I also want to pull in external content.
Technical SEO | Grumpy_Carl
-
Best Way to Handle - International Content - Different Language
Our site currently is focused in the USA and the entire site is in the English language. We have considered broadening our scope to include content from foreign countries - i.e. Brazil. What is the best way to approach this -- can we use our existing domain and just have a specific section of the site that is dedicated to a particular Country with content translated into that Country's predominant language? OR could this create SEO issues -- having a domain with both English and some other language? Would it be better to have this on a totally different domain with Country extension? This is totally foreign territory for me - bad pun intended. Any advice, help would be appreciated. Thanks. Matt
Technical SEO | MWM3772
-
Duplicate content connundrum
Hey Mozzers- I have a tricky situation with one of my clients. They're a reputable organization and have been mentioned in several major news articles. They want to create a Press page on their site with links to each article, but they want viewers to remain within the site and not be redirected to the press sites themselves. The other issue is some of the articles have been removed from the original press sites where they were first posted. I want to avoid duplicate content issues, but I don't see how to repost the articles within the client's site. I figure I have 3 options: 1. create PDFs (w/SEO-friendly URLs) with the articles embedded in them that open in a new window. 2. Post an image with screenshot of article on a unique URL w/brief content. 3. Copy and paste the article to a unique URL. If anyone has experience with this issue or any suggestions, I would greatly appreciate it. Jaime Brown
Technical SEO | JamesBSEO