150+ Pages of URL Parameters - Mass Duplicate Content Issue?
-
Hi, we run a large e-commerce site, and while doing some checking through GWT we came across these URL parameters and are now wondering if we have a duplicate content issue.
If so, we're wondering what the best way to fix them is: is this a task for GWT, or a rel=canonical task?
Many of the URLs are driven by the filters on our category pages and are coming up like this: page04%3Fpage04%3Fpage04%3Fpage04%3F (see the image for more).
Does anyone know if these links are duplicate content and if so how should we handle them?
Richard
-
Hi Richard
Honestly, I really don't know. Part of me wants to say: "Surely Google will know this isn't deliberate, manipulative duplicate content." You could take a couple of those URLs and run them through a Google search. Do:
site:www.example.com/page?query1
info:www.example.com/page?query1
With the first search, if your URL hasn't been indexed, that's a good thing. For the second, if the info: search returns the original URL (without the parameters), that's also good, as it means Google is counting the one with parameters as just a variation to be ignored. However, if it returns the result with the parameters, that would indicate the crawler is indexing the parameterised version and treating it as a separate URL - raising the duplicate content risk. Silly Google!
Regardless of those results, I would look to implement the canonical tag anyway as it takes any guesswork out of the equation. And ultimately, a lot of this work with Google is guesswork as we can't see the algorithm - although it's an informed guess due to experience etc.
-
Thanks for this Tom, great answer!
So am I right in thinking that each of these URL parameters is very likely being classed as duplicate content?
-
Along with this great answer from Tom, I just wanted to add that Google also offers a resource on duplicate content with some tips.
Hope this helps as well - good luck!
-
Hi Richard
It is something you should address ASAP. While I believe that Google is a lot better at recognising 'accidental' duplicate content - i.e. URLs with URL parameters - and distinguishing it from 'deliberate' duplicate content - outright stealing someone's work, or trying to rank several pages for multiple terms - that is only my assumption. To be completely sure, let's stop any chance of Google penalising these pages.
I think, in this instance, a rel canonical tag should do the trick. You can read more on the tag here in Moz's guide. Basically, on the page(s) where you're having this problem, add a "self-referring" canonical tag. For example, if the page was http://www.example.com/blue-widgets/, the tag would be:
<link rel="canonical" href="http://www.example.com/blue-widgets/" />
Make sure that, when you implement this, the pages that are generated with the URL parameters aren't also creating canonical tags that include the parameters, like:
<link rel="canonical" href="http://www.example.com/blue-widgets/page04%3Fpage04%3Fpage04%3Fpage04%3F" />
They should all have the original canonical tag.
What this will do is tell Google that "If you see any pages with this tag, we're aware that they might be duplicate, but please only count and index the http://www.example.com/blue-widgets/". It works just like a 301 redirect in that sense.
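To make the "strip the parameters" idea concrete, here's a minimal Python sketch of the server-side logic that could generate the canonical href. The function name and example URL are illustrative, not from the thread - the point is simply that every parameterised variation should map back to one clean URL:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_href(url: str) -> str:
    """Drop the query string and fragment so every parameterised
    variation of a page maps back to the same canonical URL."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

# A filtered/paginated variation and the clean page both
# resolve to the same canonical target:
print(canonical_href("http://www.example.com/blue-widgets/?page04%3Fpage04%3F"))
# → http://www.example.com/blue-widgets/
```

Whatever templating system the site uses, the canonical tag's href would be built from this normalised URL rather than from the URL the visitor actually requested.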
I think this would be the simplest solution for you to implement. If you're having problems, there would be a way of blocking access to pages with certain query/URL parameters by using the robots.txt file, but that could get quite messy.
Hope this helps