Duplicate pages, overly dynamic URLs and long URLs in Magento
-
Hi there,
I’ve just completed the first crawl of my Magento site, and SEOmoz has picked up thousands of duplicate pages, overly dynamic URLs and long URLs due to the sort function, which appends variables to URLs when sorting products (e.g. www.example.com?dir=asc&order=duration).
I’m not particularly concerned that this will affect our rankings, as Google has stated that they are familiar with the structure of popular CMSs, and Magento is pretty popular.
However, it completely dominates my crawl diagnostics, so I can’t see whether there are any real underlying issues.
Does anyone know a way of preventing this?
Cheers,
Al. -
You should use the Yoast MetaRobots extension to fix almost all of the duplicate content:
http://www.magentocommerce.com/magento-connect/yoast-metarobots.html
When using Magento Connect 2.0: http://connect20.magentocommerce.com/community/Yoast_MetaRobots
For 1.0, use: magento-community/Yoast_MetaRobots
Also enable canonical URLs. You can find these settings in the admin panel:
System > Configuration > Catalog > Canonical links for categories
System > Configuration > Catalog > Canonical links for products
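With both settings enabled, a sorted category URL should emit a canonical tag pointing back at the clean category URL. An illustrative sketch of the output (the path /category.html is hypothetical, and the exact markup Magento emits may differ slightly):

```html
<!-- In the <head> of www.example.com/category.html?dir=asc&order=duration -->
<link rel="canonical" href="http://www.example.com/category.html" />
```

This tells Google that every sort/order variation is a version of the one canonical category page, which is why it helps with rankings even though crawlers still fetch the duplicate URLs.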
-
I'm actually a fan of selectively (programmatically) NOINDEX'ing like that. I find that the GWT parameter blocking doesn't always scale well. I'm running into a lot of clients trying to use it on 100s or 1000s (or millions, actually) of pages and Google is mostly ignoring it. Very frustrating.
We're working on features to let you ignore certain warnings/notices if you feel they don't apply, but I do believe in being proactive about indexation issues. I think they matter a lot more than they used to, especially post-Panda.
I would double-check to see if there's a Magento plug-in to help, as this could be a common problem. Unfortunately, we don't have any Magento experts on-staff. I'll leave this open as a discussion question, in case any members have specific expertise.
-
Is it worth trying to tackle this programmatically, e.g. if the URL includes dir=, limit= or order=, then include a noindex meta tag on that page?
It’s easy to exclude these parameters in Google Webmaster Tools, but again I’d really like to reduce the number of errors reported by SEOmoz, as currently I have 10,000 errors due to duplicate content!
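A minimal sketch of that logic, with the parameter names taken from the URLs in this thread (the helper names are hypothetical, and it's written in Python for illustration; in Magento you'd implement the same check in PHP, in a layout handle or observer, or just use the Yoast extension mentioned above):

```python
from urllib.parse import urlparse, parse_qs

# Query parameters added by Magento's sort controls, per the thread;
# extend this set to match your store's layered navigation.
NOINDEX_PARAMS = {"dir", "order", "limit"}

def needs_noindex(url: str) -> bool:
    """True if the URL carries sort parameters and should not be indexed."""
    params = parse_qs(urlparse(url).query)
    return any(name in NOINDEX_PARAMS for name in params)

def robots_meta(url: str) -> str:
    """Meta tag to emit in <head>. noindex,follow drops the duplicate
    from the index while still letting link equity flow through it."""
    if needs_noindex(url):
        return '<meta name="robots" content="noindex,follow" />'
    return '<meta name="robots" content="index,follow" />'
```

Note that noindex only addresses the index, not the crawl: SEOmoz (and Googlebot) will still fetch these URLs, so you may still see them in crawl reports.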
-
Hey Harald, thanks for your response. I've come across that article whilst googling the issue, but it doesn't specifically deal with the duplicate URLs being crawled and included in SEOmoz reports. As I say, I'm not too worried about any negative impact here, as I've implemented canonical URLs and I have a sitemap; however, it ruins my SEOmoz crawl diagnostic report by creating thousands of errors. Cheers, Al.
-
Hi Almenzies, as you mentioned, SEOmoz reports that there are thousands of pages with duplicate content issues, so below is a link which addresses them:
Solving the Duplicate Content Issues in Magento.
I hope that solves your query.
Related Questions
-
How to deal with duplicate pages on Shopify
Moz is alerting me that there are about 60 duplicate pages on my Shopify ecommerce site. Most of them are products. I'm not sure how to fix this, since the coding for my site is in Liquid, and I'm not sure if this is something I even need to be worried about. Most of these duplicate pages are a result of the product tags Shopify sites use to group products you tag with characteristics that the user can select in the product view. Here are a couple of URLs: https://www.mamadoux.com/collections/all/hooded https://www.mamadoux.com/collections/all/jumpers https://www.mamadoux.com/collections/all/menswear
Technical SEO | Mamadoux -
Duplicate Content Issues on Product Pages
Hi guys, just keen to gauge your opinion on a quandary that has been bugging me for a while now. I work on an ecommerce website that sells around 20,000 products. A lot of the product SKUs are exactly the same in terms of how they work and what they offer the customer. Often it is one variable that changes. For example, the product may be available in 200 different sizes and 2 colours (therefore 400 SKUs available to purchase). These SKUs have been uploaded to the website as individual entries so that the customer can purchase them, with the only differences between the listings likely to be key signifiers such as colour, size, price, part number etc. Moz has flagged these pages up as duplicate content. Now I have worked on websites long enough to know that duplicate content is never good from an SEO perspective, but I am struggling to work out an effective way in which I can display such a large number of almost identical products without falling foul of the duplicate content issue. If you wouldn't mind sharing any ideas or approaches that you guys have taken, that would be great!
Technical SEO | DHS_SH -
How long does it take for Google to deindex pages?
Hi Mozzers, we just launched a mobile website (parallel) and realized that it created a lot of duplicate content with the desktop URLs. I decided to add <meta name="robots" content="noindex, nofollow" /> to the entire mobile site. My only concern is that I am still seeing the mobile site indexed, even though it's been almost a week since I added these tags. Does anyone know how long it takes Google to deindex your content? Thanks
Technical SEO | Ideas-Money-Art -
Issue: Duplicate Page Content
Hi all, I am getting warnings about duplicate page content. The pages are normally 'tag' pages; I have some blog posts tagged with multiple tags. Does it really affect my site? I am using WordPress and the Yoast SEO plugin. Thanks
Technical SEO | KLLC -
Can dynamically translated pages hurt a site?
Hi all, looking for some insight please. I have a site we have worked very hard on to get ranked well, and it is doing well in search. The site has about 1,000 pages and climbing, and about 50 of those pages are translated static pages with unique URLs. I have had no problems here with duplicate content and that sort of thing, and all pages were manually translated, so no translation issues. We have been looking at software that can dynamically translate the complete site into a handful of languages, let's say about 5. My problem here is that these pages get produced dynamically, and I have concerns that Google will take issue with this, as well as with the huge sudden influx of new URLs, as now we could be looking at an increase of 5,000 new URLs (which usually triggers an alarm). My feeling is that it could risk the stability of the site that we have worked so hard for, and maybe we should just stick with the already translated static pages. I am sure the process could be fine, but I fear a manual inspection and a slap on the wrist for having dynamically created content, and also the risk of triggering a review period. These days it is hard to know what could get you in "trouble", and my gut says keep it simple as is and don't shake it up. Am I being overly concerned? Would love to hear from others who have tried similar changes, and also from those who have not due to similar "fear". Thanks
Technical SEO | nomad-202323 -
New Magento shop - how to best avoid duplicate content?
Hi all, my clients are about to have the latest version of the free Magento store set up. It will sell in at least two different languages, so this needs to be taken into account. Could any of you give some advice on the best way to avoid DC (duplicate content), if possible? The shop is for now clean (from DC), but from experience I know this will not continue... Thanks, best regards, Christian
Technical SEO | sembseo -
What's the best way to eliminate duplicate page content caused by blog archives?
I (obviously) can't delete the archived pages, regardless of how much traffic they do or don't receive. Would you recommend a meta robots tag or a robots.txt file? I'm not sure I'll have access to the root directory, so I could be stuck with utilizing a meta robots tag, correct? Any other suggestions to alleviate this pesky duplicate page content issue?
Technical SEO | ICM -
What's the SEO impact of URL suffixes?
Is there an advantage or disadvantage to adding an .html suffix to URLs in a CMS like WordPress? Plugins exist to do it, but it seems better for the user to leave it off. What do search engines prefer?
Technical SEO | Cornucopia