GWT, Editing URL Parameters for Ecommerce Features
-
I have had URL parameter handling set to "let Googlebot decide" on an ecommerce site running Magento. The products I sell come in different sizes, colors, finishes, etc.
These parameters are showing up in Google Webmaster Tools, all set to "let Googlebot decide". Some of them have as many as 8 million URLs monitored.
I edited these parameters to classify them as "narrows searches", but still left the crawl option at "let Googlebot decide" (versus blocking the URLs).
Will blocking these extraneous URLs serve any benefit? Does blocking them help with crawling/SEO?
-
Magento will generally canonicalize correctly by default. What I'm guessing you're seeing are category pages that have variables for faceted navigation - again, a pretty common issue in Magento. I've seen and even "fixed" this issue for past clients with no noticeable results.
If you ARE having problems getting your pages into Google's index (check indexation status in Webmaster Tools and search for exact product/category URLs to check), there is a plugin that will remove the variables for faceted navigation on category pages. The main downside is that someone who emails/shares a link will not share it with the filters/sorting attached.
Generally I recommend leaving it alone: if you're not having indexation problems, it's fine to let Google figure it out.
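To illustrate what collapsing faceted-navigation variants looks like in practice, here is a minimal sketch using only Python's standard library. The parameter names are hypothetical, not Magento's actual layered-navigation parameters:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical faceted-navigation parameters; a real Magento site would
# use whatever names its layered-navigation module generates.
FACET_PARAMS = {"color", "size", "finish", "sort", "filter_brand"}

def canonical_url(url):
    """Strip faceted-navigation parameters so that every filtered
    variant of a category URL collapses to one canonical version."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FACET_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

# Non-facet parameters (like pagination) survive; the facets are dropped:
print(canonical_url("https://example.com/widgets?color=blue&size=8&page=2"))
```

This is the same idea the parameter settings in Webmaster Tools express declaratively: tell Google which parameters change content and which merely filter it.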
-
Hi Nathan,
I tend to agree with Patrick: if you aren't seeing any issues with SERP rankings, you may not want to change anything.
If you plan to go ahead, here are my thoughts.
Don't block the pages; instead, look into canonical tags. I asked a similar question yesterday and was pointed to some articles that proved useful.
Moz's guide on canonicalization would be my first go-to.
Plus there is always Google's guide on duplicate content, which I imagine you have already looked at.
If it's a case of TL;DR: there is little damage that can be done with correctly set up canonical tags, so get in touch with your web developer, or hit the research harder and set them up yourself either way. Adding canonical tags will do no harm and helps rule out any duplicate content concerns.
Hope that helps,
Richard
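As a quick way to sanity-check that canonical tags are actually in place, here is a sketch using only the Python standard library; the page markup and URL are made up for illustration:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of a <link rel="canonical"> tag, if present."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

def find_canonical(html):
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonical

page = '<html><head><link rel="canonical" href="https://example.com/cup-cakes/strawberry"></head></html>'
print(find_canonical(page))
```

Running something like this over a sample of filtered and unfiltered category URLs (after fetching their HTML) would confirm that the variants all point at the same canonical page.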
-
Hi there
I wouldn't block these pages, so long as you have your parameters set up in Google Webmaster Tools and have indicated your paginated content.
Remember, there are opportunities with canonical tags as well, so again, I see no reason to block these pages.
As long as you take proper steps to avoid duplicate content and let engines know what is happening, you should be fine.
Hope this helps, this is just my opinion - good luck!
Related Questions
-
URL structure for an international website with subdirectories
Hello, the company I am working for is launching a new ecommerce website (just a handful of products). In the first phase, the website will be English only, but it will be possible to order internationally (20 countries). In a second phase, new languages and countries will be added. I am wondering what is the best URL structure for launch:
1) Start with a structure similar to website.com/language/content (later on we will add other languages than English)
2) Start with a structure similar to website.com/country/content
3) Start with a structure similar to website.com/country-language/content (at the beginning it will be all website.com/country-en/content)
What do you think? Cheers
Luca
Intermediate & Advanced SEO | Lvet
-
URL Rewriting Best Practices
Hey Moz! I'm getting ready to implement URL rewrites on my website to improve site structure/URL readability. More specifically I want to:
1) Improve our website structure by removing redundant directories.
2) Replace underscores with dashes and remove file extensions from our URLs.
Please see my example below:
Old structure: http://www.widgets.com/widgets/commercial-widgets/small_blue_widget.htm
New structure: https://www.widgets.com/commercial-widgets/small-blue-widget
I've read several URL rewriting guides online, all of which seem to provide similar but overall different methods to do this. I'm looking for what's considered best practice for implementing these rewrites. From what I understand, the most common method is to implement rewrites in our .htaccess file using mod_rewrite (which will find the old URLs and rewrite them according to the rules I implement).
One question I can't seem to find a definitive answer to: when I implement the rewrite to remove file extensions and replace underscores with dashes in our URLs, do the webpage file names need to be edited to the new format? From what I understand the webpage file names must remain the same for the rewrites in the .htaccess to work, but our internal links (including canonical links) must be changed to the new URL format. Can anyone shed light on this?
Also, I'm aware that implementing URL rewriting improperly could negatively affect our SERP rankings. If I redirect our old website directory structure to our new structure using this rewrite, are my bases covered in regards to having the proper 301 redirects in place? Please offer any advice/reliable guides to handle this properly. Thanks in advance!
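The old-to-new mapping described above can be sketched as a pure function. This is only an illustration of the transformation; the actual 301 redirects would be mod_rewrite rules in .htaccess, and the `/widgets/` prefix is taken from the poster's example:

```python
from urllib.parse import urlparse
import posixpath

def new_url(old_url):
    """Mirror the rewrite described above: drop the redundant
    /widgets/ directory, replace underscores with dashes, and
    strip the file extension."""
    parts = urlparse(old_url)
    path = parts.path
    # Remove the redundant top-level directory.
    if path.startswith("/widgets/"):
        path = path[len("/widgets"):]
    # Strip the extension and swap underscores for dashes.
    path = posixpath.splitext(path)[0].replace("_", "-")
    return "https://" + parts.netloc + path

print(new_url("http://www.widgets.com/widgets/commercial-widgets/small_blue_widget.htm"))
```

Having the mapping written down once makes it easier to generate both the redirect rules and the updated internal/canonical links from the same source, so the two can't drift apart.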
Intermediate & Advanced SEO | TheDude
-
Confused: Url Restructure
Hello, we're giving our website a bit of a spring clean in terms of SEO. The site is doing OK, but after the time invested in SEO, content, and last year's migration of multiple sites into one, we're not seeing the increase in traffic we had hoped for.
Our current URLs look something like this: /a-cake-company/cup-cakes/strawberry
We have the company name as the first level because with the migration we merged many companies into one site. What we're considering is testing some pages with a structure like this: /cup-cakes/cup-cake-company-strawberry
So we'll lose a level and focus more on the category of the product rather than the brand. What are your thoughts on this? We weren't going to do a mass change yet, just a test, but is this something we should be focusing on? In terms of organisation our current URL structure is perfect, but what about from an SEO point of view? In terms of keywords, customers are looking for both options. Thanks!
Intermediate & Advanced SEO | HB17
-
Is it best to have products and reviews on the same URL?
Hi Moz, Is it better to have products and reviews on the same or different URLs? I suspect that combining these into one page will help with rankings overall even though some ranking for product review terms may suffer. This is for a hair products company with tens of thousands if not hundreds of thousands of reviews. Thanks for reading!
Intermediate & Advanced SEO | DA2013
-
Page URL keywords
Hello everybody, I've read that it's important to put your keywords at the front of your page title, meta tags, etc., but my question is about the page URL. Say my target keywords are exotic, soap, natural, and organic. Will placing the keywords further back in the URL affect SEO ranking? If so, what are the first n words Google considers?
For example: www.splendidshop.com/gift-set-organic-soap vs. www.splendidshop.com/organic-soap-gift-set
Will the first be any less effective than the second simply because the keywords are placed later?
Intermediate & Advanced SEO | ReferralCandy
-
Using abbreviations in URL - Matching Keyword
We have a website that uses /us/, /ca/, /va/, etc for URLs of the different U.S. states. How much better is it (or is it at all better) to use /california/ or /virginia/ instead in our URLs to rank for searches that include the name of those states?
Intermediate & Advanced SEO | Heydarian
-
Rewriting dynamic urls to static
We're currently working on an SEO project for http://www.gear-zone.co.uk/. After a crawl of their site, tons of duplicate content issues came up. We think this is largely down to the use of their brand filtering system, which works like this: by clicking on a brand, the site generates a URL with the brand keywords in it. For example:
http://www.gear-zone.co.uk/3-season-synthetic-cid77.html filtered by the brand Mammut becomes:
http://www.gear-zone.co.uk/3-season-synthetic-Mammut-cid77.html?filter_brand=48
This was done by a previous SEO agency in order to prevent duplicate content. We suspect that this has made the issue worse, though, as removing the dynamic string from the end of the URL displays the same content as the unfiltered page. For example:
http://www.gear-zone.co.uk/3-season-synthetic-Mammut-cid77.html shows the same content as:
http://www.gear-zone.co.uk/3-season-synthetic-cid77.html
Now, if we're right in thinking that Google is unlikely to crawl the dynamic filter, this would seem to be the root of the duplicate issue. If this is the case, would rewriting the dynamic URLs to static on the server side be the best fix? It's a Windows Server/ASP site. I hope that's clear! It's a pretty tricky issue and it would be good to know your thoughts. Thanks!
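The server-side rewrite being asked about amounts to recovering the filter query string from the static brand URL. A hypothetical sketch of that mapping in Python follows; on a Windows/ASP site the real rules would live in the IIS URL Rewrite module, and the brand-to-ID table here is invented from the one example in the question:

```python
import re

# Hypothetical brand-name -> filter ID table; the real mapping would
# come from the site's database.
BRAND_IDS = {"Mammut": 48}

def add_filter_query(path):
    """If a static category path embeds a known brand name (the
    /<category>-<Brand>-cid<NN>.html form above), reattach the
    dynamic filter_brand parameter internally so the filtered
    content is served rather than the unfiltered category page."""
    m = re.match(r"^/.+-(?P<brand>[A-Za-z]+)-cid\d+\.html$", path)
    if m and m.group("brand") in BRAND_IDS:
        return path + "?filter_brand={}".format(BRAND_IDS[m.group("brand")])
    return path

# The branded static URL maps back to its dynamic equivalent;
# the plain category URL passes through unchanged:
print(add_filter_query("/3-season-synthetic-Mammut-cid77.html"))
print(add_filter_query("/3-season-synthetic-cid77.html"))
```

With an internal rewrite like this in place, each static brand URL would serve distinct filtered content instead of duplicating the unfiltered page.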
Intermediate & Advanced SEO | neooptic