URL Parameters
-
On our webshop we've added some URL parameters, such as min_price, filter_cat, and filter_color, and set them to "Don't crawl" in Google Search Console. We see that some parameters have 100,000+ URLs and some have 10,000+.
Is it better to add these parameters to the robots.txt file? And if so, how should we write the rules so those URLs will not be crawled?
Our robots.txt file currently shows:
# Added by SEO Ultimate's Link Mask Generator module
User-agent: *
Disallow: /go/
# End Link Mask Generator output

User-agent: *
Disallow: /wp-admin/
-
Hi,
You might want to read the article on faceted navigation on the Google Webmaster blog, which gives some good advice on how to handle this situation. What to use depends a bit on your actual setup.
Options include using nofollow links, using a separate subdomain, or blocking in robots.txt (using a separate folder). On Moz there is also this article (see the part on faceting); it's mainly about listing sites, but the core problem is more or less the same.
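If you do go the robots.txt route, a minimal sketch might look like the following. The parameter names are the ones from your question; Google supports `*` wildcards in Disallow rules, so each line blocks any URL whose query string contains that parameter:

```text
User-agent: *
# Block any URL whose query string contains these filter parameters
Disallow: /*?*min_price=
Disallow: /*?*filter_cat=
Disallow: /*?*filter_color=
```

Bear in mind that robots.txt blocks crawling only; URLs that are already indexed will not necessarily drop out of the index just because they are blocked.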
Hope this helps,
Dirk
Related Questions
-
Changing site URL structure
Hey everybody, I'm looking for a bit of advice. A few weeks ago Google sent me an email saying all pages with any text input on them need to switch to HTTPS. This is no problem; I was slowly switching the site to HTTPS anyway using 301 redirects. However, my site also has a language subfolder in the URL: mysite.com/en/, mysite.com/ru/, etc. Due to poor work on my part, the translations of the site haven't been updated in a long time and lots of the pages are in English even on the Russian version. So I'm thinking of removing this URL structure and just having mysite.com. My plan is to 301 all requests to HTTPS and remove the language subfolder in the URL at the same time. So far the HTTPS switch hasn't changed my rankings. Am I more at risk of losing my rankings by doing this? Thanks!
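As a rough sketch of how both changes could be done in a single 301 hop with Apache (mod_rewrite), assuming an .htaccess setup and using example.com as a stand-in for the real domain:

```text
# Hypothetical .htaccess sketch: strip the language folder and force HTTPS in one 301
RewriteEngine On

# Redirect /en/... and /ru/... to the same path without the language folder
RewriteRule ^(en|ru)/(.*)$ https://example.com/$2 [R=301,L]

# Force HTTPS for everything else
RewriteCond %{HTTPS} !=on
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
```

Doing both in one rule set avoids chaining two redirects (http → https → no-language-folder), which is gentler on crawl budget and link equity.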
Technical SEO | | Ruhol0 -
Old URLs Appearing in SERPs
Thirteen months ago we removed a large number of non-corporate URLs from our web server. We created 301 redirects, and in some cases we simply removed the content as there was no place to redirect to. Unfortunately, all these pages still appear in Google's SERPs (not Bing's) for both the 301'd pages and the pages we removed without redirecting. When you click on the pages in the SERPs that have been redirected, you do get redirected, so we have ruled out any problems with the 301s. We have already resubmitted our XML sitemap, and when we run a crawl using Screaming Frog we do not see any of these old pages being linked to on our domain. We have a few different approaches we're considering to get Google to remove these pages from the SERPs and would welcome your input:
1. Remove the 301 redirect entirely so that visits to those pages return a 404 (much easier) or a 410 (would require some setup/configuration via WordPress). This of course means that anyone visiting those URLs won't be forwarded along, but Google may not drop those redirects from the SERPs otherwise.
2. Request that Google temporarily block those pages (done via GWMT), which lasts for 90 days.
3. Update robots.txt to block access to the redirecting directories.
Thank you. Rosemary
Technical SEO | | RosemaryB3 -
How important is keyword usage in the URL?
Hi,
We have a client who has engaged us recently for some SEO work, and most of their website looks pretty good SEO-wise already. Many of their site pages rank at the top or middle of page two for their targeted keywords. In many cases they are not using the targeted keyword in the URL, and most pages could use some additional on-page clean-up. My question is: is it worth it to rewrite the URLs to include the targeted keyword and then do 301 redirects to send the old pages to the new ones in order to improve the ranking? Or should we just do the minor on-page work in hopes that this will be enough to improve the rankings and push them onto the first page. Thanks.
Technical SEO | | Whebb0 -
Friendly URLs (SEO URLs)
Hello, I own an eCommerce site with more than 5k products. Product URLs look like: www.site.com/index.php?route=product/product&path=61_87&product_id=266. I'm thinking about making them SEO-friendly: site.com/category/product-brand. Here is my question: will I lose rankings by making that change? It's very important for me to know. Thank you very much!
Technical SEO | | matiw0 -
Friendly URL
Can friendly URLs be implemented on a custom-made job site using mod_rewrite / Apache without any big interference to the system itself? Thank you.
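In principle, yes: mod_rewrite can map friendly URLs onto the existing scripts internally, without the system knowing. A minimal sketch, assuming a hypothetical job.php script that looks jobs up by slug (the path and parameter names here are illustrative, not from the actual site):

```text
# Hypothetical .htaccess sketch: serve /jobs/web-developer from the existing script
RewriteEngine On
RewriteRule ^jobs/([a-z0-9-]+)/?$ /job.php?slug=$1 [L,QSA]
```

An internal rewrite like this leaves the application untouched, but the application still has to be changed to output the friendly URLs in its links, and the old parameterized URLs should 301 to the new ones to avoid duplicate content.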
Technical SEO | | tomaz770 -
URL query strings and canonical tag
Hi, I have recently had my comparison website redesigned and developed on WordPress, and the site is now 90% complete. Part of the redesign has meant that there are now dynamic URLs in the format: http://www.mywebsite.com/10-pounds-products/?display=cost&value=10 I have other pages similar to this but with different content for the different price ranges, and these are linked to from the menus: http://www.mywebsite.com/20-pounds-products/?display=cost&value=20 Now my questions are: 1. I am using Joost's All-in-one SEO plugin, and this adds a canonical tag to the page pointing to http://www.mywebsite.com/10-pounds-products/, which is the permalink. Is this OK as it is, or should I change it to http://www.mywebsite.com/10-pounds-products/?display=cost&value=10? 2. Which URL will get indexed, what gets shown as the display URL in the SERPs, and what page will users land on? I'm a bit confused, so apologies if these seem like silly questions. Thanks
Technical SEO | | bizarro10000 -
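As a general illustration for the canonical question above (using the hypothetical URLs from that question): when the parameter variants show essentially the same content as the permalink, the canonical tag usually points at the clean version, so the variants consolidate into one indexed URL:

```text
<!-- Emitted on /10-pounds-products/?display=cost&value=10 -->
<link rel="canonical" href="http://www.mywebsite.com/10-pounds-products/" />
```

With this in place, the clean permalink is normally what gets indexed and shown in the SERPs, while visitors who click a link to the parameterized URL still land on it as usual.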
Handling '?' in URLs.
Adios! (or something), I've noticed in my SEOmoz campaign that I am getting duplicate content warnings for URLs with query strings. For example: /login.php?action=lostpassword /login.php?action=register etc. What is the best way to deal with these types of URLs to avoid duplicate content penalties in search engines? Thanks 🙂
Technical SEO | | craigycraig0