Which Is the Best Way to Handle Query Parameters?
-
Hi mozzers,
I would like to know the best way to handle query parameters.
Say my site is example.com. Here are two scenarios.
Scenario #1: Duplicate content
example.com/category?page=1
example.com/category?order=updated_at+DESC
example.com/category
example.com/category?page=1&sr=blog-header
All have the same content.
Scenario #2: Pagination
example.com/category?page=1
example.com/category?page=2 and so on.
What is the best way to solve both?
Do I need to use rel=next and rel=prev, or is it better to use Google Webmaster Tools parameter handling? Right now I am concerned about Google traffic only.
For solving the duplicate content issue, do we need to use canonical tags on each such URL?
I am not using WordPress. My site is built on Ruby on Rails platform.
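Since the site is Ruby on Rails, here is a minimal sketch in plain Ruby (not tied to any Rails helper) of how a canonical URL for scenario #1 could be derived: drop the parameters that don't change the content. The parameter list (`sr`, `order`) is taken from the question and is an assumption; adjust it to whatever your category pages actually use.

```ruby
require "uri"

# Hypothetical list of parameters that never change the page content
# (tracking and sorting, per the question's examples).
IGNORED_PARAMS = %w[sr order].freeze

# Build the canonical URL by removing content-neutral parameters
# and a redundant page=1.
def canonical_url(url)
  uri = URI.parse(url)
  params = URI.decode_www_form(uri.query.to_s)
  kept = params.reject do |key, value|
    IGNORED_PARAMS.include?(key) || (key == "page" && value == "1")
  end
  uri.query = kept.empty? ? nil : URI.encode_www_form(kept)
  uri.to_s
end

puts canonical_url("http://example.com/category?page=1&sr=blog-header")
# => http://example.com/category
```

The resulting string would go into a `<link rel="canonical" href="...">` tag in the layout's head, so every parameter variant points back at the clean category URL.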
Thanks!
-
The new pagination advice is really tough to navigate. I have mixed feelings about rel=prev/next (hard to implement, doesn't work on Bing, etc.) but it seems generally reliable. If you have pagination AND parameters that impact pagination (like sorts), then you need to use prev/next and canonical tags. See the post Alan cited.
I actually do think NOINDEX works fine in many cases, if the paginated pages (2+) have little or no search value. It really depends on the situation and the scope, though. This can range from no big deal at all to a huge problem, depending on the site in question, so it's tough to give general advice.
I'm not having great luck with GWT parameter handling lately (as Alan said), especially on big sites. It just doesn't seem to work in certain situations, and I have no idea why Google ignores some settings and honors others. That one's driving me crazy, actually. It's easy to set up and you can try it, but I wouldn't count on it working.
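To make the NOINDEX option mentioned above concrete, here is a sketch as a plain Ruby helper (the helper name is hypothetical, not a Rails built-in): pages 2+ get `noindex,follow`, so they drop out of the index while crawlers can still follow their links; page 1 stays indexable.

```ruby
# Hypothetical helper: return a robots meta tag for a paginated page,
# given a params hash like the one Rails provides. Pages 2+ are
# de-indexed but still followed; page 1 gets no tag at all.
def robots_meta_for(params)
  page = params.fetch("page", "1").to_i
  return "" unless page > 1
  %(<meta name="robots" content="noindex,follow">)
end

puts robots_meta_for("page" => "3")
# => <meta name="robots" content="noindex,follow">
```

In a Rails view this would be called from the layout head with the request's `params`, so no per-page manual work is needed.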
-
No, don't de-index them; just use prev/next.
Yes, you are right, it is only for Google. I really can't give you an answer on what to do for both; you could use a canonical for Bing only. It's a hard one.
See this page for more info: http://googlewebmastercentral.blogspot.com.au/2011/09/pagination-with-relnext-and-relprev.html
-
Which do you think is ideal?
De-indexing pages 2+, or simply using rel=next and rel=prev? That's also only for Google, right?
-
For the first scenario, use a canonical tag.
For the second, use the prev/next tags; to Google, this makes page one look like one big page with all the content of all the pages on it.
Don't use parameter handling. It is a last resort, it is only for Google (though Bing has its own), and its effectiveness has been questioned.
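A minimal sketch of the prev/next tags described in this answer, again as a plain Ruby helper (the name and URL scheme are illustrative): page 2's rel=prev points at the bare category URL, so the series starts at the same clean URL the canonical tag would name.

```ruby
# Illustrative helper: emit rel="prev"/"next" link tags for page
# `page` of `last_page` total, using bare, sort-free URLs so sort
# parameters don't fragment the pagination series.
def pagination_link_tags(base_url, page, last_page)
  tags = []
  if page > 1
    prev_url = page == 2 ? base_url : "#{base_url}?page=#{page - 1}"
    tags << %(<link rel="prev" href="#{prev_url}">)
  end
  if page < last_page
    tags << %(<link rel="next" href="#{base_url}?page=#{page + 1}">)
  end
  tags.join("\n")
end

puts pagination_link_tags("http://example.com/category", 2, 5)
# => <link rel="prev" href="http://example.com/category">
#    <link rel="next" href="http://example.com/category?page=3">
```

Because the helper takes the page number and total as arguments, it covers thousands of category pages from one place in the layout rather than requiring per-page edits.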
-
The problem is that we are talking about thousands of pages, and doing it manually is close to impossible. Even if it can be engineered, it will take a lot of time. Unless Webmaster Tools cannot effectively handle this situation, it doesn't make sense to go and change the site code.
-
Hi Mohit,
Seems like a waste of time to me when you can put a simple meta tag in there.
-
How about using parameter handling in Google Webmaster Tools to ignore ?page=1, ?order=updated_at+DESC and so on? Does that work instead of including canonical tags on all such pages?
-
I can speak to the first scenario: that is exactly what rel="canonical" is for. Dynamic pages that have a purpose for URL appendages, or the rare case where you can't control your server (.htaccess) for 301 redirects.
As for pagination, I may not have the best answer, as I have also been using rel="canonical" in those cases as well.