URL Parameter for Limiting Results
-
We have a category page that lists products. The page takes a limit parameter, and the default is to display 9 products; if they wish, users can view 15 or 30 products on the same page (the parameter is ?limit=9, ?limit=15, and so on). Google is flagging these URLs as having duplicate title tags and meta descriptions via HTML Suggestions. I have a couple of questions.
1. What should my goal be? Should Google crawl the page with 9 items, or the page showing all items in the category?
In Search Console, the first step of setting up a URL parameter asks "Does this parameter change page content seen by the user?". I think the answer is Yes.
Then, when I select how the parameter affects page content, I assume I'd choose Narrows, since it either narrows or expands the number of items displayed on the page.
2. When setting up my URL Parameters in Search Console, do I want to select Every URL, or just let Googlebot decide? I'm torn, because the description of Every URL says this setting could result in Googlebot unnecessarily crawling duplicate content on your site (it's already doing that). Reading further, I begin to second-guess the Narrows option. Now I'm at a loss as to what to do.
Any advice or suggestions will be helpful! Thanks.
-
Thanks for your help, David - I apologize for my delayed response.
-
Hi Dustin,
Looks like the problem is that you have two canonical tags on your parameter pages.
e.g., on lines 24 and 25 of the source code for this page https://www.stickylife.com/custom/vinyl-decals?limit=30, you'll see:
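Two canonical tags back to back, something like this (a sketch - the actual hrefs in your page source may differ):

```html
<!-- Illustrative sketch: two canonical tags on one page, each claiming a different canonical URL -->
<link rel="canonical" href="https://www.stickylife.com/custom/vinyl-decals" />
<link rel="canonical" href="https://www.stickylife.com/custom/vinyl-decals?limit=30" />
```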
With more than one canonical tag on a page, Google will ignore both canonical tags - which is why you are getting duplicate issues.
You'll need to remove the second canonical tag to resolve the issue.
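Once fixed, each ?limit variant should carry exactly one canonical tag - for example, pointing back at the default category URL, if that's the version you want indexed (again a sketch; canonicalize to whichever URL you prefer):

```html
<!-- Sketch: the single remaining canonical tag on /custom/vinyl-decals?limit=30, -->
<!-- pointing at the one URL you want Google to index -->
<link rel="canonical" href="https://www.stickylife.com/custom/vinyl-decals" />
```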
Cheers,
David
Related Questions
-
Google Fetch and Render - Partial result (resources temporarily unavailable)
Over the past few weeks, my website pages have been showing as Partial in Google Search Console. There are many resources/files (JS, CSS, images) that are 'temporarily unreachable'. The website files haven't had any structural changes for about 2 years (it has historically always shown as 'completed' and rendered absolutely fine in Search Console). I have checked, and the robots.txt is fine, as is the sitemap. My host hasn't been very helpful, but has confirmed there are no server issues. My website rankings have now dropped, which I think is due to these resource issues, and I need to clear this up ASAP - can anyone here offer any assistance? It would be hugely appreciated. Thanks, Dan
SERP Trends | dan_550
-
Search results vary in Chrome vs. other browsers, even in Incognito mode: Google's stand?
Hi all, We use incognito mode or private browsing to check the actual results, which are not impacted by previous history, location (sometimes), etc. Even when we browse this way, we see different search results. Why would this happen? What's Google's stand on this? What is the right way to browse to get unbiased results for certain search queries? I have found that Chrome ranks our own websites a bit higher compared to other browsers, even in incognito mode. Thanks
SERP Trends | vtmoz1
-
Google News results... can they be SEOed?
Hello everyone. I simply wanted to know if anyone has useful insight on what it takes for a legitimate website to appear in the Google News results. I have rarely, if ever, had to dabble in this kind of SEO, but after coming across a situation with a perfectly legitimate website, I'm now scratching my head. The site in question is a very well-established website with zero "SEO" done to it. All links are organic, all traffic is legit, and they have a VERY strong social media presence. The site's current DA is 50. It's a three-letter domain. Some of the points I believe are important:
- quantity and quality of content (% of aggregated vs. actually original content)
- overall % of "news" content vs. the rest of the site's content
- authors'/writers' credentials (how would Google evaluate the authority of a writer, so that his/her content is newsworthy?)
- overall site authority
- rich snippets and the markup needed to be indexed? I think rel=publisher or rel=author tags have something to do with it
- making sure basic SEO is in place: canonical tag, unique headers, etc.
What am I missing? They have one particular competitor that seems to rank for almost everything news-related, while being a similar site in content and authority; however, they are nowhere. They have submitted to Google News before (I'm not even sure what that means) but failed to be included - does this put a "stain" on them for any reason, or impede the possibility of being indexed in the Google News results in the future? ANY input is appreciated.
SERP Trends | 1stOTLJK0
-
External URLs - ones you can't reach out to
My fellow Mozzers, I have been reviewing our Google Webmaster error reports and noticed a high number of URL errors. These URLs are generated from international sites, mainly in China. Upon further inspection, these look to be links to dynamic URLs that are no longer active on our site (search pages). Chinese sites are linking to old URLs that simply spew out 'Bad Request' pages now. The problems I face are:
- I can't contact these Chinese sites to remove/edit the URLs.
- I could work with my developers to identify the URLs and redirect them all to the homepage, but is that a good idea? The URLs would still be present.
- Some of these look like pages that haven't been updated in a while, so now I have links from sites that are archived, or "dead".
Have you tackled anything like this before? Thoughts are welcome. Thanks
SERP Trends | Bio-RadAbs0
-
Should URLs Follow the Navigation of a Site?
I recently watched an SEOMoz webinar in which the presenter made a case for eliminating folders in URLs, as these could confuse a bot when crawling a site. I love the idea. However, there are still a lot of best-practice guidelines out there suggesting there should be logic in the URL, just as there should be in your navigation. My question is whether there is any value for a bot in crawling a URL that follows the navigation by "stuffing" the URL with folders identical to the navigation present on the site, or even to a secondary navigation present on all pages. Example: the navigation of a site goes [Domain > Folder > Sub-folder > Sub-sub-folder > Product]. What is the benefit of using a URL such as [www.domain.com/folder/sub-folder/sub-sub-folder/product] vs. [www.domain.com/product]? Thank you guys for your insights! PS: this is a WP site we are talking about.
SERP Trends | Discountvc0
-
Long URL Warning
Dear experts, I have 1,490 warnings for having long URLs. These URLs are generated automatically by PrestaShop from the product titles, and they are very readable. Can you please advise on the impact of these long URLs on my SEO, and how I can shorten them given that they are automatically generated? Regards,
SERP Trends | kanary0
-
Does Google index search results pages from other search engines?
I've noticed that a lot of backlinks to sites consist of SERPs from search engines other than Google - a link to a query like http://searchengine.com/?q=apple, for instance. Does Google index these links, and do they pass any value? Regards, Henrik
SERP Trends | euromillions0