Just read Travis Loncar's YouMoz post and I have a question about Pagination
-
This was a brilliant post.
I have a question about Pagination on sites that are opting to use Google Custom Search. Here is an example of a search results page from one of the sites I work on:
http://www.ccisolutions.com/StoreFront/category/search-return?q=countryman
I notice in the source code of sequential pages that the rel="next" and rel="prev" tags are not used. I also noticed that the URL does not change when clicking on the numbers for the subsequent pages of the search results.
Also, the canonical tag of every subsequent page looks like this:
<link rel="canonical" href="http://www.ccisolutions.com/StoreFront/category/search-return" />
Are you thinking what I'm thinking? All of our Google Custom Search pages have the same canonical tag. Something's telling me this just can't be good.
Questions:
1. Is this creating a duplicate content issue?
2. If we need to include rel="prev" and rel="next" on Google Custom Search pages as well as make the canonical tag accurate, what is the best way to implement this?
Given that searchers type in such a huge range of search terms, it seems that the canonical tags would have to be somehow dynamically generated.
Or, (best case scenario!) am I completely over-thinking this and it just doesn't matter on dynamically driven search results pages?
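For what it's worth, if the canonical did need to be dynamically generated per query, a rough sketch might look like this (Python, using the base URL from the example above; the `page` parameter name is just an assumption, not something the platform actually exposes):

```python
from urllib.parse import urlencode

BASE = "http://www.ccisolutions.com/StoreFront/category/search-return"

def canonical_tag(query, page=1):
    """Build a canonical tag reflecting the actual search URL,
    instead of pointing every results page at the same bare URL."""
    params = {"q": query}
    if page > 1:
        params["page"] = page  # hypothetical paging parameter
    return '<link rel="canonical" href="%s?%s" />' % (BASE, urlencode(params))
```

So `canonical_tag("countryman")` would emit a canonical that matches the query actually being viewed, rather than the one-size-fits-all tag the site currently outputs.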
Thanks in advance for any comments, help, etc.
-
Considering that the larger of the two sites I work on is on a platform from 1996, I might actually be living "back in the day!" lol - Thanks again Jared!
-
This would all depend on what the site was built on, and its flexibility. There's no question that this can be done. "Back in the day" we had a few sites that had tens of thousands of pages due to sorting, and we had everything generated, including:
Title, meta description, meta keywords, breadcrumb, H1 and short description.
Those were the days!!!
-
For the most part, I would choose to use rel=prev/next for pagination, whether the paginated URLs are dynamic or static. There are some cases (as with this original thread question) where you should use canonical, but as a whole you should use rel=prev/next.
The best way to explain it is:
Rel Prev/Next:
Your site: Hi Google, I have all of these pages that are very similar, so I'm just letting you know that I only have duplicate content here for usability reasons and am in no way implying that you should index all of these pages and rank them #1!
Google: Ok great, thanks for letting us know. We'll index the pages we feel are appropriate, but you won't get penalized for duplicate content. We may only index and serve one page, "page 1", or we may index multiple pages. Thanks for letting us know.
Canonical:
Your site: Hi Google, I have all these paginated pages that look like duplicate content. Please do not include any of them in your index, and don't penalize me for duplicate content. For the record, the page you should index is Page 1 and no other pages. Any links that point to the paginated pages should be counted towards Page 1*.
Google: Great, no matter what we will not index any pagination and only Page 1.
With rel=next you are simply letting Google know, not dictating how Google should act on the situation. In fact, with ecommerce sites you'll find that a lot of times when you use rel=next, Google will actually index the "view all" page if you have "view all" as an option around your pagination links.
*Many articles suggest that link juice is passed to the canonical URL - I have not seen any direct evidence of this, but it's worth a different discussion.
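A minimal sketch of generating those tags server-side (Python; the ?page= URL pattern is an assumption, and in practice page 1 is often the bare URL with no parameter):

```python
def pagination_links(base_url, page, total_pages):
    """Emit rel="prev"/rel="next" link tags for page `page` of `total_pages`.
    The first page gets no prev tag; the last page gets no next tag."""
    tags = []
    if page > 1:
        tags.append('<link rel="prev" href="%s?page=%d" />' % (base_url, page - 1))
    if page < total_pages:
        tags.append('<link rel="next" href="%s?page=%d" />' % (base_url, page + 1))
    return "\n".join(tags)
```

For example, page 2 of 5 would emit a prev tag pointing at page 1 and a next tag pointing at page 3, which is exactly the "just letting you know" signal described above.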
-
Yes, Jared, this is a great answer. I understand completely. It looks like we are ok then with Google Custom Search as it is. Thanks so much for your thoughtful answer. Now, if we can only get our paginated category pages sorted out, we'll be on the right track!
-
Hi Gerd,
Yes, this is a separate issue we are also struggling with on the site. I believe Travis' YouMoz post from yesterday made a pretty good case for using multiple paginated URLs, and he even illustrated how to accomplish this with sorting parameters like "color" and "price".
You raise a very good point about duplicate titles and descriptions potentially being a problem in this scenario.
Does anyone have any ideas about how to handle that? Could the backend be programmed to dynamically create unique titles and descriptions based on some rules for naming conventions? (Assuming you have access to that level of the code, of course.)
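To illustrate the kind of naming-convention rule I mean, here's a rough Python sketch (the title/description patterns and the site name are made up for illustration, not from any real platform):

```python
def page_title(category, page, site="Example Store"):
    """Naming-convention rule: append the page number so paginated
    titles are unique instead of duplicated across the series."""
    title = "%s - %s" % (category, site)
    if page > 1:
        title = "%s - Page %d" % (title, page)
    return title

def page_description(category, page):
    """Same idea for meta descriptions: a base template plus a
    page-number suffix on pages after the first."""
    desc = "Shop our selection of %s." % category.lower()
    if page > 1:
        desc += " Page %d of results." % page
    return desc
```

Even something this simple would clear up the duplicate titles/descriptions that GWMT flags on paginated series.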
Really interested to know some points of view on this!
Dana
-
I raised a similar question in the following Q&A - http://www.seomoz.org/q/duplicate-title-tags-with-pagination-and-canonical
My concern or question (we have rel=prev/next) would be more about what the canonical should be. There seem to be different opinions:
1. Use the current paginated page as the canonical - in our case GWMT reports duplicate titles (I suppose appending a page-number should sort this out)
2. Use the base search URL as the canonical - perhaps not a bad choice if your site's content changes and Google indexes page 50, but over time you only have results for 40 pages (resulting in an empty result page)
For now I can only conclude that having prev/next implemented is a good thing, as it will hint Google about the pagination (in addition to setting up the URL parameters in GWMT). I do plan to change the canonical to the base search URL (and not have multiple paginated URLs) and see how this will affect indexing and SERPs.
-
Hi Dana - Let me see if I understand this correctly:
In question 1 you asked if this would be a duplicate content issue. The canonical tag retains the exact same URL regardless of the search parameter (and the resulting search results). Therefore, regardless of the search being made, Google and other crawlers will not index a page with a search parameter, since the canonical references the original URL (http://www.ccisolutions.com/StoreFront/category/search-return). This means that when Google accidentally lands on http://www.ccisolutions.com/StoreFront/category/search-return?q=countryman it sees the canonical tag and understands that it should not index this page, as it is only a variation of the core page.
This would of course be a problem if you actually wanted Google to index every query page. Alternate methods would be to exclude the query parameter in WMT or robots.txt. But the canonical is built in for you so that you don't have to.
In situations like this I also like to add site search to analytics and block the query parameter so no query pages show up as landing pages.
-
I understand exactly what you are saying, Jared. However, here's the problem: the canonical tag is exactly the same for every single subsequent page in a series across the entire site.
No matter what is searched, the canonical tag remains:
<link rel="canonical" href="http://www.ccisolutions.com/StoreFront/category/search-return" />
Wouldn't that mean that all search results pages, regardless of search term, are viewed as the same page?
I have heard this discussed before, come to think of it. In this case, wouldn't it be proper to block all dynamic search results pages from being crawled or indexed by Google via the .htaccess file or robots.txt file?
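For example, a robots.txt rule to keep crawlers out of the dynamic search results might look like this (assuming all search results live under that search-return path; Disallow matches by URL prefix, so the trailing ? blocks only the query variations, not the bare page):

```
User-agent: *
Disallow: /StoreFront/category/search-return?
```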
-
Hi Dana -
I think in the case of Google Custom Search, there is no need to worry about duplication. The reason is that although the rel="prev" etc tags are not being used, a blanket solution already exists: the canonical tag. As you mentioned, the canonical tag never changes, regardless of the search - therefore the crawlers only ever see the Custom Search page as a single page regardless of the queries being made. Thus there is no duplicate issue.
-
I use Google Custom Search on my site and love it. I would say you have some valid concerns. At first it was a bit of a pain because some of the images didn't line up with the products, but after a few weeks it worked itself out. We had a 47% increase in conversion from using Google Custom Search. I use an out-of-the-box type web service, so I cannot help you with a few of the questions, but there is a lot of customization you can do to fix what you described. Bringing in our blog and recipe section was the purpose for trying it, and the revenue proved it to be a wise decision.