Canonical pagination content
-
Hello
We have a large ecommerce site, and as you know, ecommerce sites are prone to canonical issues. I have read various sources on canonical best practices for ecommerce sites, but I am still not sure.
My concern is pagination on category product listing pages. Each paginated page lists different products, but the meta data is the same. Should I canonicalize, say, page 2 or 3 to the main category page, or leave them as they are and let those pages be indexed?
Another issue is filters. When I am on any page and filter by price or manufacturer, the page is basically the same, so it looks like a duplicate content issue. Should I canonicalize those filtered result pages to the category page?
So basically, if I let Google crawl my paginated content and only add canonical tags to the pages produced by filtered searches, would that be best practice? And would parameter handling in Google Webmaster Tools be helpful in this scenario?
Please feel free to ask if you have any questions.
regards
Carl -
Google just announced some tags to better support pagination. They say that if you have a view-all option that doesn't take too long to load, searchers generally prefer it, so you can rel=canonical from your series pages to that page. If you don't have a view-all page, you can add these nifty rel="next" and rel="prev" tags to let Google know your page is part of a paginated series, and where the next and previous pages are.
View all: http://googlewebmastercentral.blogspot.com/2011/09/view-all-in-search-results.html
next/prev: http://googlewebmastercentral.blogspot.com/2011/09/pagination-with-relnext-and-relprev.html
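To make that concrete, here is a sketch of what the head section of a hypothetical page 2 in a paginated series might contain (the URLs are illustrative, not from your site):

```html
<!-- On /category?page=2: tell Google about the neighboring pages in the series -->
<link rel="prev" href="https://www.example.com/category?page=1">
<link rel="next" href="https://www.example.com/category?page=3">

<!-- OR, if a fast-loading view-all page exists, canonicalize the series to it -->
<link rel="canonical" href="https://www.example.com/category?view=all">
```

You would normally pick one approach or the other: next/prev tags when there is no view-all page, or rel=canonical to the view-all page when there is one.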
-
I checked your site, and I don't know whether you have already changed it, but it looks pretty good. I have dealt with much more hardcore cases: tons of products in each category, several filters that could be freely combined, and pagination on top of all that. Those had a lot of canonical issues, so your case is an easy ride, believe me.
Here are a few tips, along with the reasoning behind each:
1) cutting back your navigation on deeper pages
I just quickly checked how many links are included in your site-wide navigation using Google Spreadsheet:
=ImportXML("http://www.cnmonline.co.uk/Dehumidifiers-c-778.html","//h6/a/@href")
It returned 142 links. Whoa, that's a lot. That many links are included on every one of your pages, and the navigation is placed BEFORE your content. I had this very same issue with a client; they were hesitant to change the navigation, but eventually it helped them a lot.
The suggested solution:
- remove the drop-down menu links from deeper pages
- only link to the big categories: "Air Treatment", "Bathroom", ... "Cleaning Products"
- within the category you are in, link to its subcategories (without any JavaScript/CSS drop-down menu; simply list them beneath the main category with a different background than dark blue). For example, if you are in the Bathroom category, your left navigation would look like:
- Air Treatment
- Bathroom
- Electric Showers
- Mirror Demister
- Bathroom Heaters
- Heated Towel Rails
- Catering Equipment
- ...
- Cleaning Products
This way you don't have to change much in your navigation, and it will make your interlinking more consistent. Furthermore, if a user wants to find another category, there is the search box, the main categories, and the breadcrumb. Which leads to the next suggestion:
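As a rough sketch of the markup (the class names and URLs here are made up for illustration), that trimmed-down left navigation could be plain nested lists, with subcategories rendered only under the active category:

```html
<ul class="left-nav">
  <li><a href="/Air-Treatment">Air Treatment</a></li>
  <li class="current">
    <a href="/Bathroom">Bathroom</a>
    <!-- Subcategory list appears only under the category the user is in -->
    <ul class="subcategories">
      <li><a href="/Electric-Showers">Electric Showers</a></li>
      <li><a href="/Mirror-Demister">Mirror Demister</a></li>
      <li><a href="/Bathroom-Heaters">Bathroom Heaters</a></li>
      <li><a href="/Heated-Towel-Rails">Heated Towel Rails</a></li>
    </ul>
  </li>
  <li><a href="/Catering-Equipment">Catering Equipment</a></li>
  <li><a href="/Cleaning-Products">Cleaning Products</a></li>
</ul>
```

This keeps every page's link count down to the main categories plus the current category's subcategories, instead of all 142 links on every page.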
2) Make the breadcrumb look like a breadcrumb, not like a tab.
This is just a personal impression, but right now it looks more like a tab than a breadcrumb. A few things add up to that impression: the "item1 | item2 | item3" format, the links not being underlined (so they don't look or feel like links), and the breadcrumb starting next to the left navigation instead of at the left edge of the site.
Suggested solution:
- move your breadcrumb to the far left of your site, above your navigation box; you can position it to start at the same left edge as your navigation box (it looks like a 15px padding from the left side of the white background)
- the text can be smaller, but make the links underlined so they look like links
- replace the pipe ("|") character with a greater-than character (">"); that looks much more like a breadcrumb
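Pulling those three points together, a minimal sketch might look like this (the class names, URLs, and exact pixel values are assumptions, not taken from your site):

```html
<!-- Breadcrumb placed above the left navigation, starting at the left edge -->
<div class="breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/Bathroom">Bathroom</a> &gt;
  <span>Bathroom Heaters</span>
</div>

<style>
  .breadcrumb { font-size: 0.85em; padding-left: 15px; }
  .breadcrumb a { text-decoration: underline; } /* make links look like links */
</style>
```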
3) make your pagination links followed, and the paginated pages meta "follow,noindex"
At the moment you have nofollowed your pagination links, which results in lower indexation of your product pages than would be possible.
Eg:
- this is cached: http://webcache.googleusercontent.com/search?q=cache:www.cnmonline.co.uk/Bathroom-Products-c-2278.html&hl=en&strip=1
- but the 2nd page isn't: http://webcache.googleusercontent.com/search?q=cache%3Awww.cnmonline.co.uk%2FBathroom-Products-c-2278-p-2.html
- and what's even worse, but not surprising, this item on the second page isn't indexed: http://webcache.googleusercontent.com/search?q=cache%3Awww.cnmonline.co.uk%2FSavoy-Shawl-Collar-Bath-Robe-Box-of-5-pr-36295.html
Suggested solution:
- let Googlebot follow your pagination links: remove the rel="nofollow" attribute from the links
- set the paginated pages' meta robots tag to "follow,noindex"
This change means Googlebot can follow the links through to your product pages but won't index the paginated pages themselves. This is awesome, since you don't have to hassle with unique titles and descriptions, and the paginated pages are just lists; they don't add any value or give any reason to be indexed.
Of course, if you had a pagination issue with reviews, it would be a whole different story, because then each paginated page would be valuable: they would be listing valuable user-generated content, not just linking to product pages. In that case, you might create unique titles and descriptions, at least by appending "page X".
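Concretely, the head of each paginated page (page 2 onward) would carry something like this:

```html
<!-- On paginated category pages only (page 2, 3, ...): -->
<!-- let crawlers follow the product links, but keep this list page out of the index -->
<meta name="robots" content="follow,noindex">
```

The first page of the category stays indexable as usual; only the deeper pages in the series get this tag.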
4) Your filters aren't causing a duplication/canonical issue, since they work via AJAX and don't create any new URLs.
So you shouldn't change anything here, but I guess that doesn't surprise you. You can always verify this by using the 'cache:' operator in Google and selecting the text-only version, for example "cache:http://www.cnmonline.co.uk/Bathroom-Heaters-c-2320.html"; click the text-only version and you will see that Price Range and Manufacturer have no links that Google could follow, so there is no canonical problem.
Hope this helps.
-
So is the best method to canonicalize the paginated pages, including all the URLs produced by the price and sorting filters, to the view-all page? Would any other members like to share their opinions?
regards
Carl
-
View all! Of course... how did I not think of that before? Thank you.
-
Concerning Pagination,
I would create a "view all" page where all the products in the category are listed, then add a rel=canonical link pointing to the "View All" page.
That can help with your first question and with the filter issue.