Pagination for Search Results Pages: Noindex/Follow, Rel=Canonical, Ajax Best Option?
-
I have a site with paginated search result pages. What I've done is noindex/follow them and place a rel=canonical tag on page 2, page 3, page 4, etc., pointing back to the main/first search result page. These paginated search result pages aren't visible to the user (since I'm not technically selling products, just providing different images), and I've added a text link at the bottom of the first/main search result page that says "click here to load more"; once clicked, it loads more images onto the page via Ajax. Is this a proper strategy?
Also, for a site that does sell products, would simply noindexing/following the paginated search result pages and placing a canonical tag on them pointing back to the main search result page suffice?
I would love feedback on whether this is a proper method/strategy to keep Google happy.
Side question: when the robots go through a page that is noindexed/followed, do they take into consideration the text on that page, its page title, meta tags, etc., or do they only care about the actual links within the page and passing link juice through them?
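For reference, the setup described above would look something like this in the head of a paginated page (the URLs here are hypothetical placeholders, not the actual site's):

```html
<!-- On a paginated results page, e.g. /search/widgets/page/2 (hypothetical URL) -->
<head>
  <!-- Keep this page out of the index, but let crawlers follow its links -->
  <meta name="robots" content="noindex, follow">
  <!-- Point the canonical back at the first/main search result page -->
  <link rel="canonical" href="https://example.com/search/widgets">
</head>
```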
-
Firstly, read http://searchengineland.com/the-latest-greatest-on-seo-pagination-114284 for the basics on addressing this problem. It was noted in the other response, but it's key that you approach it this way. It's a common problem, but easily fixable.
On your other note, robots read everything on the page, content included. They may not index any of it (since it's on a NOINDEX page), but they absolutely read and crawl everything. And yes, they naturally follow the links on a FOLLOW page. They won't on a NOFOLLOW page and will look elsewhere for links to follow.
Hope this answered your question. Let me know if not.
-
Can someone respond to the questions on my post? Thanks.
-
Use rel="next"/rel="prev", and optionally, if you're worried about pages 2–N coming up in SERPs, add a noindex meta tag to those pages.
http://searchengineland.com/google-provides-new-options-for-paginated-content-92906
http://searchengineland.com/the-latest-greatest-on-seo-pagination-114284
http://searchengineland.com/implementing-pagination-attributes-correctly-for-google-114970
http://www.youtube.com/watch?v=njn8uXTWiGg
As for why you would not want to use rel=canonical for this: it works, but it's not the proper use of the tag.
http://searchengineland.com/pagination-strategies-in-the-real-world-81204
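To sketch the recommendation above, the middle pages of a paginated series would carry rel="prev"/rel="next" links, plus the optional noindex. The URLs below are hypothetical, not from the asker's site:

```html
<!-- On page 2 of a paginated series (hypothetical URLs) -->
<head>
  <link rel="prev" href="https://example.com/search/widgets/page/1">
  <link rel="next" href="https://example.com/search/widgets/page/3">
  <!-- Optional, per the advice above: keep pages 2-N out of the SERPs -->
  <meta name="robots" content="noindex, follow">
</head>
```

The first page of the series would carry only a rel="next" link, and the last page only a rel="prev" link.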