Pagination question
-
I have a website, http://www.example.com, with a pagination series running from page1.html up to page10.html, and backlinks pointing to some of those pages (page1.html, page2.html, ..., page7.html).
If I include rel="next" and rel="prev" on page1.html through page10.html, will the value of those links be transferred to http://www.example.com?
That is what I interpret from http://bit.ly/mUOrn2.
Am I right?
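For reference, here is a minimal sketch of what that markup might look like in the head of page2.html, using the placeholder URLs from the question above:

```html
<!-- page2.html: a middle page of the series points both ways -->
<link rel="prev" href="http://www.example.com/page1.html">
<link rel="next" href="http://www.example.com/page3.html">
<!-- page1.html would carry only rel="next", and page10.html only rel="prev" -->
```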
-
Rel=prev/next is still pretty new, so we don't have a lot of data, but it seems to work like a canonical tag. It should pass link-juice up the chain. That said, it's pretty rare for "page X" of results (where X > 1) to have inbound links or much in the way of search value. I think cleaning up pagination can help a lot, if it's a big chunk of your search index.
-
If your goal is to send most people to the page1.html content, then yes, adding rel="next" and rel="prev" would be a good solution. But if you want people reaching different content on each page, then it might not have the effect you're looking for. Typically I see it used for search results and archive/category/tag pages.
Is there a reason you're paginating all that content manually? Sometimes a longer page can be an asset...
Related Questions
-
General SSL Questions After Move
Hello, we have moved our site to https, and Google Analytics seems to be tracking correctly. However, I have seen some conflicting information: should I create a new view in Analytics? Additionally, should I also create a new https property in Google Search Console and set it as the preferred domain? If so, should I keep the old sitemap for my http property while updating the sitemap to https only for the https property? Thirdly, should I create a new property, as well as new sitemaps, in Bing Webmaster? Finally, after doing a crawl on our http domain, which has a 301 to https, the crawl stopped after the redirect. Is this a result of using a free crawling tool, or will bots not be able to crawl my site after this redirect? Thanks for all the help in advance, I know there are a lot of questions here.
302 Redirect Question
After running a site crawl, I found two 302 redirects: one from site.com to www.site.com and one from site.com/products to www.site.com/products. How do I fix the 302 redirects and change them to 301 redirects? I have no clue where to start. Thanks.
301 Redirect Question
I am working with a website and ran a Screaming Frog crawl, and I noticed there are 4,600 301s on the website (www.srishoes.com). It seems like the issue is between the www and non-www versions, which aren't working together. Is this something that the website provider should update, and what type of impact might this have on the site? Thanks!
Launching large content project - date-stamp question
Hello mozzers! So my company is about to launch a large-scale content project with over 100 pieces of newly published content. I'm being asked what the date-stamp for each article should be. Two questions:
1. Does it hurt an article's SEO juice to have a lot of content with the same "published on" date?
2. I have the ability to manually update each article's date stamp. Is there a recommended best practice? P.S. Google has not crawled any of these pages yet.
Specific question about pagination prompted by Adam Audette's Presentation at RKG Summit
This question is prompted by something Adam Audette said in this excellent presentation: http://www.rimmkaufman.com/blog/top-5-seo-conundrums/08062012/

First, I will lay out the issues:

1. All of our paginated pages have the same URL. To view this in action, go here: http://www.ccisolutions.com/StoreFront/category/audio-technica , scroll down to the bottom of the page and click "Next" and look at the URL. The URL is http://www.ccisolutions.com/StoreFront/IAFDispatcher, and it is the same URL for every page after it.

2. All of the paginated pages with non-unique URLs have canonical tags referencing the first page of the paginated series.

3. http://www.ccisolutions.com/StoreFront/IAFDispatcher has been instructed to be neither crawled nor indexed by Google.

Now, on to what Adam said in his presentation. At about minute 24 he begins talking about pagination, and at about 27:48 in the video he discusses the first of three ways to properly deal with pagination issues. He says [I am somewhat paraphrasing]: "Pages 2-N should have self-referencing canonical tags. Pages 2-N should all have their own unique URLs, titles and meta descriptions... The key with this is you want deeper pages to get crawled and all the products on there to get crawled too. The problem that we see a lot is, say you have ten pages, each one using rel canonical pointing back to page 1, and when that happens, the products or items on those deep pages don't get crawled, because the rel canonical tag is sort of like a 301 and basically says 'Okay, this page is actually that page.' All the items and products on this deeper page don't get the love."

Before I get to my question, I'll just throw out there that we are planning to fix the pagination issue by opting for the "View All" method, which Adam suggests as the second of three options in this video, so that fix is coming.

My question is this: it seems, based on what Adam said (and our current abysmal state of pagination), that the products on our paginated pages aren't being crawled or indexed. However, our products are all indexed in Google. Is this because we are submitting a sitemap? Even so, are we missing out on internal linking (authority flow) and Google love because Googlebot is finding way more products in our sitemap than what it is seeing on the site? (Or missing out in other ways?) We experience a lot of volatility in our rankings, where we rank extremely well for a set of products for a long time and then disappear. Then something else will rank well for a while and disappear. I am wondering if this issue is a major contributing factor.

Oh, and did I mention that our sort feature sorts the products and imposes that new order on all subsequent visitors? It works like this: if I go to that same Audio-Technica page and sort the 125+ resulting products by price, they will sort by price... but not just for me, for anyone who subsequently visits that page, until someone else re-sorts it some other way. So if we merchandise the order to be XYZ, and a visitor comes and sorts it ZYX and then Googlebot crawls, Google could potentially see entirely different products on the first page of the series than the default order marketing intended to present there... sigh.

Additional thoughts, comments, sympathy cards and flowers most welcome. 🙂 Thanks all!
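To make the first option from the presentation concrete, here is a rough sketch of what the head of a page 2 might contain under that approach, assuming the paginated pages were first given their own unique URLs (the ?page=2 URL below is hypothetical, not something the site currently uses):

```html
<!-- Hypothetical page 2 of the Audio-Technica category under "option 1" -->
<title>Audio-Technica - Page 2 | CCI Solutions</title>
<meta name="description" content="Audio-Technica products, page 2 of the category listing.">

<!-- Self-referencing canonical: page 2 canonicalizes to itself, not back to page 1 -->
<link rel="canonical" href="http://www.ccisolutions.com/StoreFront/category/audio-technica?page=2">

<!-- rel prev/next can also describe the series -->
<link rel="prev" href="http://www.ccisolutions.com/StoreFront/category/audio-technica">
<link rel="next" href="http://www.ccisolutions.com/StoreFront/category/audio-technica?page=3">
```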
Schema address question
I have a website that has a contact us page, of course, and on that page I have schema markup pointing out the address and a few other points of data. I also have the address of the business location in the footer on every page. Would it be wiser to point to the schema address data in the footer instead of the contact page? And are there any best practices when it comes down to how many times you can point to the same data, and on which pages? For example, should I have the schema address on both the contact us page and the footer of that page? That would be twice, which could seem spammy. I haven't been able to find much best-practices info on schema out there. Thanks, Cy
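For illustration, here is a minimal sketch of schema.org address markup as it might sit in a site footer, using microdata (the business name and address values are placeholders):

```html
<!-- Hypothetical footer block; all values are placeholders -->
<footer itemscope itemtype="https://schema.org/LocalBusiness">
  <span itemprop="name">Example Business</span>
  <div itemprop="address" itemscope itemtype="https://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Example St.</span>,
    <span itemprop="addressLocality">Anytown</span>,
    <span itemprop="addressRegion">LA</span>
    <span itemprop="postalCode">70000</span>
  </div>
</footer>
```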
Pagination and SEO: How do I fix it with search parameters?
Today I watched a very interesting video on YouTube about pagination and SEO. I have implemented pagination with rel="next" and rel="prev" on my paginated pages. You can get a better idea by visiting the following pages:
www.vistastores.com/patio-umbrellas
www.vistastores.com/patio-umbrellas?p=2
www.vistastores.com/patio-umbrellas?p=3
I have added a NOINDEX, FOLLOW attribute to page 2, page 3 and so on. My question is simple: can I remove the NOINDEX, FOLLOW attribute from the paginated pages or not? I have a lot of confusion and issues when the paginated URLs contain search parameters. You can get more of an idea by visiting the following URLs:
http://www.vistastores.com/patio-umbrellas?dir=asc&order=name&p=2
http://www.vistastores.com/patio-umbrellas?dir=asc&order=name&p=3
What is the best suggestion for this kind of page?
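For context, here is a rough sketch of how the head of the ?p=2 page might look if the noindex were dropped and rel="next"/rel="prev" kept, using the URLs from the question (whether to drop the noindex is exactly what is being asked here, so this is only one possible setup):

```html
<!-- Sketch of the <head> for www.vistastores.com/patio-umbrellas?p=2 -->
<link rel="prev" href="http://www.vistastores.com/patio-umbrellas">
<link rel="next" href="http://www.vistastores.com/patio-umbrellas?p=3">

<!-- Self-referencing canonical; the sorted variants
     (?dir=asc&order=name&p=2) could then point their canonical here,
     which is one common way of handling sort parameters -->
<link rel="canonical" href="http://www.vistastores.com/patio-umbrellas?p=2">
```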
Pages not ranking - Linkbuilding Question
It has been about 3 months since we made some new pages with new, unique copy, but a lot of pages (even though they have been indexed) are not ranking in the SERPs. I tested it by taking a long snippet of the unique copy from the page and searching for it on Google. I also checked the ranking using http://arizonawebdevelopment.com/google-page-rank, which may not be accurate, I know, but would give some indication. The interesting thing was that for the unique copy snippets, sometimes a different page of our site, many times the home page, shows up in the SERPs. So my questions are:

1. Is there some issue / penalty / sandbox deal with the pages that are not indexed? How can we check that? Or has it just not been enough time?

2. Could there be any duplicate copy issue going on? There shouldn't be, as they are all well written, completely unique copy. How can we check that?

3. Flickr image details: some of the pages display the same set of images from Flickr. The details (filenames, alt info, titles) are getting pulled from Flickr and can be seen in the source code. It's a pretty large block of words, which is the same on multiple pages, and uses a lot of keywords. Could this be considered duplication or keyword stuffing, and could it be causing this? If you think so, we will remove it right away. And then what do we do to improve re-indexing?

The reason I started this was because we have a few good opportunities right now for links, and I was wondering what pages we should link to and try to build rankings for. I was thinking about pointing one to /cast-bronze-plaques, but the page is not ranking. The home page, obviously, is the oldest page and ranks the best. The cast bronze plaques page is very new. Would linking to pages that are not ranking well be a good idea? Would it help them to get indexed / ranking? Or would it be better to link to the pages that are already indexed / ranking? If you link to a page that does not seem to be indexed, will it help the domain's link profile? Will the link juice still flow through the site?