Please help me articulate why broken pagination is bad for SEO...
-
Hi fellow Mozzers.
I am in need of assistance. Pagination on the website for which I do in-house SEO is broken... and it's been broken for years.
Here is an example: http://www.ccisolutions.com/StoreFront/category/audio-technica
This category has 122 products, broken down to display 24 at a time across paginated results. However, you will notice that once you enter pagination, all of the URLs become this: http://www.ccisolutions.com/StoreFront/IAFDispatcher
Even if you hit "Previous" or "Next" or your browser back button, the URL stays: http://www.ccisolutions.com/StoreFront/IAFDispatcher
I have tried to explain to stakeholders that this is a lost opportunity. If a particular paginated result contained a unique combination of products more relevant to a searcher's query than the main page in the series, Google couldn't send the searcher to that page, because it doesn't have a unique URL. In addition, this non-unique URL is most likely bottlenecking the internal flow of page authority, precisely because it isn't unique. That's not to mention that 38% of our traffic in Google Analytics is reported as coming from this page, a problem because this page could be any one of several hundred on the site and we have no idea which one a visitor was actually looking at.
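For contrast, here is a minimal sketch of what unique paginated URLs could look like for this category. The `?page=` parameter scheme is purely hypothetical (the site's real router would obviously differ); the point is that each page in the series gets its own bookmarkable address:

```python
# Hypothetical URL scheme for crawlable pagination: each page in the
# series gets its own address. "?page=N" is an illustration only, not
# the site's actual routing.
from math import ceil

BASE = "http://www.ccisolutions.com/StoreFront/category/audio-technica"

def paginated_urls(total_products, per_page=24):
    """Return one unique, bookmarkable URL per page of results."""
    pages = ceil(total_products / per_page)
    # Page 1 keeps the clean category URL; later pages add a parameter.
    return [BASE if n == 1 else f"{BASE}?page={n}" for n in range(1, pages + 1)]

urls = paginated_urls(122)  # 122 products at 24 per page -> 6 distinct URLs
```

With six distinct addresses, Google could rank page 3 of the series on its own merits, and analytics could tell you exactly which page a visitor actually saw.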
How do I articulate the magnitude of this problem for SEO? Is there a way I can easily put it in dollars and cents for a business person who really thinks SEOs are a bunch of snake oil salesmen in the first place?
Does anyone have any before-and-after case studies or quantifiable data that they would be willing to share with me (even privately) that can help me better articulate how important it is to address this problem? Even more, what can we hope to get out of fixing it: more traffic, more revenue, higher conversions?
Can anyone help me go to the mat with a solid argument as to why pagination should be addressed?
-
Thanks so much Gianluca for this thoughtful and valuable advice.
Yes, page load speed is definitely something that's been a concern. This is why we went back to 24 products displayed per page instead of 50 a few months ago. However, since then we've made some significant improvements in page load times and we think we can probably go up to 100 products per page and still be fairly fast. We will have to test.
On the upside, we only have 7 categories with more than 100 products, and only 24 with more than 50. The biggest problem affecting speed isn't so much the images. It's the fact that the website makes real-time pricing calls to our business back end for every product, every time the page loads. This may be a sticking point.
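For what it's worth, one common mitigation for that kind of bottleneck is to cache the pricing calls for a short interval, so a 100-product page triggers far fewer back-end requests. This is only a sketch under assumptions (the `fetch_price` function and the SKU are stand-ins; I don't know your back end):

```python
import time

# Hypothetical mitigation for a real-time pricing bottleneck: cache each
# product's price for a short TTL so repeated page loads reuse it instead
# of hitting the back end every time.
class PriceCache:
    def __init__(self, fetch_price, ttl_seconds=300):
        self.fetch_price = fetch_price  # stand-in for the real back-end call
        self.ttl = ttl_seconds
        self._store = {}  # sku -> (price, fetched_at)

    def get(self, sku):
        hit = self._store.get(sku)
        if hit and time.time() - hit[1] < self.ttl:
            return hit[0]  # fresh cached price: no back-end call made
        price = self.fetch_price(sku)  # slow real-time call
        self._store[sku] = (price, time.time())
        return price
```

Whether a five-minute-stale price is acceptable is a business decision, but it's the kind of trade-off that could make a 100-product view-all page feasible.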
I have also thought about the canonical tag problem. Of course, it's a problem now too, but if the "View All" page just ends up getting that generic URL and no proper canonical tag...then we really are back to square one.
The possibility of no-indexing all of the categories that are related to paginated series is something that crossed my mind yesterday, so it's interesting that you mentioned that. While it would solve certain issues, wouldn't this be a problem in terms of having valuable content in Google? Granted, some of our category pages are purely there for navigation purposes, in which case, I suppose there's no harm in no-indexing them. However, with the roll-out of Hummingbird I began looking at our category pages as valuable opportunities for "topics" pages that could act as a hub for visitors searching for products or information around specific uses or brands.
Wouldn't there be a significant risk in losing valuable market share for key terms by removing so many category pages from Google's index?
If I am understanding your last suggestion you are saying to have the page default to "View All" and noindex everything else...You are right, not a great scenario, but you are also right in that this may be the only solution given management's steadfast stance on not wanting to pay to fix it.
Lots to think about, but your comment has been extremely helpful. Thanks again!
-
Dana,
Just a few tips about the "View All" option.
While it surely is the best solution, even when real pagination exists, you should keep a few things in mind:
- A view-all list with dozens of snippets (photo + text + link) can be like a block of reinforced concrete for your site's page speed: imagine those listings with 100+ products. In that case a view-all page may not be the correct solution, because Googlebot may never get through all the code and may give up before following all the URLs present on the page.
- In fact, ideally a view-all page should load completely within 4 seconds.
- For that reason, if a view-all page is your only option, you should seriously consider implementing lazy loading for the images, so that the written content (links included) is rendered first and Google sees all of it, while images load only when needed (i.e., when a user scrolling down reaches the point where an image should appear).
Then there's a doubt, and a big one: if the paginated list always has this URL, http://www.ccisolutions.com/StoreFront/IAFDispatcher, how can you set its canonical to the view-all page of http://www.ccisolutions.com/StoreFront/category/audio-adapters-audio-connectors when it should also have http://www.ccisolutions.com/StoreFront/category/audio-technica as a canonical?
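To make that conflict concrete: a page can only emit one canonical tag, so a single shared IAFDispatcher URL cannot point at two different categories' view-all pages. With per-category paginated URLs, each page in a series could declare its own category's view-all page as canonical, roughly like this (the `?view=all` parameter is an assumption, not the site's real scheme):

```python
# Sketch of the canonical relationship that per-category URLs would allow:
# every page in a category's paginated series points at that category's
# view-all page. The "?view=all" parameter is hypothetical.
def canonical_tag(category_slug):
    view_all = (f"http://www.ccisolutions.com/StoreFront/category/"
                f"{category_slug}?view=all")
    return f'<link rel="canonical" href="{view_all}" />'

tag = canonical_tag("audio-technica")
```

One shared URL makes this a one-to-many mapping, which canonicalization simply cannot express.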
Maybe the only solution you have is this:
- force the view-all URL to be the default one;
- make all the paginated pages (including the first page) noindex.
Not really a wonderful solution, but, from what I understood about the stubbornness of your bosses, the only one. It must be executed properly, though, in order to avoid worse issues.
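A minimal sketch of that fallback, assuming the template can tell the view-all page apart from the paginated ones. Using noindex,follow (rather than noindex,nofollow) keeps the duplicates out of the index while still letting crawlers follow the product links:

```python
# Fallback when paginated pages can't get unique, canonicalized URLs:
# only the view-all page is indexable; every paginated page carries
# noindex,follow so link equity still flows to the products.
def robots_meta(is_view_all):
    if is_view_all:
        return '<meta name="robots" content="index,follow" />'
    return '<meta name="robots" content="noindex,follow" />'
```

The "executed properly" part matters: if the view-all page itself ever got the noindex tag by mistake, the whole category would vanish from the index.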
-
....is like hiring an astronaut, handing them a box of toothpicks and some gunpowder and saying you expect them to land on the moon
ha ha ha... that is really funny.
Thanks for the laugh.
-
Thanks so much EGOL. I always love your candor.
Believe me, when I went home last night to ponder solutions to this problem, everything you mentioned crossed my mind. It was a thoroughly frustrating conversation to have. It simply amazes me that Google can tell the world very clearly all the things that will help their sites do better in the SERPs, yet people continue to ignore all of that advice, do what they want (or whatever is "easy" or cheap), and then whine about why their sites aren't doing well.
Making the commitment to hire an in-house SEO without equipping them with good tools and refusing to take their advice is like hiring an astronaut, handing them a box of toothpicks and some gunpowder and saying you expect them to land on the moon.
-
Thanks so much Andy. Agreed on all points. I think I have convinced the powers that be that at the very least we should add a "View All" option. This would give both end-users and Google a useful means to access all of the products in a category at once, without having to resort to pagination if they didn't want to. It is something we can add fairly easily and at little to no cost. Since only 8 of our category pages have more than 100 products, and none go higher than 200, this seems like a very reasonable compromise, at least for now.
I very much appreciate you taking the time to respond.
It was a frustrating day and a frustrating conversation to have to have.
-
I don't have an answer for you... but I will say that it would really bother me that I would have to jump through hoops with a pogo stick to get stakeholders to want to address this.
I'll skip my rant and get right to the analysis.....
What's going on? Are these stakeholders: A) dumb? B) lazy? C) short of resources? D) frying bigger fish?
If it is A or B then I am probably looking for another job before the company goes bankrupt.
If it is D then I might decide if I should resign and go into competition with them to cash in on the bonanza.
If it is C then you have a dilemma that could involve going to the stakeholders' boss, trying other creative solutions, or looking for a new job.
Really, you should not have to ask this question.
-
Hi Dana,
I can certainly understand your problem, and whilst I have no data to give you, you should certainly be looking at this not only as a lost opportunity from an SEO perspective, but also as an inability to report back just how well the site is converting traffic. Without this data, no site can see where changes can be made and which improvements will result in an increase in revenue.
I would also look at the fact that anything broken on a site might not be causing an observable negative effect right now, but what happens with the next algorithm update? Will something be spotted at some point? Do you want to wait for Google to penalise the site before realising it should have been corrected?
Also, does it make for a poor user experience? If someone comes to the site and then bookmarks one of these pages, how are they going to get back again? Are they then likely to just navigate away because they didn't land where they intended?
I am sure there will be a loss in revenue from this - quantifying it will be difficult for an outsider though. There is no doubt that this should be resolved, and I would say ASAP as well.
-Andy