When Bounce Rate, Browse Rate (PPV), and Time-on-Site Are Useful Metrics... and When They Aren't
When is it right to use metrics like bounce rate, pages per visit, and time on site? When are you better off ignoring them? There are endless opinions on whether these kinds of metrics are valuable or not, and as you might suspect, the answer is found in the shades of grey. Learn what Rand has to say about the great metrics debate in today's episode of Whiteboard Friday.
Video Transcription
Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're chatting about times at which bounce rate, browse rate, which is pages per visit, and time on site are terrible metrics and when they're actually quite useful metrics.
This happens quite a bit. I see in the digital marketing world people talking about these metrics as though they are either dirty-scum, bottom-of-the-barrel metrics that no one should pay any attention to, or that they are these lofty, perfect metrics that are what we should be optimizing for. Neither of those is really accurate. As is often the case, the truth usually lies somewhere in between.
So, first off, some credit to Wil Reynolds, who brought this up during a discussion that I had with him at Siege Media's offices, an interview that Ross Hudgens put together with us, and Sayf Sharif from Seer Interactive, their Director of Analytics, who left an awesome comment about this discussion on the LinkedIn post of that video. We'll link to those in this Whiteboard Friday.
So Sayf and Wil were both basically arguing that these are kind of crap metrics. We don't trust them. We don't use them a lot. I think, a lot of the time, that makes sense.
Instances when these metrics aren't useful
Here's when these metrics (bounce rate, pages per visit, and time on site) kind of suck.
1. When they're used instead of conversion actions to represent "success"
So they suck when you use them instead of conversion actions. A conversion is when someone takes an action that I wanted on my website. They filled in a form. They purchased a product. They put in their credit card. Whatever it is, they got to a page that I wanted them to get to.
Bounce rate is the percentage of people who landed on a page and then left your website without continuing on to any other page on that site.
Pages per visit is essentially exactly what it sounds like: the average number of pages viewed per visit by people who landed on that particular page. Of the people who came in through one of these pages, how many pages did they visit on my site?
Then time on site is a very raw and rough metric. If I leave my computer to use the restroom, switch to another tab, or close my browser, time on site doesn't necessarily end right then. In fact, most analytics tools measure it as the gap between pageviews, so the time spent on the last page of a visit isn't captured at all, and a bounce registers as zero seconds. This metric has a lot of imperfections. Averaged over time, though, it can still be directionally interesting.
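To make those definitions concrete, here's a minimal sketch in Python with pandas. The pageview log, its column names, and all the numbers are invented for illustration; it just computes the three metrics the way most analytics packages do, including the time-on-site flaw just described.

```python
import pandas as pd

# Hypothetical pageview log: one row per pageview, tagged with a visit id.
pageviews = pd.DataFrame({
    "visit_id": [1, 1, 1, 2, 3, 3],
    "page": ["/home", "/pricing", "/buy", "/blog", "/home", "/pricing"],
    "timestamp": pd.to_datetime([
        "2024-05-01 10:00", "2024-05-01 10:02", "2024-05-01 10:05",
        "2024-05-01 11:00",
        "2024-05-01 12:00", "2024-05-01 12:04",
    ]),
})

per_visit = pageviews.groupby("visit_id").agg(
    pages=("page", "count"),
    first_ts=("timestamp", "min"),
    last_ts=("timestamp", "max"),
)

# Bounce rate: share of visits that viewed exactly one page.
bounce_rate = (per_visit["pages"] == 1).mean()

# Pages per visit: mean pageviews across visits.
pages_per_visit = per_visit["pages"].mean()

# Time on site: gap between first and last pageview in a visit. This is
# the flaw mentioned above: time spent on the final page is invisible,
# so a one-page visit counts as zero seconds however long it lasted.
time_on_site = (per_visit["last_ts"] - per_visit["first_ts"]).mean()

print(f"Bounce rate: {bounce_rate:.0%}")           # 33%
print(f"Pages per visit: {pages_per_visit:.1f}")   # 2.0
print(f"Avg time on site: {time_on_site}")         # 0 days 00:03:00
```

Note how visit 2, the bounce, contributes zero seconds and drags the average down, which is exactly why the raw number needs context.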
But when you use these instead of conversion actions, which is what we all should be optimizing for ultimately, you can definitely get into some suckage with these metrics.
2. When they're compared against non-relevant "competitors" and other sites
When you compare them against non-relevant competitors, so when you compare, for example, a product-focused, purchase-focused site against a media-focused site, you're going to get big differences. First off, if your pages per visit look like a media site's pages per visit and you're product-focused, that is crazy. Either the media site is terrible or you're doing something absolutely amazing in terms of keeping people's attention and energy.
Time on site is a little bit misleading in this case too, because if you look at the time on site, again, of a media property or a news-focused, content-focused site versus one that's very e-commerce focused, you're going to get vastly different things. Amazon probably wants your time on site to be pretty small. Dell wants your time on site to be pretty small. Get through the purchase process, find the computer you want, buy it, get out of here. If you're taking 10 minutes to do that or 20 minutes to do that instead of 5, we've failed. We haven't provided a good enough experience to get you quickly through the purchase funnel. That can certainly be the case. So there can be warring priorities inside even one of these metrics.
3. When they're not considered over time or with traffic sources factored in
Third, you get some suckage when they are not considered over time or against the traffic sources that brought them in. For example, if someone visits a web page via a Twitter link, chances are really good, really, really good, especially on mobile, that they're going to have a high bounce rate, a low number of pages per visit, and a low time on site. That's just how Twitter behavior is. Facebook is quite similar.
Now, if they've come via a Google search, an informational Google search, and they've clicked on an organic listing, you should see just the reverse: a relatively lower bounce rate, a relatively higher pages per visit, and a relatively higher time on site.
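As a quick sketch of why that segmentation matters (the table and its column names are invented; any analytics export with a source or medium dimension would do), you can judge each traffic source against its own baseline instead of a blended site-wide average:

```python
import pandas as pd

# Hypothetical per-visit export: one row per visit, tagged with its source.
visits = pd.DataFrame({
    "source":          ["twitter", "twitter", "google/organic",
                        "google/organic", "facebook", "google/organic"],
    "pages":           [1, 1, 4, 3, 1, 5],
    "seconds_on_site": [0, 15, 210, 180, 0, 320],
})

# Judge each source against its own baseline, not a blended average.
by_source = visits.groupby("source").agg(
    visits=("pages", "count"),
    bounce_rate=("pages", lambda p: (p == 1).mean()),
    pages_per_visit=("pages", "mean"),
    avg_seconds=("seconds_on_site", "mean"),
)
print(by_source)
# A high bounce rate might be alarming for organic search traffic
# and completely normal for a Twitter link opened on mobile.
```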
Instances when these metrics are useful
1. When they're used as diagnostics for the conversion funnel
So there's complexity inside these metrics for sure. Where these metrics are truly useful is as a diagnostic. So when you look at a conversion funnel, it might look like this: people come in through the homepage or through our blog or news sections, and eventually, we hope, they make it to our product page, our pricing page, and our conversion page.
We have these metrics for all of these. When we make changes to some of these, significant changes, minor changes, we don't just look at how conversion performs. We also look at whether things like time on site shrank or whether people had fewer pages per visit or whether they had a higher bounce rate from some of these sections.
So perhaps, for example, we changed our pricing and we actually saw that people spent less time on the pricing page and had about the same number of pages per visit and about the same bounce rate from the pricing page. At the same time, we saw conversions dip a little bit.
Should we intuit that pricing negatively affected our conversion rate? Well, perhaps not. Perhaps we should look and see if there were other changes made or if our traffic sources shifted, because given that bounce rate didn't increase, pages per visit didn't really change, and time on site only went down a little, it seems like people are making it just fine through the pricing page and on to the conversion page. So let's look at something else.
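Here's a rough sketch of that kind of diagnostic (all page names and numbers are invented for illustration): lay the per-page metrics from before and after the change side by side and look for the step where behavior actually shifted.

```python
import pandas as pd

# Invented per-page metrics exported before and after a pricing change.
before = pd.DataFrame({
    "page":            ["/product", "/pricing", "/checkout"],
    "bounce_rate":     [0.42, 0.30, 0.10],
    "pages_per_visit": [3.1, 2.8, 1.9],
    "avg_seconds":     [95, 140, 60],
}).set_index("page")

after = before.copy()
after.loc["/pricing", "avg_seconds"] = 115   # time on pricing dropped...
# ...but bounce rate and pages per visit held steady.

delta = after - before
print(delta)

# If bounce rate and pages per visit on /pricing are flat, visitors are
# still flowing through to /checkout; a conversion dip probably has a
# different cause (traffic mix, a checkout change) than the new pricing.
```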
This is the type of diagnostics that you can do when you have metrics at these levels. If you've seen a dip in conversions or a rise, this is exactly the kind of dig into the data that smart, savvy digital marketers should and can be doing, and I think it's a powerful, useful tool to be able to form hypotheses based on what happens.
So again, another example: say we changed this product page and saw pages per visit and time on site shrink. Did it affect conversion rate? Even if it didn't, if we're now getting fewer engaged visitors, so we can't do as much retargeting and we're losing email signups, maybe the change did have a negative effect, and we should go back to the old version even though conversion rate itself didn't seem to take a particular hit.
2. When they're compared over time to see if internal changes or external forces shifted behavior
The second useful way to apply these metrics is to compare them over time to see if your internal changes or some external forces shifted behavior. For example, we can look at the engagement rate on the blog. It's tough to define a conversion event for the blog. We could maybe look at subscriptions, but in general, pages per visit is a nice metric for the blog. It tells us whether people make it past the page they landed on and into deeper sections, stick around our site, and check out what we do.
So if we see that it had a dramatic fall down here in April and that was when we installed a new author and now they're sort of recovering, we can say, "Oh, yeah, you know what? That takes a little while for a new blog author to kind of come up to speed. We're going to give them time," or, "Hey, we should interject here. We need to jump in and try and fix whatever is going on."
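A minimal sketch of that over-time check, with invented monthly numbers: compute the blog's pages per visit by month and flag any month that falls well below its trailing average, which would have surfaced the April dip automatically.

```python
import pandas as pd

# Invented monthly pages-per-visit for the blog.
ppv = pd.Series(
    [2.4, 2.5, 2.4, 1.6, 1.8, 2.1],
    index=pd.period_range("2018-01", periods=6, freq="M"),
)

# Flag any month more than 20% below the trailing three-month average.
baseline = ppv.rolling(3).mean().shift(1)
dips = ppv[ppv < 0.8 * baseline]
print(dips)   # 2018-04    1.6  <- new author started; watch the recovery
```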
3. When they're benchmarked versus relevant industry competitors
The third and final useful case is when you benchmark versus truly relevant industry competitors. If you have a direct competitor with a very similar focus to yours, product-focused in this case, with a homepage, some content sections, and a very focused product checkout, you could compare your homepage against theirs.
If you can get the data from a source like SimilarWeb or Jumpshot, if there's enough clickstream-level data, or from savvy industry surveys that collect this information, and you see that you're significantly higher, you might then take a look at what they're doing that you're not. Maybe you should include them in your user research and ask, "Hey, what's compelling to you about this that maybe is missing here?"
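There's no universal API for this kind of panel data, so the sketch below simply assumes you've pulled comparable estimates into a small table (every site name and number here is hypothetical) and want to see where the gaps are:

```python
import pandas as pd

# Hypothetical clickstream-panel estimates for you vs. a direct competitor.
benchmark = pd.DataFrame({
    "site":            ["yoursite.com", "competitor.com"],
    "bounce_rate":     [0.55, 0.38],
    "pages_per_visit": [2.1, 3.4],
    "avg_seconds":     [85, 130],
}).set_index("site")

# Express each gap as a ratio; anything far from 1.0 is worth user research.
gap = benchmark.loc["yoursite.com"] / benchmark.loc["competitor.com"]
print(gap.round(2))
# bounce_rate        1.45  <- we lose far more landers; ask users why
# pages_per_visit    0.62
# avg_seconds        0.65
```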
Otherwise, a lot of the time people will take direct competitors and say, "Hey, let's look at what our competition is doing and we'll consider that best practice." But if you haven't looked at how they're performing, how people are getting through, whether they're engaging, whether they're spending time on that site, whether they're making it through their different pages, you don't know if they actually are best practices or whether you're about to follow a laggard's example and potentially hurt yourself.
So definitely a complex topic, definitely many, many different things that go into the uses of these metrics, and there are some bad and good ways to use them. I agree with Sayf and with Wil, but I think there are also some great ways to apply them. I would love to hear from you if you've got examples of those down in the comments. We'll see you again next week for another edition of Whiteboard Friday. Take care.