Ideas on why Pages Per Visit Dropped?
-
Week over week our pages per visit continue to drop. Any ideas on where to look to diagnose?
-
Thanks for your candor, EGOL. I was just looking for some insight on where I might look outside of the obvious markers I'd already evaluated.
These forums have been extremely helpful in the past when folks have encountered issues similar to mine.
Have a great day!
-
"We're a news media site..."
This is some of the most volatile stuff on the web.
Really... if you have concerns and don't know what's going on, it would be best to hire a pro to review the site instead of taking shots in the dark based on guesses from people who have never seen it.
Your effort and theirs could be spent divining for a problem that does not exist.
-
Thanks, Tom. I'm quite puzzled because I don't see anything that would indicate why organic search pages per visit have dropped 7%.
404s and server errors are fairly stable. We did see a spike last Monday from spam that we have since fixed, so I'm not sure whether that would carry over to organic.
Our organic traffic was growing in the teens week over week, but in the past few weeks we've been flat or down by 2%. I know this is kind of a micro look, but I try to stay on top of things as much as possible.
And now the pages per visit are down two weeks in a row. We're a news media site, so I'm trying to see if I can see a pattern, but can't seem to find one.
Thanks for your input.
If any other ideas pop into your head about pages per visit or ideas on stabilizing traffic, I'd be grateful.
Thanks,
Lisa
-
Hi Aggie,
This really could be a number of things, but there are a few places to start looking.
First, I would jump into your webmaster tools and see if you have had a spike in errors, which you can find in the traffic tab. If Googlebot is seeing 404s or server errors, your users might be hitting the same thing. Look for any spikes in those graphs and see if there is anything that might put a user off.
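If you'd rather double-check against your own server logs than rely on the webmaster tools graphs, a rough sketch like the one below will surface error spikes by day. It assumes a standard Apache/Nginx combined-format access log and Python on hand; the log path and the regex are placeholders to adjust for your setup.

```python
import re
from collections import Counter

LOG_PATH = "access.log"  # hypothetical path -- point this at your real log
# Pulls the date and the HTTP status code out of a combined-format log line
line_re = re.compile(r'\[(\d{2}/\w{3}/\d{4}):[^\]]+\] "[^"]*" (\d{3}) ')

errors_per_day = Counter()
with open(LOG_PATH) as log:
    for line in log:
        match = line_re.search(line)
        if not match:
            continue
        day, status = match.group(1), match.group(2)
        if status == "404" or status.startswith("5"):
            errors_per_day[day] += 1

# Days sorted by error count, so any spike stands out at the top
for day, count in errors_per_day.most_common(15):
    print(f"{day}\t{count}")
```

If one day towers over the rest, pull the matching URLs out of the log for that day and see what actually broke.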
Next, I would look at segmenting your analytics data. Are you looking at all of your traffic as a whole? It may be worthwhile to look at the data over time in segments, such as the non-paid search, referral, and paid search segments Google Analytics provides. It may be that one particular traffic channel is driving the "bad" traffic - such as PPC ads that aren't converting or referral traffic from a new partner.
For instance, one of my clients places a banner on an affiliated website with its own tracking URL. We saw traffic increase over a number of weeks, but bounce rate, dwell time and pages per visit fell, and it turned out that the users referred from the banner were, by and large, bouncing straight off. After running it for a bit longer, we decided to remove the banner as it wasn't driving relevant traffic. If you see anything unnatural in these segments, you might be able to identify the cause.
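If it's easier to work from an export than to click through the reports, here's a rough pandas sketch of that segmentation - the file name and column names are assumptions, so rename them to match whatever your export actually contains.

```python
import pandas as pd

# Hypothetical export: one row per day per channel, with session and pageview counts
df = pd.read_csv("sessions_by_channel.csv", parse_dates=["date"])
# Assumed columns: date, channel, sessions, pageviews

df["week"] = df["date"].dt.to_period("W")
weekly = df.groupby(["week", "channel"], as_index=False)[["sessions", "pageviews"]].sum()
weekly["pages_per_visit"] = weekly["pageviews"] / weekly["sessions"]

# One column per channel; the column that keeps falling is the segment
# dragging the site-wide average down.
print(weekly.pivot(index="week", columns="channel", values="pages_per_visit").round(2))
```

If every channel is falling in step, the cause is probably on the site itself; if only one channel is, look at where that traffic is coming from.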
Similarly, have you started ranking for any "educational" terms lately? By that I mean, have you started ranking for terms like "what is [keyword]" or "[keyword] explained"? If so, you may be getting traffic for those terms, but because your landing pages have served their purpose, users are arriving, reading, and leaving. There's not a lot you can do in this case to improve pages per visit, but you may be able to capture some lead data by adding an email signup to those landing pages.
You should also have a look at your user journeys in Google Analytics, which you can find if you go to Audience -> Visitors Flow in the left hand side navigation. See if you can spot a pattern here - are there particular pages users are going to that have seen a jump in exit rate over time? Look at those pages - has any HTML/JavaScript code gone wrong? Is anything broken on those pages? Are they, in fact, just low quality pages that you could improve? This might also reveal something.
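You can run the same comparison outside the interface by exporting exit and pageview counts per page for the two periods you're comparing. A rough sketch - the file and column names are again assumptions:

```python
import pandas as pd

# Hypothetical export with columns: page, period ("before"/"after"), exits, pageviews
df = pd.read_csv("page_exits.csv")
df["exit_rate"] = df["exits"] / df["pageviews"]

# One row per page, with the exit rate in each period side by side
rates = df.pivot(index="page", columns="period", values="exit_rate")
rates["change"] = rates["after"] - rates["before"]

# The pages with the biggest jump in exit rate are the ones worth checking
# for broken markup, misfiring JavaScript, or just thin content.
print(rates.sort_values("change", ascending=False).head(20))
```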
Don't be too worried, though, if you can't see anything standing out. Sometimes these things happen and recover - other times user behaviour simply changes. The methods above might point out faults you need to address or improvements you could make. But if your traffic and conversions are steady, pages per visit decreasing isn't such a bad thing. In fact, if conversions are steady (however you measure them), you could argue that users taking a shorter time (and journey) to convert are trusting your brand and website enough to make the decision earlier.
Hope this helps, Aggie!
-
Irving-
404 pages seem to be stable, as do referring sites, and there have been no navigation updates. I'm not sure about the GA tagging. I did not implement the tagging, so I wouldn't know where to look to find out.
Any guidance would be much appreciated.
Thanks,
Lisa
-
Did you update any navigation lately?
Are pages still properly tagged with analytics? (There's a quick check sketched below.)
Have you seen a spike in 404 pages?
Are visitors coming from the same places, or are you getting unqualified traffic?
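On the tagging point: a quick way to verify it is to fetch a handful of key pages and check that your tracking ID actually appears in the HTML. Here's a rough sketch - the URLs and the "UA-XXXXXXX" ID are placeholders, not your real values.

```python
import urllib.request

# Hypothetical page list and placeholder property ID -- replace both with your
# own URLs and the ID from your analytics tracking code.
PAGES = [
    "https://www.example.com/",
    "https://www.example.com/news/",
]
TRACKING_ID = "UA-XXXXXXX"

for url in PAGES:
    try:
        html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
        status = "tagged" if TRACKING_ID in html else "MISSING TAG"
    except Exception as exc:
        status = f"error: {exc}"
    print(f"{url}\t{status}")
```

Any page that comes back "MISSING TAG" simply won't record pageviews, which would drag reported pages per visit down without anything actually changing for users.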