What is a "good" dwell time?
-
I know there isn't any official documentation from Google about the exact number of seconds a user should spend on a site, but does anyone have any case studies that look at what might be a good "dwell time" to shoot for?
We're looking at integrating an exact time-on-site threshold into our Google Analytics metrics to count as a 'non-bounce'. So, for example, if a user spends 45 seconds on an article, we wouldn't count it as a bounce, since the reader likely read through all the content.
-
I have not seen any studies indicating such a thing
(but my guess is that dwell time is such a strong signal of relevance that Google would never release that info; I could be totally wrong though).
An idea to improve UX: if you have a page with 2 paragraphs of text, take the average time it takes 10 people in your office to read it and set the 'bounce' threshold accordingly. Then you'll know whether people are actually reading it.
If you have a page with 2,000 words, average that reading time the same way, etc.
If visitors bounce too soon, edit the text until the visitor average matches your office average. That would equal relevance, right?
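To make that calibration idea concrete, here's a minimal sketch. The 200 words-per-minute figure and the half-expected-time cutoff are illustrative assumptions you would replace with your own office averages, and the function names are hypothetical:

```typescript
// Estimate how long a page "should" take to read from its word count,
// then flag visits much shorter than that as likely bounces.

function expectedReadSeconds(wordCount: number, wordsPerMinute = 200): number {
  // e.g. a 2,000-word page at 200 wpm => 10 minutes => 600 seconds
  return Math.round((wordCount / wordsPerMinute) * 60);
}

function likelyBounce(timeOnPageSeconds: number, wordCount: number): boolean {
  // Visits shorter than half the expected reading time are flagged.
  // The 0.5 cutoff is arbitrary; tune it against real reader timings.
  return timeOnPageSeconds < expectedReadSeconds(wordCount) / 2;
}
```

Feeding in your measured office reading times instead of a generic words-per-minute constant would make the threshold page-specific, which is the point of the suggestion above.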
-
The answer to this is really going to depend on the page content, Micelleh. A simple page with a clear call to action could result in a user getting exactly what they want from the page within a few seconds and then leaving. A 350-word page might mean 45 seconds, but a 1,500-word page might need 2 minutes to prove a user actually got value.
At best, if you insist on a value, get several users to use a good number of your pages, record their on-page time, then create a site-specific average from that.
However, you might be even better off using events for this process, instead of something nebulous like dwell time.
You could add event tracking to the amount of the page a user scrolls, so that if they scroll more than half a page (for example), an "interactive" event triggers. "Interactive" events have the same effect as another pageview (without skewing your pageview metrics), so a single-page visit that scrolled at least halfway down the page would no longer be recorded as a bounce.
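As a rough sketch of how that scroll-based trigger could be wired up (the `makeScrollTracker` name, the `send` callback, and the 50% threshold are all illustrative placeholders, not a specific analytics API):

```typescript
// Fires a single "interaction" event once the reader has scrolled past a
// threshold fraction of the page. The send callback is where you would
// call your analytics library's event method.
type EventSender = (category: string, action: string) => void;

function makeScrollTracker(send: EventSender, threshold = 0.5) {
  let fired = false;
  // scrollTop: pixels scrolled from the top; viewport: visible height;
  // total: full document height. In a browser you would read these from
  // window/document inside a scroll listener.
  return function onScroll(scrollTop: number, viewport: number, total: number): void {
    if (fired || total <= viewport) return; // nothing to track on short pages
    const fraction = (scrollTop + viewport) / total;
    if (fraction >= threshold) {
      fired = true; // only count the first trigger per pageview
      send("engagement", "scrolled-half-page");
    }
  };
}
```

In a real page you would attach the returned handler to the window's scroll event and have `send` dispatch an interaction event to your analytics tool (in Universal Analytics, any event not flagged as non-interaction stops the visit counting as a bounce).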
You could also create interactive events for things like PDF downloads, form submissions, email sign-ups, video views, etc., whatever you consider appropriate for your site, to negate what would otherwise be counted as a bounce.
The biggest benefit to this events-based approach is that it would be vastly more accurate. It would track visitors' actual actions, as opposed to just assuming a dwell time meant a valuable interaction. (For example, we all know that the habit of opening multiple tabs at once for sequential reading significantly over-inflates time on page for many users.)
Perhaps that idea would work better for what you're trying to accomplish?
Paul
Related Questions
-
What is the "Homepage" for an International Website With Multiple Languages?
BACKGROUND: We are developing a new multi-language website that is going to have:
1. Multiple directories for various languages: /en-us, /de, etc.
2. Hreflang tags
3. Universal footer links so the user can select their preferred language
4. Automatic JS detection of location on the homepage only, so that when the user lands on /, it redirects them to the correct location. Currently, the auto JS detection only happens on /, and on no other pages of the website. The user can also always choose to override the auto-detection on the homepage at any time, by using the language-selector links at the bottom. QUESTION: Should we try to place a 301 on / to point to /en-us? Someone recommended this to us, but my thinking is "NO" - we do NOT want to 301 /. Instead, I feel like we should allow Google access to /, because that is also the most authoritative page on the website and where all incoming links are pointing. In most cases, users / journalists / publications IMHO are just going to link to /, not dilly-dally around with the language directory. My hunch is to keep / as is, but also work to help Google understand the relationship between all of the different language-specific directories. I know that Google officially doesn't advocate meta refresh redirects, but this only happens on the homepage, and we likewise allow the user to override it at any time (and again, universal footer links will point both search engines and users to all other locations). Thoughts? Thanks for any tips/feedback!
Intermediate & Advanced SEO | mirabile
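For illustration, the homepage-only detection described above could look roughly like this. The directory names (/en-us, /de) come from the question; the matching logic, function name, and default are hypothetical, and a real implementation would also honor the user's footer-link override:

```typescript
// Pick a language directory from an Accept-Language-style preference string,
// e.g. "de-DE,de;q=0.9,en;q=0.8". Falls back to a default when nothing matches.
const SUPPORTED = ["en-us", "de"];
const DEFAULT_DIR = "en-us";

function languageDirectory(acceptLanguage: string): string {
  for (const entry of acceptLanguage.toLowerCase().split(",")) {
    const tag = entry.split(";")[0].trim(); // drop the ;q= weight
    if (SUPPORTED.includes(tag)) return tag; // exact match, e.g. "en-us"
    const base = tag.split("-")[0]; // "de-DE" -> "de"
    const match = SUPPORTED.find(d => d === base || d.startsWith(base + "-"));
    if (match) return match;
  }
  return DEFAULT_DIR;
}
```

On /, the page script would redirect to `/` + `languageDirectory(navigator.language)` unless an override cookie is set; every other page would skip detection entirely, matching the setup the question describes.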
Site appears with ".com" but not without it
Hi, When I search for my site www.docslinc.com as "docslinc.com", the results in the SERPs show the home page and the sitemap but not the other indexed pages. The other issue occurs when I search for the company name alone, "docslinc": the homepage does not show up at all, and some of the other pages show up. I have looked all over the place and cannot find an answer. I have checked the on-site optimization and it all seems to be correct. Any suggestions would be amazing. Thanks, zulumanf
Intermediate & Advanced SEO | zulumanf
Google indexing "noindex" pages
One week ago my website expanded with a lot more pages. I included "noindex, follow" on a lot of these new pages, but then 4 days ago I saw the number of pages Google indexed increase. Should I expect these pages to be properly noindexed in 2-3 weeks, and is this just a delay? It seems odd to me that a few days after adding "noindex" to pages, Webmaster Tools shows an increase in indexing - that the pages were indexed, in other words. My website is relatively new and these new pages are not pages Google frequently indexes.
Intermediate & Advanced SEO | khi5
Use "If-Modified-Since HTTP header"
I'm working on an online Brazilian marketplace (like Etsy in the US) and we have a huge number of pages... I've been studying this a lot, and I was wondering about using If-Modified-Since so Googlebot could check whether pages have been updated; if a page hasn't, there is no reason to fetch a new copy, since Googlebot already has a current copy in the index. It uses a 304 status code: "if a search engine crawler sees a web page status code of 304, it knows that web page has not been updated and does not need to be accessed again." Someone quoted before me: "Since Google spiders billions of pages, there is no real need to use their resources or mine to look at a webpage that has not changed. For very large websites, the crawling process of search engine spiders can consume lots of bandwidth and result in extra cost," and Googlebot could spend more time on pages that actually changed, or on new stuff! However, I've checked Amazon, Rakuten, Etsy and a few other competitors, and none of them uses it! I'd love to know what you folks think about it 🙂
Intermediate & Advanced SEO | SeoMartin1
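The 304 handshake that question describes can be sketched as server-side logic. This is a minimal illustration of the conditional-GET rule, not how Googlebot or any particular server actually implements it; the function and type names are placeholders:

```typescript
// If the client (e.g. a crawler) sent If-Modified-Since and the resource
// hasn't changed since that date, answer 304 Not Modified with no body;
// otherwise answer 200 with the full page.
interface ConditionalResult {
  status: 200 | 304;
  body?: string;
}

function handleConditionalGet(
  lastModified: Date,             // when the page content last changed
  ifModifiedSince: string | null, // raw If-Modified-Since header, if any
  body: string
): ConditionalResult {
  if (ifModifiedSince) {
    const since = new Date(ifModifiedSince);
    // HTTP dates have one-second resolution, so compare at that granularity.
    if (!isNaN(since.getTime()) &&
        Math.floor(lastModified.getTime() / 1000) <= Math.floor(since.getTime() / 1000)) {
      return { status: 304 }; // Not Modified: the crawler reuses its cached copy
    }
  }
  return { status: 200, body };
}
```

The bandwidth saving in the quote comes from that empty 304 body: the server sends a handful of header bytes instead of the whole page.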
Webmaster Tools "Not found" errors after sitemap update
Hello Mozzers - I found a sitemap with loads of URL errors on it (none of the URLs on the sitemap actually existed), so I went ahead and updated the sitemap. Now I'm seeing a spike in "not found" errors in WMT. Is this normal / anything to worry about when you significantly change a sitemap? I've never replaced every URL on a sitemap before! L
Intermediate & Advanced SEO | McTaggart
Google Reconsideration - Denied for the Third Time
I have been in the process of trying to get past a "link scheme" penalty for just over a year. I took on the client in April 2012; they had received their penalty in February 2012, before I started. Since then we have been trying to manually remove links, contacting webmasters for link removal, blocking over 40 different domains via the disavow tool, and requesting reconsideration multiple times. All I get in return is "Site violates Google's quality guidelines." So we regrouped and did some more research, and found that about 90% of the offending spam links pointed to only 3 pages of the website, so we decided to just delete those pages, display a 404 error in their place, and create new pages with new URLs. At first everything was looking good: the new pages were ranking and receiving page authority, and the old pages were gone from the indexes. So we resubmitted for reconsideration for the third time and got the exact same response! I don't know what else to do. I did everything I could think of with the exception of deleting the whole site. Any advice would be greatly appreciated. Regards - Kyle
Intermediate & Advanced SEO | kchandler
What to do when a sidebar is causing a "Too Many On-Page Links" error
I have been going through all the errors and warnings from my weekly SEO Moz scans. One thing I'm seeing a bit of is "Too Many On-Page Links". I've only seen a few, but in the case of this one: http://blog.mexpro.com/5-kid-friendly-cancun-mexico-resorts there are only 2 links in the post itself (the image and the "read more"), so I think the sidebar links are causing the error. I feel my tags are important to help readers find information they may be looking for. Is there a better way to present tags than the WordPress tag cloud? Should I exclude the tags, at the risk of making things more difficult for my users? Thanks for your help.
Intermediate & Advanced SEO | RoxBrock
Are paid directories any good?
Hi Everyone, I have been listing our site in free directories. Are there any paid directories that are worth listing in? Our company and site are based in New Zealand. Thanks, Pete
Intermediate & Advanced SEO | Hardley1