What is a "good" dwell time?
-
I know there isn't any official documentation from Google about the exact number of seconds a user should spend on a site, but does anyone have any case studies that look at what might be a good "dwell time" to shoot for?
We're looking at integrating an exact time-on-site threshold into our Google Analytics metrics to count as a 'non-bounce'. So, for example, if a user spends 45 seconds on an article, we wouldn't count it as a bounce, since the reader likely read through all the content.
-
I have not seen any studies indicating such a thing
(but my guess is that dwell time is such a strong signal of relevance that Google would never release that info; I could be totally wrong, though).
An idea to improve UX: if you have a page with two paragraphs of text, take the average time it takes 10 people in your office to read it and set your 'non-bounce' time threshold accordingly. Then you'll know whether people are actually reading it.
If you have a page with 2,000 words, average that reading time, and so on.
If visitors bounce too soon, edit the text until your office average meets the visitor average. That would equal relevance, right?
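As a rough sketch of that idea, you could derive a per-page "engaged read" threshold from word count. The 230 words-per-minute reading speed below is an illustrative assumption, not anything Google publishes; calibrate it with real timings from people in your office, as suggested above.

```javascript
// Rough sketch: derive a per-page "engaged read" time threshold from word count.
// 230 wpm is an assumed average reading speed; replace it with the average you
// measure from real readers in your office.
const WORDS_PER_MINUTE = 230;

function readThresholdSeconds(wordCount, wpm = WORDS_PER_MINUTE) {
  // Seconds a typical reader needs to get through the copy.
  return Math.round((wordCount / wpm) * 60);
}

// Example: a short article vs. a long one.
console.log(readThresholdSeconds(350));  // → 91
console.log(readThresholdSeconds(2000)); // → 522
```

Any visit shorter than the page's threshold would then be treated as a bounce, and anything longer as an engaged read.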
-
The answer to this is really going to be dependent on the page content, Micelleh. A simple page with a clear call to action could result in a user getting exactly what they want from a page within a few seconds and then leaving. A 350 word page might mean 45 seconds, but a 1500 word page might need 2 minutes to prove a user actually got value.
At best, if you insist on a value, have several users work through a good number of your pages, record their on-page times, then create a site-specific average from that.
However, you might be even better off using events for this process, instead of something nebulous like dwell time.
You could add event tracking for how far down the page a user scrolls, and if they scroll more than half a page (for example), an "interactive" event triggers. Interactive events have the same effect as another pageview (without inflating your pageview metrics), so a single-page visit that scrolled at least halfway down the page would no longer be recorded as a bounce.
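A minimal sketch of that scroll-depth event, assuming the standard Universal Analytics `ga()` snippet is already loaded on the page (the category, action, and label names here are illustrative, not prescribed by Google Analytics):

```javascript
// Minimal scroll-depth sketch. Assumes the Universal Analytics ga() snippet
// is already on the page; event names are illustrative.
function scrolledFraction(scrollY, viewportHeight, pageHeight) {
  // How far down the page the bottom of the viewport has reached, from 0 to 1.
  return Math.min(1, (scrollY + viewportHeight) / pageHeight);
}

let fired = false;
if (typeof window !== 'undefined') {
  window.addEventListener('scroll', function () {
    const fraction = scrolledFraction(
      window.scrollY,
      window.innerHeight,
      document.documentElement.scrollHeight
    );
    if (!fired && fraction >= 0.5) {
      fired = true; // send at most once per pageview
      // An interaction event counts against bounce rate without
      // adding to your pageview totals.
      ga('send', 'event', 'Engagement', 'scroll-depth', '50%');
    }
  });
}
```

Because the event is sent as a normal (interaction) event rather than with `nonInteraction: true`, a visit that triggers it is no longer counted as a bounce.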
You could also create interactive events for things like PDF downloads, form submissions, email clicks, video views, and so on (whatever you consider appropriate for your site) to negate what would otherwise be counted as a bounce.
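For example, PDF downloads could be tracked with a click listener like the sketch below. Again this assumes the Universal Analytics `ga()` snippet is present, and the event names are made up for illustration:

```javascript
// Sketch: count PDF downloads as interaction events so they negate a bounce.
// Assumes the Universal Analytics ga() snippet is on the page; the event
// category and action names are illustrative.
function isPdfLink(href) {
  // Treat any link whose path ends in .pdf as a download.
  return /\.pdf($|[?#])/i.test(href || '');
}

if (typeof document !== 'undefined') {
  document.addEventListener('click', function (e) {
    const link = e.target.closest && e.target.closest('a');
    if (link && isPdfLink(link.href)) {
      ga('send', 'event', 'Downloads', 'pdf', link.href);
    }
  });
}
```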
The biggest benefit to this events-based approach is that it would be vastly more accurate. It would track visitors' actual actions, as opposed to just assuming a dwell time meant a valuable interaction. (For example, we all know that the habit of opening multiple tabs at once for sequential reading significantly over-inflates time on page for many users.)
Perhaps that idea would work better for what you're trying to accomplish?
Paul