What is a "good" dwell time?
-
I know there isn't any official documentation from Google about the exact number of seconds a user should spend on a site, but does anyone have any case studies that look at what might be a good "dwell time" to shoot for?
We're looking at integrating an exact time on site into our Google Analytics metrics to count as a 'non-bounce'. So, for example, if a user spends 45 seconds on an article, we wouldn't count it as a bounce, since the reader likely read through all the content.
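One common way to implement that kind of time threshold is a delayed "engagement" event: interaction events cancel a bounce in Google Analytics, so firing one after a set delay effectively creates an "adjusted bounce rate". A minimal sketch, assuming Universal Analytics' analytics.js is loaded as the global `ga` (the category, action, label, and the 45-second threshold are all illustrative choices, not official values):

```javascript
// Adjusted-bounce-rate sketch: after 45 seconds on the page, send an
// interaction event so the visit no longer counts as a bounce.
var DWELL_THRESHOLD_MS = 45 * 1000;

function sendDwellEvent() {
  // Interaction events (nonInteraction defaults to false) cancel a bounce.
  if (typeof ga === 'function') {
    ga('send', 'event', 'Engagement', 'dwell', '45 seconds or more');
  }
}

// Only arm the timer in a browser context.
if (typeof window !== 'undefined') {
  setTimeout(sendDwellEvent, DWELL_THRESHOLD_MS);
}
```

Note that this shifts your bounce metric's meaning site-wide, so it's worth annotating the change date in Analytics.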
-
I have not seen any studies indicating such a thing (my guess is that dwell time is such a strong signal of relevance that Google would never release that info, though I could be totally wrong).
An idea to improve UX: if you have a page with two paragraphs of text, take the average time it takes ten people in your office to read it and set the 'bounce' threshold accordingly. Then you'll know whether people are actually reading it.
If you have a page with 2,000 words, average that time, and so on.
If visitors bounce too soon, edit the text until your office average matches the visitor average. That would equal relevance, right?
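If timing office readers isn't practical, you can approximate the target with a words-per-minute estimate. A rough helper, assuming roughly 230 words per minute as an average silent-reading rate (that rate is an assumption on my part, not anything published by Google):

```javascript
// Rough reading-time estimate, in seconds, for a page of a given word count.
function estimatedReadingSeconds(wordCount, wordsPerMinute) {
  wordsPerMinute = wordsPerMinute || 230; // assumed average reading rate
  return Math.round((wordCount / wordsPerMinute) * 60);
}
```

A 2,000-word article at a slower 200 wpm, for instance, works out to about ten minutes, which gives you a ballpark threshold per page length.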
-
The answer to this really depends on the page content, Micelleh. A simple page with a clear call to action could give a user exactly what they want within a few seconds before they leave. A 350-word page might mean 45 seconds, but a 1,500-word page might need 2 minutes to prove a user actually got value.
At best, if you insist on a time value, have several users work through a good number of your pages, record their time on page, and create a site-specific average from that.
However, you might be even better off using events for this process, instead of something nebulous like dwell time.
You could add event tracking for how far down the page a user scrolls, and if they scroll more than half the page (for example), an "interactive" event fires. Interactive events count the same as another pageview (without skewing your pageview metrics), so a single-page visit that scrolled at least halfway down would no longer be recorded as a bounce.
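That scroll-depth event could be sketched like this, again assuming analytics.js is loaded as `ga` (the 50% threshold and the event names are placeholders to adapt to your site):

```javascript
// Fraction of the page the visitor has scrolled through (0 to 1).
function scrollFraction(scrollTop, viewportHeight, documentHeight) {
  var scrollable = documentHeight - viewportHeight;
  if (scrollable <= 0) return 1; // page fits in one screen
  return Math.min(1, scrollTop / scrollable);
}

var scrollEventFired = false;

function onScroll() {
  var fraction = scrollFraction(
    window.pageYOffset,
    window.innerHeight,
    document.documentElement.scrollHeight
  );
  if (!scrollEventFired && fraction >= 0.5) {
    scrollEventFired = true; // fire at most once per pageview
    ga('send', 'event', 'Engagement', 'scroll', '50 percent');
  }
}

if (typeof window !== 'undefined') {
  window.addEventListener('scroll', onScroll, { passive: true });
}
```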
You could also create interactive events for things like PDF downloads, form submissions, email clicks, video views, and anything else appropriate for your site, to negate what would otherwise be counted as a bounce.
The biggest benefit of this events-based approach is that it is vastly more accurate: it tracks visitors' actual actions, as opposed to just assuming a given dwell time meant a valuable interaction. (For example, we all know the habit of opening multiple tabs at once for sequential reading significantly inflates time on page for many users.)
Perhaps that idea would work better for what you're trying to accomplish?
Paul