What is a "good" dwell time?
-
I know there isn't any official documentation from Google about the exact number of seconds a user should spend on a site, but does anyone have any case studies that look at what might be a good "dwell time" to shoot for?
We're looking at integrating an exact time-on-site threshold into our Google Analytics metrics to count as a 'non-bounce'--so, for example, if a user spends 45 seconds on an article, we wouldn't count it as a bounce, since the reader likely read through all the content.
-
I have not seen any studies indicating such a thing.
(My guess is that dwell time is such a strong signal of relevance that Google would never release that info--I could be totally wrong, though.)
An idea to improve UX: if you have a page with two paragraphs of text, take the average time it takes ten people in your office to read it, and set the 'bounce' threshold accordingly. Then you'll know whether people are actually reading it.
If you have a page with 2,000 words, average that time the same way, and so on.
If visitors bounce too soon, edit the text until the visitor average meets your office average. That would equal relevance, right?
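The office reading-time idea above can be sketched as a quick calculation. This is a minimal sketch; the 225 words-per-minute figure is an assumed typical adult reading speed, not a Google number--calibrate it against your own office timings as described.

```javascript
// Estimate a page-specific "read" threshold from its word count.
// 225 wpm is an assumption (typical adult reading speed), not a
// published figure -- replace it with your measured office average.
function estimateReadSeconds(wordCount, wordsPerMinute = 225) {
  return Math.round((wordCount / wordsPerMinute) * 60);
}

console.log(estimateReadSeconds(350));  // 93 seconds for a ~350-word page
console.log(estimateReadSeconds(2000)); // 533 seconds for a 2,000-word page
```

A visit shorter than the page's estimated read time would then be treated as a bounce regardless of what the default analytics numbers say.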
-
The answer to this is really going to depend on the page content, Micelleh. A simple page with a clear call to action could mean a user gets exactly what they want within a few seconds and then leaves. A 350-word page might mean 45 seconds, but a 1,500-word page might need two minutes to prove a user actually got value.
At best, if you insist on a value, have several users work through a good number of your pages, record their on-page time, then create a site-specific average from that.
However, you might be even better off using events for this process, instead of something nebulous like dwell time.
You could add event tracking to the amount of the page a user scrolls, and if they scroll more than half a page (for example), an "interaction" event fires. Interaction events negate a bounce the same way another pageview would (without inflating your pageview metrics), so a single-page visit that scrolled at least halfway down the page would no longer be recorded as a bounce.
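To sketch what that scroll-depth tracking might look like: the snippet below assumes the classic analytics.js `ga()` global is loaded on the page, and the 50% threshold plus the 'Engagement'/'scroll-50' names are illustrative choices, not Google defaults. The send function is injected so the firing logic can be exercised outside a browser.

```javascript
// Fire an interaction event once per pageview when the visitor has
// scrolled past a depth threshold. `sendEvent` is injected; in
// production it would wrap ga('send', 'event', category, action).
function makeScrollTracker(sendEvent, threshold = 0.5) {
  let sent = false; // fire at most once per pageview
  return function onScroll(scrollTop, viewportHeight, pageHeight) {
    const depth = (scrollTop + viewportHeight) / pageHeight;
    if (!sent && depth >= threshold) {
      sent = true;
      // An interaction event (nonInteraction left false, the default)
      // tells GA the visit was not a bounce, without adding a pageview.
      sendEvent('Engagement', 'scroll-50');
    }
    return sent;
  };
}

// In the browser it might be wired up like this:
// const track = makeScrollTracker((cat, action) =>
//   ga('send', 'event', cat, action));
// window.addEventListener('scroll', () =>
//   track(window.scrollY, window.innerHeight, document.body.scrollHeight));
```

The same `sendEvent` wrapper could fire on PDF-link clicks or video plays, so all your bounce-negating events go through one place.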
You could also create interaction events for things like PDF downloads, form submissions, email clicks, video views, etc.--whatever you consider appropriate for your site--to negate what would otherwise be counted as a bounce.
The biggest benefit of this events-based approach is that it's vastly more accurate: it tracks visitors' actual actions, as opposed to just assuming a given dwell time meant a valuable interaction. (For example, we all know the habit of opening multiple tabs at once for sequential reading significantly inflates time on page for many users.)
Perhaps that idea would work better for what you're trying to accomplish?
Paul