Does Google Analytics Adjusted Bounce Rate Lead to an Increase in Average Time per Visitor?
-
Hello,
I just recently implemented adjusted bounce rate on one of the websites I track via Google Analytics. (http://searchenginewatch.com/article/2322974/How-to-Implement-Adjusted-Bounce-Rate-ABR-via-Google-Tag-Manager-Tutorial)
Since doing so, my bounce rate has obviously gone down significantly, to nearly half of what it used to be, but I've also noticed an increase in the average time per visitor. In fact, the increase in average time per visitor began the same day I implemented the adjusted bounce rate.
Has this happened to anyone else?
Can someone please explain why/how this may occur?
-
You are correct: adding code to a page to 'adjust' the bounce rate can affect your 'average time per visitor' statistic.
This is because of how Google Analytics measures the time spent on a page.
Normally, if a user opens one page and then does not visit any more pages on your site, it counts as a bounce (even if the user stayed on that page browsing for 10 minutes). That's because only one call is made to Google Analytics, when the page is opened; no call is made when the page is closed.
So normally, 'time on page' is calculated by taking the time stamp of when the current page is opened and comparing it to the time stamp of when the next page on your site is opened. The difference between the two is the 'time on (previous) page'.
So what happens when a user opens only one page on your site and leaves (bounces)? The visit is recorded as 0 seconds (even if the user was on the site for 10 minutes), which drags down the average visit time across all visits.
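The timestamp arithmetic above can be sketched as a small function (a simplified illustration, not Google's actual code; timestamps are seconds since the session started, and all names are hypothetical):

```javascript
// Derive per-page durations from the timestamps of the pageview hits.
// Each page's time is the gap until the NEXT pageview; the last (or only)
// page has no following hit, so its time is unknown and counts as 0 —
// which is why a bounce is recorded as a 0-second visit.
function timeOnPage(pageviewTimes) {
  var durations = [];
  for (var i = 0; i < pageviewTimes.length; i++) {
    var next = pageviewTimes[i + 1];
    durations.push(next === undefined ? 0 : next - pageviewTimes[i]);
  }
  return durations;
}

// A three-page session: pages opened at 0s, 40s and 100s.
console.log(timeOnPage([0, 40, 100])); // → [40, 60, 0]
// A bounce: one pageview, no later hit — recorded as 0 seconds.
console.log(timeOnPage([0]));          // → [0]
```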
What happens when you add the 'adjusted bounce rate' code to your page is that a second call is made to the Google server after x seconds, letting Google know the user has in fact remained on the page for an extended period of time. A whole batch of these 0-second (bounced) sessions is then converted into longer sessions, based on the time between the two time stamps.
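The linked tutorial wires this up via Google Tag Manager; as a minimal standalone sketch of the underlying idea (the event names are hypothetical, and `ga` here is a stub that records hits so the sketch runs on its own — on a real page it comes from the standard analytics.js tracking snippet):

```javascript
// Stub of the analytics.js command queue, for illustration only.
var hits = [];
function ga() { hits.push([].slice.call(arguments)); }

// Sending any interaction event tells Google Analytics the visitor is still
// there, so the session no longer counts as a bounce, and the event's
// timestamp extends the measured time on page.
function sendEngagementHit(delaySeconds) {
  ga('send', 'event', 'Engagement', 'Time on page', 'Over ' + delaySeconds + ' seconds');
}

// Fire the engagement hit once the visitor has stayed delayMs milliseconds.
function installAdjustedBounceRate(delayMs) {
  setTimeout(function () { sendEngagementHit(delayMs / 1000); }, delayMs);
}

// On a live page you would call, e.g.: installAdjustedBounceRate(30 * 1000);
```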
The more 'one page only' visits you have to your site, the more this has the potential to skew your average session time.
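To see how much this can move the average, a toy example with invented numbers: ten sessions, seven of them bounces, where the second hit fires at 30 seconds.

```javascript
function average(xs) {
  return xs.reduce(function (sum, x) { return sum + x; }, 0) / xs.length;
}

// Without ABR, the seven bounces count as 0 seconds each.
var withoutAbr = [0, 0, 0, 0, 0, 0, 0, 120, 90, 60];
// With ABR, each bounce that lasted past the threshold counts as (at least) 30s.
var withAbr    = [30, 30, 30, 30, 30, 30, 30, 120, 90, 60];

console.log(average(withoutAbr)); // → 27 seconds
console.log(average(withAbr));    // → 48 seconds
```

Same visitors, same behaviour; only the measurement changed, yet the average session time jumped 78%.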
On a side note, this also affects the last page visited in multi-page sessions, since Google normally has no way of knowing how much time was spent on the last page of the site either.