Google User Click Data and Metrics
-
Assuming that Google is using click data from users to calculate rankings (bounce rate, time on site, task completion, etc.) where does Google get the data, especially from browsers that aren't Chrome?
-
This was an example with GA. I believe they use dwell time and next or subsequent searches for this.
Google can't fight shopping cart abandonment and similar issues for you, so they keep benchmarks to compare sites against each other. If your metrics are above average for your industry, great; if your metrics are weak, you're in trouble. You can see benchmarking in Google Analytics, so whatever you do, try to beat those numbers. Example: I just saw that some of my sites have 1.40 pages/session vs. 2.99 in the benchmark, and my session duration is 1:32 vs. 2:19 in the benchmark.
Similar metrics exist in PPC too: you need to be above average to get better positions, prices, and conversions.
I know this explanation can sound a little messy... but this is the question all SEO specialists think about these days. If you knew the answers, you could become a millionaire and retire quickly.
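The benchmark comparison described above can be sketched as a simple check. This is only an illustration of the idea: the metric names are hypothetical, and the numbers are the example figures from the answer (1.40 vs. 2.99 pages/session, 1:32 vs. 2:19 session duration).

```python
def parse_duration(mmss: str) -> int:
    """Convert an Analytics-style 'M:SS' session duration into seconds."""
    minutes, seconds = mmss.split(":")
    return int(minutes) * 60 + int(seconds)

def weak_metrics(site: dict, benchmark: dict) -> list:
    """Return the names of metrics where the site trails its industry benchmark."""
    return [name for name, value in site.items() if value < benchmark[name]]

# Example numbers taken from the answer above
site = {"pages_per_session": 1.40, "session_duration": parse_duration("1:32")}
benchmark = {"pages_per_session": 2.99, "session_duration": parse_duration("2:19")}

print(weak_metrics(site, benchmark))  # both metrics trail the benchmark here
```

With the example figures, both metrics come back as weak, which matches the "you're in trouble" case described above.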
-
But how does Google measure form completions or purchases for rankings?
Again, I'm not talking about Google analytics. We use it heavily for our ecommerce sites. I know how the UA tracking code works. Google claims that they don't use GA data for rankings, and I would tend to believe them.
-
That's tricky. There are a lot of theories about Analytics, Chrome, AI, RNNs, etc. Of course, there's a lot of speculation too!
BUT here Josh Bachynski explains that task completion is correlated with user metrics: time on session, bounce rate, and average pages per session. There are others too; please note the subsequent searches in my previous answer. So in theory, sites with better time on site and less bouncing are considered high quality. You can also check Josh's other videos on YouTube, where he explains this many times.
One of the easiest ways to track task completion is to add goals in Analytics and/or event tracking. Goals can be different things: a contact form filled, a lead form filled, a software download, a whitepaper request, a signup form, playing a video, etc. Events can be things like: comments viewed, gallery viewed, video stopped, etc. Then you can see how many of your visitors complete tasks and how many trigger events. This gives you your own assurance that they're inside the page and doing something there.
The trick is that Google will only use SERP visitors and their metrics. I can have a site with 20k visitors daily from Facebook/Twitter and only 200 from the Google SERP. I'm not saying those 20k visitors are bad, but they're almost useless for any click test. Things would be different if we had 20k daily from the SERP and 200 from Facebook/Twitter.
So, whatever you do, when you receive SERP traffic, keep it on the site. That's the higher priority for better rankings.
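A minimal sketch of how a dwell-time signal like the one described above might be classified, assuming only the click-out and return-to-SERP timestamps are observable. The 30-second threshold and the labels are hypothetical; nothing here is a confirmed Google implementation.

```python
from dataclasses import dataclass
from typing import Optional

SHORT_CLICK_SECONDS = 30  # hypothetical threshold for a "bad" dwell time

@dataclass
class SerpClick:
    url: str
    clicked_at: float                    # when the user left the SERP
    returned_at: Optional[float] = None  # when (if ever) they came back

def classify(click: SerpClick) -> str:
    """Label a SERP click by dwell time, per the theory in the answer above."""
    if click.returned_at is None:
        return "no-return"  # the user never bounced back: a positive signal
    dwell = click.returned_at - click.clicked_at
    return "short-click" if dwell < SHORT_CLICK_SECONDS else "long-click"

print(classify(SerpClick("https://moz.com/", 0.0, 12.0)))      # short-click
print(classify(SerpClick("https://en.wikipedia.org/", 0.0)))   # no-return
```

In this sketch, a quick bounce back to the SERP produces the negative "short-click" signal, while never returning reads as the visitor staying on the site.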
-
Thanks for the answer. Spot on.
There's been a lot of speculation on "task completion" and how it relates to ranking. If completing a task is a purchase on an ecommerce site, how is Google measuring it? Is it only through Chrome or by some other means?
How does Google measure when someone completes a form?
Is that possible, or is Google just checking to make sure that the cart and the form work correctly? Was that the point of the "Zombie" update?
-
If you remember, about five years ago all URLs in the SERP were unencrypted, and a lot of tools used this to capture "keywords" and link them to pages. Google introduced encrypted search in 2010 and rolled it out over the following few years; today the only way to see keywords is in Search Console. Of course, the encryption is "to improve your search quality and to provide better service." The original text can be seen here. Please note "provide better service" there. This is the tricky part!
So imagine that you search for moz and here is actual URL i can see now:
https://www.google.bg/search?q=moz&ie=utf-8&oe=utf-8&gws_rd=cr&ei=4wNVVpnZBYGoUZiXh4AG
You can definitely see the keyword there in ?q=moz. Now, the first result is Moz.com, and its URL is:
https://www.google.bg/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&cad=rja&uact=8&ved=0ahUKEwjenMnUq6rJAhXIRBQKHXuVCcUQFggfMAA&url=https%3A%2F%2Fmoz.com%2F&usg=AFQjCNHNW83KUfvLcZOMILlYW49NobxUig&sig2=nOVvQ05KIPrGB3XFAFmIGg
As you can clearly see, there isn't a keyword anymore; everything comes as encrypted data (ved, usg, sig2). This /url link is an actual redirector that counts your click on a specific result and position. Now, if I click on the 1st result and land on Moz.com, I can scroll down and decide "this isn't the Moz I'm looking for," so within some time (a few seconds) I return to the SERP. This is the actual "dwell time" and a bounce back to the SERP. It's a negative signal, because it shows Google, with human verification, that the result it returned in first place wasn't right. Now, back on the same SERP, I can see Moz on Wikipedia:
https://www.google.bg/url?sa=t&rct=j&q=&esrc=s&source=web&cd=19&cad=rja&uact=8&ved=0ahUKEwjenMnUq6rJAhXIRBQKHXuVCcUQFghhMBI&url=https%3A%2F%2Fen.wikipedia.org%2Fwiki%2FMoz_(marketing_software)&usg=AFQjCNGCgqmsKNIdaZdGrbugf8bJk6NhTg&sig2=jS-vt68NFtD5YhgSV4lTGw
If I click this and never return to the SERP, that gives Google enough to calculate a bounce rate for this site (counting only returns to the SERP), and so credits Wikipedia with some "goal completion." The time until the next search can be used to calculate "time on site." And since all searches are encrypted, Google knows when a specific user searches for something and when they make a new search based on the results already returned. An example is "Napoleon." This can be anything: the French emperor, a movie, a cake, a drink, and other things. So now I can do the subsequent search "Napoleon height." This is an example of how one search can give me enough information to do another, refined search. Another good example is "32nd US president," after which I can type "franklin d roosevelt height."
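The /url redirector parameters shown above can be pulled apart with the standard library. This is a hypothetical sketch of the kind of click record such a redirect could yield; the readings of `cd` as result position and `url` as destination are inferred from the example links, and `ved` is simply treated as the opaque encrypted token the answer mentions.

```python
from urllib.parse import urlparse, parse_qs

def parse_redirect(redirect_url: str) -> dict:
    """Extract destination and result position from a Google /url redirect link."""
    params = parse_qs(urlparse(redirect_url).query)
    return {
        "destination": params["url"][0],   # real landing page (percent-decoded)
        "position": int(params["cd"][0]),  # rank of the clicked result (inferred)
        "ved": params["ved"][0],           # opaque encrypted click token
    }

click = parse_redirect(
    "https://www.google.bg/url?sa=t&cd=1"
    "&ved=0ahUKEwjenMnUq6rJAhXIRBQKHXuVCcUQFggfMAA"
    "&url=https%3A%2F%2Fmoz.com%2F"
)
print(click["destination"], click["position"])  # https://moz.com/ 1
```

Logging such records per query would be enough to tie a click to a specific result and position, which is exactly why the keyword itself no longer needs to appear in the URL.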
This was explained much better in the closing MozCon 2015 presentation, "SEO in a Two Algorithm World":
http://www.slideshare.net/randfish/onsite-seo-in-2015-an-elegant-weapon-for-a-more-civilized-marketer
You should see it. There are also a few tests shown inside, with terrific results.
-
I guess I should have phrased the question a little differently. This is not related to Google Analytics.
When I do a Google search, Google is able to track my actions, and is probably using the data as a ranking factor. Josh Bachynski did a Whiteboard Friday on it.
https://moz.com/blog/panda-41-google-leaked-dos-and-donts-whiteboard-friday
How is Google able to track user actions after they click on a SERP listing? Where are they getting their data?
-
Here is a good explanation.