Help on Page Load Time
-
I'm trying to track the page load time of visits on my site, but GA reports it as zero, and the page load sample is always zero too.
I've done some research and found that GA is supposed to track page load time automatically. Isn't that right?
-
I would definitely make sure you look at what Google is saying: https://developers.google.com/analytics/devguides/collection/gajs/methods/gaJSApiBasicConfiguration#gat.GA_Tracker._setSiteSpeedSampleRate
First, make sure your Google Analytics code is the newest version (the asynchronous tracking code); it does make a difference in speed.
If you want to track your website's loading speed from certain areas, or get a general idea of what you can do to speed it up, I strongly recommend http://tools.pingdom.com/fpt/ or http://www.uptrends.com/aspx/free-html-site-page-load-check-tool.aspx. Both will allow you to check your site's load time from different areas in the United States and around the world.
If you want your site to load faster than it does now and you're using WordPress, I would recommend a different host, somebody like WPengine.com or http://page.ly/. If you are using any other kind of website, you can use a content delivery network; somebody like http://www.akamai.com/ does a great job, and I also use http://www.limelight.com/website-application-acceleration/.
For a more complete look at your website's load speed and analytics, I would recommend Adobe's Omniture (http://www.omniture.com/en/). They are obviously more expensive than the free suite from Google, but I believe you will find that you get what you pay for.
I also want to bring up KISSmetrics analytics: they are only $30 a month and will allow you to track the speed of particular individuals. Here is a bit of information on the subject from their blog, http://blog.kissmetrics.com/speed-is-a-killer/, and from their main page you can sign up for a free month trial: https://www.kissmetrics.com/new_feature
Here is Google's advice on what to do:
_setSiteSpeedSampleRate(sampleRate)
Defines a new sample set size for Site Speed data collection. By default, a fixed 1% sampling of your site visitors make up the data pool from which the Site Speed metrics are derived. If you have a relatively small number of daily visitors to your site, such as 100,000 or fewer, you might want to adjust the sampling to a larger rate. This will provide increased granularity for page load time and other Site Speed metrics. (See Site Speed in the Help Center for details about the Site Speed reports.)
The _setSiteSpeedSampleRate() method must be called prior to _trackPageview() in order to be effective.
Analytics restricts Site Speed collection hits for a single property to the greater of 1% of visitors or 10K hits per day in order to ensure an equitable distribution of system resources for this feature.
Note: We strongly encourage sites with greater than 1 million hits per day to keep their sample selection set to the default 1% rate. Adjusting the sample size to a larger number will not increase your sample size.
Async Snippet (recommended)
_gaq.push(['_setSiteSpeedSampleRate', 5]);
_gaq.push(['_trackPageview']);
Traditional (ga.js) Snippet
pageTracker._setSiteSpeedSampleRate(5);
pageTracker._trackPageview();
Parameters
Number sampleRate
Value between 0 - 100 to define the percentage of visitors to your site that will be measured for Site Speed purposes. For example, a value of 5 sets the Site Speed collection sample to 5%.
I hope I was of help to you and wish you luck with this.
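Putting the quoted pieces together, a minimal sketch of the command queue from the async snippet with a 5% Site Speed sample might look like this (the property ID is a placeholder, and the standard ga.js loader script tag is omitted):

```javascript
// Async GA command queue. _setSiteSpeedSampleRate must be pushed
// *before* _trackPageview for the sample rate to take effect.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXX-1']);  // placeholder property ID
_gaq.push(['_setSiteSpeedSampleRate', 5]);  // measure 5% of visitors for Site Speed
_gaq.push(['_trackPageview']);
```

Each `push` queues a command that ga.js replays once it loads, which is why the ordering in the queue is what matters.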
Sincerely,
Thomas Zickell
QuiZick Internet Marketing
-
Thank you, but it seems that GA is not tracking page load time at all.
I use the new asynchronous tracking code; it's supposed to do this automatically, right?
Sometimes it shows one or two results, but almost everything is zero.
It's confusing me.
-
New tracking codes set this up by default, but if this is an old code it will need enabling. From Google's documentation:
Previous versions of Site Speed required a tracking code change to add _trackPageLoadTime. Sites with the deprecated call will still collect speed data at the 10% sampling rate. However, this call will be ignored in the future, and the sample rate will fall to the default 1%. Consider updating your tracking code to use _setSiteSpeedSampleRate() to set a higher sampling rate.
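As a sketch of that migration (assuming the async ga.js snippet; the property ID is a placeholder), the deprecated call would be replaced like this, with 10 chosen simply to mirror the old 10% sampling rate:

```javascript
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXX-1']);   // placeholder property ID

// Deprecated approach (will be ignored in the future):
// _gaq.push(['_trackPageview']);
// _gaq.push(['_trackPageLoadTime']);

// Current approach: drop _trackPageLoadTime and set the
// sample rate before _trackPageview instead.
_gaq.push(['_setSiteSpeedSampleRate', 10]);  // keep the old 10% sampling rate
_gaq.push(['_trackPageview']);
```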
Hope this helps
Related Questions
-
Website and landing pages - Proportionate authority
Does the website's (homepage) ranking influence landing page rankings, or vice versa? If the homepage ranks well for a "keyword", will that improve the rankings of other landing pages which are optimised for related "keywords", and vice versa?
Algorithm Updates | vtmoz
-
Puzzling Penalty Question - Need Expert Help
I'm turning to the Moz Community because we're completely stumped. I actually work at a digital agency, our specialism being SEO. We've dealt with Google penalties before and have always found it fairly easy to identify the source of the problem when someone comes to us with a sudden keyword/traffic drop. I'll briefly outline what we've experienced:
We took on a client looking for SEO a few months ago. They had an OK site, with a small but high-quality and natural link profile, but very little organic visibility. The client is an IT consultancy based in London, so there's a lot of competition for their keywords. All technical issues on the site were addressed, pages were carefully keyword targeted (obviously not in a spammy way), and on-site content, such as services pages, which were quite thin, were enriched with more user-focused content. Interesting, shareable content was starting to be created and some basic outreach work had started.
Things were starting to pick up. The site started showing and growing for some very relevant keywords in Google, a good range at different levels (mostly sitting around pages 3-4) depending on competition. Local keywords, particularly, were doing well, with a good number sitting on pages 1-2. The keywords were starting to deliver a gentle stream of relevant traffic, and user behaviour on-site looked good.
Then, as of the 28th September 2015, it all went wrong. Our client's site virtually dropped from existence as far as Google was concerned. They literally lost all of their keywords. Our client even dropped hundreds of places for their own brand name. They also lost all rankings for super-low-competition, non-business terms they were ranking for.
So, there's the problem. The keywords have not shown any sign of recovery at all yet and we're, understandably, panicking. The worst thing is that we can't identify what has caused this catastrophic drop. It looks like a Google penalty, but there's nothing we can find that would cause it.
There are no messages or warnings in GWT. The link profile is small but high quality. When we started, the content was a bit on the thin side, but this doesn't really look like a Panda penalty, and seems far too severe. The site is technically sound. There are no duplicate content issues or plagiarised content. The site is being indexed fine. Moz gives the site a spam score of 1 (out of 11, I think). The site is on an OK server, which hasn't been blacklisted or anything.
We've tried everything we can to identify a problem. And that's where you guys come in. Any ideas? Has anyone seen anything similar around the same time? Unfortunately, we can't share our client's site name/URL, but feel free to ask any questions you want and we'll do our best to provide info.
Algorithm Updates | MRSWebSolutions
-
How can I check Google's page cache?
Hi, I used to have a handy tool in Firefox (Google Toolbar) that was very handy for checking page ranks and the date a page had been cached. With the newer versions of Firefox I cannot seem to locate this useful tool. Can anybody recommend any useful tools for checking the above? Thanks, Adam
Algorithm Updates | AMG100
-
Organic listing & map listing on 1st page of Google
Hi, back then, a company could get multiple listings in a SERP: one in the Google Maps area and a homepage or internal pages from the organic search results. Lately, though, I've noticed that Google is now putting the maps and organic listings together. This observation has been confirmed by a couple of SEO people, and I thought it made sense, but one day I stumbled on the KWP "bmw dealership phoenix" and saw that www.bmwnorthscottsdale.com has separate listings for Google Places and organic results. Any idea how this company did this? Please see the attached image.
Algorithm Updates | ao500000
-
Double Listings On Page One
I've been noticing a trend over the past month and a half. My sites that used to get more than one page listed in certain SERPs are now being adjusted. It almost looks manual, but I know it is most likely a change in the algorithm. Let's say I had a SERP where my site was showing two different sub-pages at #4 and #6; now one page is being pushed up to #3 but the other page is being pushed back past the first page. I'm not worried about penalisation or loss of value; I have been seeing this across many of my clients' sites. I just wanted to confirm that others were seeing it as well (so I'm not going crazy) and/or whether Google has made any announcements or leaks regarding this shift. Maybe it's just my sites coming of age or something, but I would love to be able to explain it more knowledgeably than with a "Google might be doing this". BTW, this is not affecting any of my brand SERPs.
Algorithm Updates | BenRWoodard
-
I need help with drastic SERP difference between Bing and Google
One of our sites, which has been around for a couple of years, has about 60,000 pages indexed on Google; however, Bing only shows 90 pages for the site. This same phenomenon has been happening across the board for our sites. Any ideas on how to improve our indexing results in Bing?
Algorithm Updates | atuomala
-
Google showing different pages for same search term in uk and usa
Hi guys, I have an interesting question and think Google is being a bit strange. Can anyone tell me why, when I input the term "design agency" in Google.co.uk, it shows one page, but when I type the same search term into Google.com (worldwide search) it shows another page? Any ideas, guys? Is this not a bit strange? Any help here would be much appreciated. Thanks, Gareth
Algorithm Updates | GAZ09
-
Yahoo/Bing cache date went back in time
Within 12 hours of submitting a new site to Yahoo/Bing Webmaster Tools it was ranking #3 for the primary homepage search term and in the top 5 for about a dozen others. On 7/23 the rankings were steady or climbing, with the most recent cache date of 7/21. Now the site only comes up when searching for the domain name, with a cache date of 7/11. I launched the site about 14 days ago, so I am not expecting results yet, but I had never seen this happen, so I am just curious whether anyone else has.
Algorithm Updates | jafabel