Help on Page Load Time
-
I'm trying to track the page load time of visits on my site, but GA keeps reporting it as zero, and the page load sample is always zero too.
I've done some research and found that GA is supposed to track page load time automatically. Isn't that right?
-
I would definitely make sure you look at what Google says here: https://developers.google.com/analytics/devguides/collection/gajs/methods/gaJSApiBasicConfiguration#gat.GA_Tracker._setSiteSpeedSampleRate
First, check that your Google Analytics code is the newest version, the asynchronous tracking code; it does make a difference in speed. If you want to track your website's loading speed from certain areas, or get a general idea of what you can do to speed it up, I strongly recommend http://tools.pingdom.com/fpt/ or http://www.uptrends.com/aspx/free-html-site-page-load-check-tool.aspx. Both will let you check the site's load time from different locations in the United States and around the world.
If you want your site to load faster than it does now and you're using WordPress, I would recommend a specialized host such as WPengine.com or http://page.ly/. If you're running any other kind of website, you can use a content delivery network; http://www.akamai.com/ does a great job, and I also use http://www.limelight.com/website-application-acceleration/.
For a more complete look at your website's load speed and analytics, I would recommend Adobe's Omniture http://www.omniture.com/en/. They are obviously more expensive than the free suite from Google, but I believe you will find you get what you pay for. I also want to bring up KISSmetrics: at $30 a month, it will let you track the speed of particular individuals. Here is some information on the subject from their blog, http://blog.kissmetrics.com/speed-is-a-killer/, and you can sign up for a free month's trial at https://www.kissmetrics.com/new_feature
Here is Google's advice on what to do:
_setSiteSpeedSampleRate(sampleRate)
Defines a new sample set size for Site Speed data collection. By default, a fixed 1% sampling of your site visitors make up the data pool from which the Site Speed metrics are derived. If you have a relatively small number of daily visitors to your site, such as 100,000 or fewer, you might want to adjust the sampling to a larger rate. This will provide increased granularity for page load time and other Site Speed metrics. (See Site Speed in the Help Center for details about the Site Speed reports.)
The _setSiteSpeedSampleRate() method must be called prior to _trackPageview() in order to be effective. Analytics restricts Site Speed collection hits for a single property to the greater of 1% of visitors or 10K hits per day in order to ensure an equitable distribution of system resources for this feature.
Note: We strongly encourage sites with greater than 1 million hits per day to keep their sample selection set to the default 1% rate. Adjusting the sample size to a larger number will not increase your sample size.
Async Snippet (recommended)
_gaq.push(['_setSiteSpeedSampleRate', 5]);
_gaq.push(['_trackPageview']);
Traditional (ga.js) Snippet
pageTracker._setSiteSpeedSampleRate(5);
pageTracker._trackPageview();
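For context, here is a minimal sketch of how those calls fit into a complete async snippet on a page. The UA-XXXXX-X property ID is a placeholder, and the rate of 5 follows Google's example above:

```javascript
// Minimal ga.js async snippet sketch (UA-XXXXX-X is a placeholder ID).
// Commands pushed onto the _gaq queue run in order once ga.js loads,
// so _setSiteSpeedSampleRate must be pushed before _trackPageview.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-X']);
_gaq.push(['_setSiteSpeedSampleRate', 5]); // measure 5% of visitors for Site Speed
_gaq.push(['_trackPageview']);

// Load ga.js asynchronously so it does not block page rendering
// (guarded so the queue logic above can run outside a browser too).
if (typeof document !== 'undefined') {
  (function () {
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' === document.location.protocol ? 'https://ssl' : 'http://www') +
             '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
}
```

Until ga.js finishes loading, `_gaq` is just a plain array, which is why the order of the pushes is what matters.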
Parameters
Number sampleRate
A value between 0 and 100 defining the percentage of visitors to your site that will be measured for Site Speed purposes. For example, a value of 5 sets the Site Speed collection sample to 5%.
I hope I was of help to you and wish you luck with this.
Sincerely,
Thomas Zickell
QuiZick Internet Marketing
-
Thank you, but it seems that GA is not tracking page load time at all.
I use the new asynchronous tracking code, and it's supposed to do this automatically, right?
Sometimes it shows one or two results, but almost everything is zero.
It's very confusing.
-
New tracking codes set this up by default, but if yours is an older code, it will need enabling. Google's documentation explains:
Previous versions of Site Speed required a tracking code change to add _trackPageLoadTime. Sites with the deprecated call will still collect speed data at the 10% sampling rate. However, this call will be ignored in the future, and the sample rate will fall to the default 1%. Consider updating your tracking code to use _setSiteSpeedSampleRate() to set a higher sampling rate.
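As a sketch of that update, assuming your old snippet still pushes the deprecated _trackPageLoadTime command (UA-XXXXX-X is a placeholder ID): drop the deprecated call and push _setSiteSpeedSampleRate before the pageview instead.

```javascript
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-X']);

// Old, deprecated approach -- speed data was sampled at a fixed 10%:
//   _gaq.push(['_trackPageview']);
//   _gaq.push(['_trackPageLoadTime']);

// Current approach: set the Site Speed sample rate explicitly, and do it
// *before* the pageview is tracked or the setting has no effect.
_gaq.push(['_setSiteSpeedSampleRate', 10]); // keep the old 10% sampling
_gaq.push(['_trackPageview']);
```

Using 10 here simply preserves the sampling rate the deprecated call gave you; any value from 0 to 100 works.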
Hope this helps