Help on Page Load Time
-
I'm trying to track the page load time of visits on my site, but GA reports it as zero, and the page load sample is always zero too.
I've done some research and found that GA is supposed to track page load time automatically. Is that right?
-
I would definitely start by looking at what Google says about this: https://developers.google.com/analytics/devguides/collection/gajs/methods/gaJSApiBasicConfiguration#gat.GA_Tracker._setSiteSpeedSampleRate
If your Google Analytics code is the newest version (the asynchronous tracking code), that does make a difference in what gets measured.
If you want to test your site's load speed from different regions, or get a general idea of what you can do to speed it up, I strongly recommend http://tools.pingdom.com/fpt/ or http://www.uptrends.com/aspx/free-html-site-page-load-check-tool.aspx. Both let you check load times from different locations in the United States and around the world.
If you want your site to load faster and you're using WordPress, I would recommend a managed host such as WPengine.com or http://page.ly/. If you're on any other kind of site, a content delivery network can help; http://www.akamai.com/ does a great job, and I also use http://www.limelight.com/website-application-acceleration/.
For a more complete look at your website's load speed and analytics, I would recommend Adobe's Omniture (http://www.omniture.com/en/). They are obviously more expensive than the free suite from Google, but I believe you'll find you get what you pay for. KISSmetrics is also worth mentioning: at $30 a month it lets you track the speed experienced by individual visitors. Here is some background on the subject from their blog, http://blog.kissmetrics.com/speed-is-a-killer/, and you can sign up for a free month's trial at https://www.kissmetrics.com/new_feature
Here is Google's advice on what to do:
_setSiteSpeedSampleRate(sampleRate)
Defines a new sample set size for Site Speed data collection. By default, a fixed 1% sampling of your site visitors make up the data pool from which the Site Speed metrics are derived. If you have a relatively small number of daily visitors to your site, such as 100,000 or fewer, you might want to adjust the sampling to a larger rate. This will provide increased granularity for page load time and other Site Speed metrics. (See Site Speed in the Help Center for details about the Site Speed reports.)
The _setSiteSpeedSampleRate() method must be called prior to _trackPageview() in order to be effective. Analytics restricts Site Speed collection hits for a single property to the greater of 1% of visitors or 10K hits per day in order to ensure an equitable distribution of system resources for this feature.
Note: We strongly encourage sites with greater than 1 million hits per day to keep their sample selection set to the default 1% rate. Adjusting the sample size to a larger number will not increase your sample size.
Async Snippet (recommended)
_gaq.push(['_setSiteSpeedSampleRate', 5]);
_gaq.push(['_trackPageview']);
Traditional (ga.js) Snippet
pageTracker._setSiteSpeedSampleRate(5);
pageTracker._trackPageview();
Parameters
Number sampleRate
A value between 0 and 100 defining the percentage of visitors to your site that will be measured for Site Speed purposes. For example, a value of 5 sets the Site Speed collection sample to 5%.
I hope I was of help to you, and wish you luck with this.
Sincerely,
Thomas Zickell
QuiZick Internet Marketing
-
Thank you, but it seems that GA is not tracking page load time at all.
I use the new asynchronous tracking code; it's supposed to do this automatically, right?
Sometimes it shows one or two results, but almost everything is zero.
It's confusing me.
-
The new tracking code enables this by default, but if you're running an older snippet it will need enabling. From Google's documentation:
Previous versions of Site Speed required a tracking-code change to add _trackPageLoadTime. Sites with the deprecated call will still collect speed data at the 10% sampling rate. However, this call will be ignored in the future, and the sample rate will fall to the default 1%. Consider updating your tracking code to use _setSiteSpeedSampleRate() to set a higher sampling rate.
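If you have one of those older snippets, the upgrade is small. Here is a hedged sketch of the change, with the deprecated call left in as a comment for comparison; the property ID is a placeholder, and the 10% rate is chosen only to match what the deprecated call used to collect.

```javascript
// Upgrading a legacy snippet: replace the deprecated _trackPageLoadTime
// call with _setSiteSpeedSampleRate, pushed before _trackPageview.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-Y']); // hypothetical property ID

// Old (deprecated) call — collected speed data at a fixed 10% sample:
// _gaq.push(['_trackPageLoadTime']);

// Current call — any rate from 0 to 100; 10 here mirrors the old behavior:
_gaq.push(['_setSiteSpeedSampleRate', 10]);
_gaq.push(['_trackPageview']);
```

With this in place, the Site Speed reports should start showing non-zero load times once enough sampled visits accumulate.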
Hope this helps