Google's weighting of Page Load speed
-
Hey, we recently optimised our website's page load speed as part of our overall optimisation work.
According to the Google Developers PageSpeed test, our page load score was previously 51 out of 100 and is now 92 out of 100. The improvement was rolled out gradually over the last seven days, with caching enabled halfway through the process.
I appreciate this is regarded as only a small part of the whole process; however, I'd be interested to know if anyone has a concrete opinion, or proof, on whether such a big improvement would actually make a difference to our rankings in the SERPs.
-
Thanks Alex, nice share.
-
Matt Cutts stated that only in about 1 out of 100 searches does page speed produce a noticeable change in rankings, and only around 1 in 1,000 sites has slow speed as a big enough issue to matter: www.youtube.com/watch?v=SO4YuDAkplU
I have a page on one website with on-page SEO close to perfection for a relatively competitive term, and a few relevant links coming in. A SQL query on the page slowed it down - the load time was about 4 or 5 seconds. After a few weeks (it wasn't an important page, and I knew it would make an interesting test) I fixed the query and got the load time down to around a second. The following day the page jumped from nowhere (not even in the top 20 pages) to page 4 on Google, so it must have been one of the 1 in 100.
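The post doesn't say what the slow query actually was, but an un-indexed lookup is a common culprit, and the fix can be a single `CREATE INDEX`. A minimal sketch with sqlite3 and a hypothetical `products` table (table and column names are made up for illustration):

```python
import sqlite3
import time

# In-memory database with a moderately large hypothetical table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER, name TEXT)")
conn.executemany(
    "INSERT INTO products VALUES (?, ?)",
    [(i, f"item-{i}") for i in range(50_000)],
)

def timed_lookup():
    """Fetch one row by id and return (row, seconds taken)."""
    start = time.perf_counter()
    row = conn.execute(
        "SELECT name FROM products WHERE id = ?", (49_999,)
    ).fetchone()
    return row, time.perf_counter() - start

slow_row, slow_t = timed_lookup()   # full table scan: no index on id
conn.execute("CREATE INDEX idx_products_id ON products (id)")
fast_row, fast_t = timed_lookup()   # same query, now an index lookup

print(f"scan: {slow_t:.6f}s, indexed: {fast_t:.6f}s, same result: {slow_row == fast_row}")
```

The result is identical either way; only the lookup cost changes, which is exactly the kind of silent per-page-load tax the poster describes.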
-
Thanks, some interesting stats. Our main pages have a pretty good bounce rate but the blogs are not as good. I think we may find an advantage here.
-
I think, as Barry says, you'll see more improvement in what your visitors do on your site than in what Google brings you. Don't mix up cause and effect: because your visitors stay longer, Google will see that and reward you accordingly.
The score itself doesn't say a lot; what matters is the actual load time. You'll see bounce rates improve and pages per visit go up. I manage a website where we cut the bounce rate in half: average visit time up 40%, pages per visit up 30%. Our visit rate went up by nearly 80%, which was remarkable. And guess where the visitors came from: Google.
-
Thanks for the reply. We sat at position 2 before the change and are still at position 2 after it, but we expected to stay there, so no great surprise.
Checked it in Pingdom too, as we had researched this before making the changes. Interestingly, I ran it 5 minutes ago and it reported a load time of 4.3 seconds; ran it again and it came up as 1.8 seconds. Seems a bit buggy today.
My opinion is that it won't make a significant difference on its own, but as part of an overall effort it makes a slight difference.
Just wondering if anyone has seen significant improvements and can demonstrate them.
-
Well, surely you're now in an excellent position to tell all of us?!
I don't think the issue is whether going from 51 > 92 will help, but more whether you've gone from X seconds to less than 1.5 seconds.
In Webmaster Tools, if you look at Labs > Site Performance, you should see a graph of your load time over time.
You can also use something like http://tools.pingdom.com/ to check how fast a page loads.
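If you'd rather not rely on a third-party tester, you can time a fetch yourself. A rough Python sketch follows; note it measures only server response plus transfer time, not full browser rendering, and it points at a throwaway local server purely so the example runs anywhere:

```python
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    """Stand-in for the page being tested; serves a tiny fixed response."""
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html><body>hello</body></html>")

    def log_message(self, *args):
        pass  # keep the console quiet

# Bind to port 0 so the OS picks a free port, then serve in the background.
server = HTTPServer(("127.0.0.1", 0), Handler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Time a single fetch, the way a crude speed check would.
start = time.perf_counter()
body = urllib.request.urlopen(f"http://127.0.0.1:{port}/").read()
elapsed = time.perf_counter() - start
server.shutdown()

print(f"fetched {len(body)} bytes in {elapsed:.3f}s")
```

To test a real page, swap the local URL for your own; repeated runs will vary, which is the same effect seen with the inconsistent Pingdom readings above.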
I would suggest collecting your own data (especially since you've already made the changes) and evaluating whether it's been worth it or not. You should also monitor things like conversions and bounce rate.