PageSpeed vs. Page Size
-
Hi,
We all know that Google doesn't like slow-loading pages, fair enough! However, for one of my websites, user interactivity is key to its success. Each of my pages is fairly large (in the range of 1.8 to 2.5 MB) because it has a lot of pictures, CSS, and at times some JavaScript elements.
That said, I have tried to ensure that the code is optimized: HTML minified and compressed, caching enabled, images optimized and served through a CDN, etc. In spite of the high page size, my GTmetrix PageSpeed score is 93+ for most pages.
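As a rough illustration of why the minify-and-compress step matters so much for large pages (a minimal sketch, not the poster's actual setup), repetitive HTML markup shrinks dramatically under gzip:

```python
import gzip

def transfer_sizes(html: str) -> dict:
    """Estimate raw vs. gzip-compressed transfer size for an HTML payload."""
    raw = html.encode("utf-8")
    compressed = gzip.compress(raw)
    return {
        "raw_bytes": len(raw),
        "gzip_bytes": len(compressed),
        "savings_pct": round(100 * (1 - len(compressed) / len(raw)), 1),
    }

# Repetitive markup (like most template-generated HTML) compresses very well.
page = "<div class='card'><p>item</p></div>" * 200
sizes = transfer_sizes(page)
print(sizes)
```

This is why a 2 MB page can still score well: the bytes that actually cross the wire are far fewer than the on-disk page weight suggests.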
Still, the number of requests served is 100+ and the page load time is 4.5s+ according to GTmetrix and Pingdom.
My question is: should this matter from an SEO perspective? Is Google likely to penalize me for a high load time even though I am serving highly optimized pages? I really don't want to cut down on the interactivity of my website unless I have to for SEO reasons.
Please advise. Here is my homepage, just to give you an idea of what I am talking about:
-
Thanks Cyrus and Max,
Very good answers, and I am going to work on your suggestions.
-
As Max said, from a ranking perspective, Time to First Byte seems to be the most important factor. The author of that same post offered some tips for improving time to first byte: http://moz.com/blog/improving-search-rank-by-optimizing-your-time-to-first-byte
Oftentimes you simply have a lot of assets to load and it's difficult to cut anything back. In these cases, the order in which things load becomes increasingly important for user experience (asynchronous JavaScript, for example).
Regardless, doing everything you can to improve speed and checking with Google PageSpeed Insights is usually the best advice. I've never seen a website where improving speed didn't help traffic metrics (whether rankings or engagement), so I believe it's an investment worth making.
-
What Google really cares about is TTFB (Time To First Byte); to check it, head to Google Webmaster Tools, under Crawl Stats.
To date, the general consensus is that above 1s is bad and Google could penalize you, while below 0.5s is good and Google could improve your ranking a little.
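For readers who want to spot-check TTFB themselves, here is a minimal sketch using only the Python standard library; the "good"/"acceptable"/"slow" buckets mirror the rough consensus above, not any official Google threshold:

```python
import http.client
import time

def measure_ttfb(host: str, path: str = "/", port: int = 80,
                 timeout: float = 10.0) -> float:
    """Return seconds from sending the request until the response
    status line and headers have been received."""
    conn = http.client.HTTPConnection(host, port, timeout=timeout)
    start = time.monotonic()
    conn.request("GET", path)
    conn.getresponse()  # returns once the first response bytes are read
    ttfb = time.monotonic() - start
    conn.close()
    return ttfb

def classify_ttfb(seconds: float) -> str:
    """Bucket a TTFB reading against the rough thresholds discussed above."""
    if seconds < 0.5:
        return "good"
    if seconds <= 1.0:
        return "acceptable"
    return "slow"
```

Note this measures from one client location only; tools like WebPageTest average multiple runs from real browser engines, which is more representative.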
Google suggests using WebPageTest to check a website's performance; if you run the test for your website you will see the TTFB is not that bad: http://www.webpagetest.org/result/141124_MF_14DY/
Your overall load time is 10s, though, and I agree that is too much: it presumably worsens your user experience, increases your bounce rate, and alienates some of your visitors. You should work to improve it. WebPageTest suggests compressing images and leveraging browser caching, which are good suggestions.
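The browser-caching suggestion boils down to response headers. A small sketch (a hypothetical helper, not part of WebPageTest) that flags responses a browser cannot cache or that arrive uncompressed:

```python
def cache_audit(headers: dict) -> list:
    """Flag responses that won't benefit from browser caching.

    `headers` maps lowercase header names to values, as returned
    by most HTTP client libraries.
    """
    findings = []
    cache_control = headers.get("cache-control", "")
    if not cache_control and "expires" not in headers:
        findings.append("no Cache-Control or Expires: browser cannot cache this")
    if "no-store" in cache_control:
        findings.append("no-store: response is never cached")
    if "max-age=0" in cache_control:
        findings.append("max-age=0: response expires immediately")
    if "content-encoding" not in headers:
        findings.append("no Content-Encoding: response may be uncompressed")
    return findings

# A long max-age plus gzip is what you want for static assets.
good = {"cache-control": "public, max-age=31536000", "content-encoding": "gzip"}
print(cache_audit(good))
```

Running this over every asset in the waterfall quickly shows which of the 100+ requests are being re-fetched on every visit.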
Analyze the waterfall closely to investigate further and identify other areas for intervention.
-
Hi there,
I think it would improve page load if the YouTube video were the last thing to load.
Hope it helps you.
-
You are right! That is why I don't want to compromise on usability. Thanks for your response.
-
Give it some time! It should be OK. The main question with speed is whether your users are fine with it. Think of people before SEO and you'll be fine!
-
Thanks for your response, but the images are probably as optimized as they can be. I use ImageOptim for Mac to optimize them; they are all JPEGs (stripped of all metadata) and converted, with mild lossy compression, to WebP on supported browsers.
Do you feel there might be anything else that I could do?
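The "WebP on supported browsers" behavior described above typically works by content negotiation: browsers that can decode WebP advertise `image/webp` in their Accept request header. A minimal sketch of the server-side choice (hypothetical helper and file names):

```python
def pick_image_variant(accept_header: str, base_name: str) -> str:
    """Serve the WebP variant only to browsers that advertise support
    for it in their Accept header; everyone else gets the JPEG."""
    if "image/webp" in accept_header:
        return f"{base_name}.webp"
    return f"{base_name}.jpg"

# Modern Chrome/Firefox send image/webp; older browsers do not.
print(pick_image_variant("text/html,image/webp,*/*", "hero"))
print(pick_image_variant("text/html,*/*", "hero"))
```

When negotiating like this, the response should also send `Vary: Accept` so caches and CDNs keep the two variants separate.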
-
I am sure you could work on the optimization a bit more, especially the images.
Nonetheless, if you require the same structure and are unable to reduce the size, then I would not worry so much about it. Having a fast website is only one of the hundreds of different factors that affect SEO. Just work on the other factors and you will be fine!