Why would an image that's much smaller than our Website header logo be taking so long to load?
-
When I check http://www.ccisolutions.com at Pingdom, we have a tiny graphic that is taking much longer to load than other graphics that are much bigger. Can anyone shed some light on why this might be happening and what can be done to fix it?
Thanks in advance!
Dana
-
Thanks so much Alan for this great response. While I am not as technically savvy as you and Jason, I knew that I shouldn't 100% rely on Pingdom either, so I am very familiar with the other tools you mentioned and use them routinely.
Since my hands are tied (I have no access to either the server or the source code), as I mentioned to Jason, I will be taking these suggestions to our IT Director to see how far I can get in addressing these issues.
I am on the PageSpeed warpath, and really appreciate your generous response.
I'll let you know what happens!
Dana
-
Thanks so much Jason,
This is great information. As I do not have access to the server or source code, I am going to take your response, in addition to Alan's to our IT Director and see what kind of actions we can take.
It's a bit of a relief to know that the images aren't our biggest problem.
Your comment about 304s is very timely, because last week I was scouring through server log files and noticed quite a few of them. You've pretty much answered my question about why I found so many.
These are all the pains of self-hosting with insufficient staff and know-how to set things up properly. Hopefully, we can get by with a little help from our friends.
Thanks so much!
Dana
-
All great info so far. Let me add some considerations.
CSS images: 16 - total file size: 455,806 bytes
Quite often a site's CSS references images that aren't actually displayed on some, most, or even nearly all pages - they're baked into a style sheet that's used across part or all of the site.
When this happens, Google crawls all of those images regardless of whether they're displayed. They do so because it's one of their goals to "discover all the content you have". Because of that, their crawler has no choice but to make extra calls to the server for every image referenced.
So every one of those calls to the server adds to the page load time that matters most to Google rankings. As a result, if a review of those images shows they are not needed on key pages of the site, consider having a different style sheet created for those pages that leaves them out.
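As a sketch of that idea (the file names here are hypothetical, not from your site): instead of one site-wide stylesheet that drags every image reference onto every page, key pages load a trimmed sheet plus only what they actually use:

```html
<!-- Before: every page pulls the full site-wide sheet,
     so every image URL inside it gets requested/crawled -->
<link rel="stylesheet" href="/css/site-full.css">

<!-- After: key pages load a core sheet plus a page-specific one
     containing only the rules (and image references) they need -->
<link rel="stylesheet" href="/css/site-core.css">
<link rel="stylesheet" href="/css/homepage.css">
```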
Also, while Pingdom helps to detect possible bottlenecks (I use it solely for this reason), it is NOT a valid representation of page speed as Google's system sees it. Pingdom does not process a page's content the way Google does, so even if Google Analytics reports a page speed of 15 seconds, Pingdom will routinely report a time that's a tiny fraction of that.
While not ideal, I always rely on URIValet.com and WebPageTest.org (the first-run test, not the repeat view, because the repeat view uses cached assets) to do my evaluation comparisons.
Where I DO use Pingdom is this: I enter a URL (be sure to set the test server to a U.S. server, not their European server), and once the test has run, I click over to the "Page Analysis" tab. That breaks down possible bottleneck points by file type, process type, and even domain (if you have third-party service widgets or code, that's sometimes a big issue, and this will show the likely problem sources).
For example, for your home page, that report shows 73% of even that system's own time was processing images. And it also shows six domain sources, with 94.49% of the process time coming from your own domain.
Note an interesting thing though - that report also shows 63% of the time was due to "connect" time, meaning more than half of even Pingdom's process was spent just connecting. That helps reaffirm the notion that if Google has to make many requests of your server, each request has to connect, and that can add to overall speed.
-
Hey Dana,
Smooshing images is always a best practice, but I took a peek at your homepage and your images aren't that poorly optimized. Image optimization is only going to save you 30K of the 176K in images on your homepage. (I still wouldn't discourage you from setting up automated image optimization such as smoosh.)
Your bigger performance problem is that you aren't using gzip on your CSS or JS files. Turning on gzip for your .css and .js files would save you 110K out of 236K in text files.
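To get a feel for why the savings are so big, here's a quick sketch (plain Python, nothing server-specific) that compresses a repetitive CSS-like payload the way the server's gzip filter would. The sample text is made up for illustration - real stylesheets are full of repeated selectors and property names, which is exactly why they compress so well:

```python
import gzip

# A repetitive CSS-like payload - real stylesheets compress similarly well
css = ".nav li a { color: #336699; text-decoration: none; }\n" * 200
data = css.encode("utf-8")

original_size = len(data)
compressed_size = len(gzip.compress(data))

print(f"original: {original_size} bytes")
print(f"gzipped:  {compressed_size} bytes")
print(f"savings:  {100 * (1 - compressed_size / original_size):.0f}%")
```

The browser decompresses transparently, so the only cost is a little CPU on the server - a trade that's almost always worth it for text assets.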
By far the biggest thing you could do to speed up your user experience would be to set a reasonable browser cache time for all your static assets. Your website has many assets that are used on every page the visitor sees (like all the stuff in your header, footer, and nav). The browser should download those files the first time the visitor hits any page, and then on every other page it should know it's OK to use the local copy rather than going back to the server to check for a newer version.

But because there is no browser cache time set, the browser is obligated to check with the server every time. In most cases the browser gets a 304 response when it asks for the same file again (a 304 means the asset hasn't changed since the last time you asked), so it uses the local copy anyway - but all that hand-shaking takes time you could save by setting browser cache times for all your assets.
GZip is #3 on the SEO Tips article you found, browser caching is #1, and those are the two things costing your particular homepage the most in page performance.
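For reference, both fixes usually amount to a few lines of server config. This is a hedged sketch assuming an Apache server with mod_deflate and mod_expires enabled (your IT Director will know whether that matches your setup, and the exact types and cache lifetimes should be tuned to your site):

```apache
# Gzip text assets on the fly (mod_deflate)
AddOutputFilterByType DEFLATE text/css application/javascript text/html

# Tell browsers how long they may reuse static assets (mod_expires)
ExpiresActive On
ExpiresByType image/png               "access plus 1 month"
ExpiresByType image/jpeg              "access plus 1 month"
ExpiresByType image/gif               "access plus 1 month"
ExpiresByType text/css                "access plus 1 week"
ExpiresByType application/javascript  "access plus 1 week"
```

With expiry times set, returning visitors skip the 304 round trips entirely for anything still inside its cache window.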
-Jason
-
Thanks Charles,
Your comments made me curious for more information because I am sooooo not a graphics person. You sent me in the right direction and I appreciate that. I also found this post here at SeoMoz: http://www.seomoz.org/blog/15-tips-to-speed-up-your-website
Looks like we have some smooshing to do!
Dana