Why would an image that's much smaller than our website header logo be taking so long to load?
-
When I check http://www.ccisolutions.com at Pingdom, we have a tiny graphic that is taking much longer to load than other graphics that are much bigger. Can anyone shed some light on why this might be happening and what can be done to fix it?
Thanks in advance!
Dana
-
Thanks so much Alan for this great response. While I am not as technically savvy as you and Jason, I knew that I shouldn't 100% rely on Pingdom either, so I am very familiar with the other tools you mentioned and use them routinely.
My hands are tied since I have no access to either the server or the source code, so, as I mentioned to Jason, I will be taking these suggestions to our IT Director to see how far I can get in addressing these issues.
I am on the PageSpeed warpath, and really appreciate your generous response.
I'll let you know what happens!
Dana
-
Thanks so much Jason,
This is great information. As I do not have access to the server or source code, I am going to take your response, in addition to Alan's to our IT Director and see what kind of actions we can take.
It's a bit of a relief to know that the images aren't our biggest problem.
Your comment about 304's is very timely because last week I was scouring through server log files and noticed quite a few 304's. You've pretty much answered my question on why I found so many of those.
These are all the pains of self-hosting with insufficient staff and know-how to set things up properly. Hopefully, we can get by with a little help from our friends.
Thanks so much!
Dana
-
All great info so far. Let me add some considerations.
CSS images: 16, total file size: 455,806 bytes
Quite often a site references images in CSS files that aren't even displayed on some, most or nearly all pages. They're baked into the CSS style sheet used across part or all of the site.
When this happens, Google crawls all of those images regardless of whether they're displayed. They do so because it's one of their goals to "discover all the content you have". Because of that, their crawler has no choice but to make extra calls to the server for every image referenced.
So every call to the server adds to the page load time that matters most to Google rankings. As a result, if a review of those images shows they aren't needed on key pages of the site, consider having a separate style sheet created for those pages that doesn't reference them.
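If it would help to audit which images a given stylesheet actually references, here's a rough Python sketch (the sample CSS and file paths are made up for illustration; you'd feed it the contents of your real .css files):

```python
import re

def css_image_urls(css_text):
    """Extract image URLs referenced via url(...) in a stylesheet.
    Each of these is a potential extra request the crawler must make."""
    urls = re.findall(r"url\(\s*['\"]?([^'\")]+)['\"]?\s*\)", css_text)
    # Skip data: URIs -- they're inlined and cost no extra request.
    return [u for u in urls
            if not u.startswith("data:")
            and u.lower().rsplit(".", 1)[-1] in {"png", "gif", "jpg", "jpeg", "webp"}]

css = """
.header { background: url('img/logo.png') no-repeat; }
.promo  { background-image: url("img/sale-banner.jpg"); }
.icon   { background: url(data:image/png;base64,AAAA); }
"""
print(css_image_urls(css))
```

Comparing that list against the images a given page actually displays would tell you which references could move into a slimmer, page-specific stylesheet.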
Also, while Pingdom helps to detect possible bottlenecks (I use it solely for this reason), it is NOT a valid representation of potential page speed problems as far as Google's system is concerned. The reason is that the Pingdom system does not process a page's content the way the Google system does. So even if Google Analytics reports a page speed of 15 seconds, Pingdom will routinely report a time that's a tiny fraction of that.
While not ideal, I always rely on URIValet.com and WebPageTest.org (the "1st run" test, not the "2nd run", since the second run benefits from caching) to do my evaluation comparisons.
Where I DO use Pingdom is for spotting bottlenecks: I enter a URL (be sure to set the test server to a U.S. server, not their European one), and once the test has run, I click over to the "Page Analysis" tab. That breaks down possible bottleneck points by file type, process type, and even domain (if you have 3rd party service widgets or code, that's sometimes a big issue, and this will show the possible problem sources).
For example, for your home page, that report shows 73% of even that system's own time was processing images. And it also shows six domain sources, with 94.49% of the process time coming from your own domain.
Note an interesting thing, though: that report also shows 63% of the time was due to "connect" time, meaning more than half of even Pingdom's process was sucked up just connecting. That helps reaffirm the notion that if Google has to make many requests of your server, each request has to connect, and that can add to overall speed.
-
Hey Dana,
Smooshing images is always a best practice, but I took a peek at your homepage and your images aren't that poorly optimized. In your case, image optimization is only going to save you 30K of the 176K in images on your homepage. (I still wouldn't discourage you from setting up automated image optimization such as smoosh.)
Your bigger performance problem is that you aren't using gzip on your CSS or JS files. Turning on gzip for your .css and .js files would save you 110K out of 236K in text files.
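To illustrate why gzip matters so much for text assets, here's a quick Python demo (the sample CSS is invented; real stylesheets, full of repeated selectors and property names, compress similarly well):

```python
import gzip

# A repetitive CSS-like sample -- stylesheets are full of repeated
# selectors and property names, which is exactly what gzip exploits.
css = (".product-box { margin: 0; padding: 10px; color: #333; }\n" * 200).encode("utf-8")

compressed = gzip.compress(css)
ratio = len(compressed) / len(css)
print(f"original: {len(css)} bytes, gzipped: {len(compressed)} bytes "
      f"({ratio:.0%} of original)")
```

On a real server you wouldn't do this by hand, of course; you'd enable the server's compression module (e.g. mod_deflate on Apache) so text responses are compressed automatically.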
By far the biggest thing you could do to speed up your user experience would be to set a reasonable browser cache time for all your static assets. Your website has many assets that are used on every page the visitor sees (like all the stuff in your header, footer, and nav). The browser should download those files the first time the visitor hits any page, and then, on every subsequent page, the browser should know it's OK to use the local copy rather than going back to the server to see if there is a newer version. But because there is no browser cache time set, the browser is obligated to check with the server every time. In most cases the browser will get a 304 response when it asks for the same file again (a 304 means the asset hasn't changed since the last time it was requested), so the browser uses the local copy, but all that hand-shaking takes time that you could save by setting browser cache times for all your assets.
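Here's a toy Python sketch of the two behaviors described above (the ETag values and numbers are made up for illustration, not taken from your server):

```python
def can_use_cache_without_asking(age_seconds, max_age_seconds):
    """With a Cache-Control max-age set, the browser may reuse its local
    copy with no server round trip at all while the copy is fresh."""
    return age_seconds < max_age_seconds

def conditional_get(server_etag, if_none_match):
    """With no freshness lifetime set, the browser must revalidate each
    asset: the server answers 304 (unchanged) or 200 (with a new body)."""
    if if_none_match == server_etag:
        return 304, b""          # no body, but the round trip still costs time
    return 200, b"full asset bytes"

# No max-age set: every page view triggers a conditional GET per asset.
print(can_use_cache_without_asking(60, 0))        # False
status, _ = conditional_get('"v1"', '"v1"')
print(status)                                     # 304
# One-week max-age: repeat views skip the server entirely while fresh.
print(can_use_cache_without_asking(60, 604800))   # True
```

The point is that a 304 is cheap in bytes but not in time: each one is still a full connect-and-ask round trip, multiplied across every asset on every page view.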
GZip is #3 on the SEO Tips article you found, and browser caching is #1; those are the two things costing your homepage the most in page performance.
-Jason
-
Thanks Charles,
Your comments made me curious for more information because I am sooooo not a graphics person. You sent me in the right direction and I appreciate that. I also found this post here at SeoMoz: http://www.seomoz.org/blog/15-tips-to-speed-up-your-website
Looks like we have some smooshing to do!
Dana