High resolution (retina) images vs load time
-
I have an ecommerce website and have a product slider with 3 images.
Currently, I serve them at the native size when viewed on a desktop browser (374x374).
I would like to serve them using retina image quality (748px).
However, how will this affect my ranking, given the extra load time?
Does Google take image load times into account even though the images load asynchronously? Also, as it's a slider, it's only the first image that needs to load up front. Do the other images contribute to page load time at all?
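A minimal sketch of one way this could be set up (the image paths below are placeholders, not my real ones): with srcset, only high-DPI "retina" screens download the 748px file, so standard screens keep paying only for the 374px version.

```ts
// Sketch only - paths are placeholders. Build a slider image that offers both
// the 374px (1x) and 748px (2x/retina) versions; the browser picks one.
const img = document.createElement("img");
img.src = "/images/product-374.jpg"; // fallback for browsers without srcset support
img.srcset = "/images/product-374.jpg 1x, /images/product-748.jpg 2x";
img.width = 374;
img.height = 374;
img.alt = "Product photo";
document.querySelector(".product-slider")?.append(img);
```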
-
"Large pictures tend to be bad for user experience."
I disagree. I think what you mean is that slower loading is bad for the user experience; higher-quality pictures are better for it.
I've been looking into deferring loading of the additional slider images. That should definitely improve load time as all the bandwidth can be used to download the first slider image.
Also, if you use a progressive format for the first slider image, it should show something quickly and then sharpen as the rest of the data arrives.
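A rough sketch of the deferred-loading idea (the data-src attribute and .slider selector are assumptions about the markup, not the real thing): the second and third slider images start without a real src and only get their URL once the page, including the first image, has finished loading.

```ts
// Sketch only - attribute and selector names are assumed.
// Deferred images are marked up as <img data-src="..."> with no real src.
window.addEventListener("load", () => {
  document
    .querySelectorAll<HTMLImageElement>(".slider img[data-src]")
    .forEach((img) => {
      const deferred = img.dataset.src; // the real image URL, held back until now
      if (deferred) {
        img.src = deferred;
        img.removeAttribute("data-src");
      }
    });
});
```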
-
You also have to keep in mind that users will access your site from mobile devices, and the larger the page, the longer it takes to load fully. You may lose some people in the time it takes the page to load. My website used to have a slider with three images; I removed the slider and replaced it with one static image. Large pictures tend to be bad for user experience.
-
Hey Dwayne
They are big images, but in around 15 years of experience I have never seen a meaningful impact from this kind of change. Maybe work on optimising the images themselves to bring the overall file size down as much as possible. Sure, if your site is a slow-loading nightmare and this is just the final straw, then it may be an issue, but by the sound of it you are already taking that into consideration and your site is well hosted and performs better than most of what else is out there.
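If it's useful, here's a rough sketch of that optimisation step using the Node sharp library (file names and the quality setting are placeholders): resize to the 748px retina size and save as a progressive JPEG so something displays quickly while the rest streams in.

```ts
// Sketch only - file names and quality are placeholders, not a recommendation.
import sharp from "sharp";

async function buildRetinaImage(input: string, output: string): Promise<void> {
  await sharp(input)
    .resize(748, 748)                          // retina size for a 374x374 slot
    .jpeg({ quality: 80, progressive: true })  // progressive JPEG renders incrementally
    .toFile(output);
}

buildRetinaImage("product-original.jpg", "product-748.jpg").catch(console.error);
```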
But, as ever in this game, my advice would be to be aware of the possible implications, weigh up the pros and cons, and then test extensively. If you see an impact on your loading time and search results (and, more importantly, on user interaction, bounce rate etc.) after changing this one factor, then you know you can roll it back.
Hope that helps
Marcus
-
Hi,
It's not that small a change: the size of each image will quadruple from around 10KB to 40KB. As there are three images, that's 90KB more data, which is around 20% of the total page size.
That's interesting about time to first byte. I would have thought that was overly simplistic and would have assumed Google was more concerned with how long the page takes "to load" (e.g. using their PageSpeed metrics).
I've optimized my site extensively, have a PageSpeed score of 95, and I host on Amazon AWS.
I agree with your point about doing what's right for my users. But if Google includes image load time, then my site will rank poorly and I won't have any users!
In summary, I think this question really comes down to how Google calculates page load time: does it include image load time, and does it include all of the images, even the ones not currently rendered in the slider?
Thanks,
Dwayne
-
Hey
I think this is such a small issue overall that you should not worry about a slight increase in image sizes damaging your SEO (assuming everything else is in place).
I would ask myself these questions:
- Is this better for my site users?
- Does this seriously impact load times (and therefore usability / user experience)?
If you believe it creates a better experience and does not slow loading in a meaningful way, then go for it and don't worry about a likely negligible SEO impact.
A few things I would do:
- test average loading times with a tool like pingdom: http://tools.pingdom.com/fpt/
- replace your images and test again
- look at other areas where you can speed up loading times
- make sure your hosting does not suck
For reference, there was a post here a while back on the whole loading times / SEO angle that determined it was time to first byte (server response time), rather than total loading time, that had the impact. That would make total loading time academic from a pure SEO perspective, but... it's really not about SEO; it's about your site users and whether this change makes things better (improved images) or worse (slow loading) for them.
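If you want to see the difference for yourself, a quick sketch using the browser's standard Navigation Timing API (run it in the console once the page has finished loading) - it prints time to first byte alongside the full load time, images included.

```ts
// Sketch: compare time-to-first-byte with the total page load time.
const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];
if (nav) {
  const ttfb = nav.responseStart - nav.requestStart;  // server response time only
  const fullLoad = nav.loadEventEnd - nav.startTime;  // everything, images included
  console.log(`TTFB: ${ttfb.toFixed(0)} ms, full page load: ${fullLoad.toFixed(0)} ms`);
}
```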
Seriously - don't worry about this small change too much from an SEO perspective. Use it as an excuse to improve loading time as that is a good exercise for lots of reasons but go with what is right for your users.
Hope that helps
Marcus
Refs:
http://moz.com/blog/how-website-speed-actually-impacts-search-ranking
http://moz.com/blog/improving-search-rank-by-optimizing-your-time-to-first-byte
Related Questions
-
.com vs .co.uk
Hi, we are a UK based company and we have a lot of links from .com websites. Does the fact that they are .com or .co.uk affect the quality of the links for a UK website?
Intermediate & Advanced SEO | Caffeine_Marketing
-
Location Pages On Website vs Landing pages
We have been having a terrible time in the local search results for 20+ locations. I have Places set up and all, but we decided to create location pages on our site for each location - a brief description and content optimized for our main service. The path would be something like .com/location/example. One option that has come up is to create landing pages / "mini websites" that would probably be location-example.url.com. I believe that the latter option, mini sites for each location, would be a bad idea, as those kinds of tactics were considered spammy in the past. What are your thoughts and resources, so I can convince my team of the best practice?
Intermediate & Advanced SEO | KJ-Rodgers
-
SEO Impact of High Volume Vertical and Horizontal Internal Linking
Hello Everyone - I maintain a site with over a million distinct pages of content. Each piece of content can be thought of as a node in a graph database, or an entity. While there is a bit of natural hierarchy, every single entity can be related to one or more other entities. The conceptual structure of the entities is like so:
- Agency - a top-level business unit (~100 pages/URLs)
- Office - a lower-level business unit, part of an Agency (~5,000 pages/URLs)
- Person - someone who works in one or more Offices (~80,000 pages/URLs)
- Project - a thing one or more People is managing (~750,000 pages/URLs)
- Vendor - a company that is working on one or more Projects (~250,000 pages/URLs)
- Category - a descriptive entity, defining one or more Projects (~1,000 pages/URLs)
Each of these six entities has a unique URL and content. For each page/URL, there are internal links to each of the related entity pages. For example, if a user is looking at a Project page, there will be internal links to one or more Agencies, Offices, People, Vendors, and Categories. A Project will also link to similar Projects. The same holds true for all other entities: People pages link to their related Agencies, Offices, Projects, Vendors, and so on. If you start to do the math, there are tons of internal links leading to pages with tons of internal links leading to pages with tons of internal links. While our users enjoy the ability to navigate this world according to these relationships, I am curious whether we should force a stricter hierarchy for SEO purposes. Essentially, does it make sense to "nofollow" all of the horizontal internal links for a given entity page/URL? For search engine indexing purposes, we have legit sitemaps that give a simple vertical hierarchy... but I am curious whether all of this internal linking should be hidden via nofollow? Thanks in advance!
Intermediate & Advanced SEO | jhariani
-
Dilemma about "images" folder in robots.txt
Hi, hope you're doing well. I am sure you guys are aware that Google has updated their webmaster technical guidelines, saying that users should allow access to their CSS and JavaScript files where possible. It used to be that Google rendered web pages as text only; now it claims it can read the CSS and JavaScript. According to their own terms, not allowing access to CSS files can result in sub-optimal rankings: "Disallowing crawling of Javascript or CSS files in your site's robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings." (http://googlewebmastercentral.blogspot.com/2014/10/updating-our-technical-webmaster.html)
We have allowed access to our CSS files, and Googlebot now sees our web pages more like a normal user would (tested in GWT).
Anyhow, this is my dilemma, and I'm sure a lot of other users face the same situation. Like any other e-commerce company/website, we have a lot of images. Our CSS files used to live inside our images folder, so I have allowed access to that. Here's the robots.txt: http://www.modbargains.com/robots.txt
Right now we are blocking the images folder, as it is very large, very heavy, and some of the images are very high-res. The reason we block it is that we feel Googlebot might spend almost all of its time trying to crawl that "images" folder alone and not have enough time to crawl other important pages - not to mention a very heavy server load, both on Google's side and on ours. We do have good, high-quality, original pictures, and we feel we are losing potential rankings by blocking images. I was thinking of allowing ONLY the Google image bot access to it, but I still worry that Google might spend a lot of time doing that. I was wondering whether Google makes a decision like "let me spend 10 minutes on the Google image bot and 20 minutes on the Google mobile bot", or whether it has separate crawl-time allocations for each of its bot types. I want to unblock the images folder - for now only for the Google image bot - but at the same time I fear it might drastically hamper indexing of our important pages, as mentioned before, because we have tons and tons of images and Google would spend plenty of time just crawling that folder.
Any advice, recommendations, suggestions, technical guidance, or plan of action? I'm pretty sure I answered my own question, but I need confirmation from an expert that I am right in saying: allow only Google Image access to my images folder. Sincerely, Shaleen Shah
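For what it's worth, a rough sketch of what the "image bot only" idea could look like in robots.txt (the /images/ path is assumed from the question above; the exact paths would need checking against the live file):

```
# Sketch only - paths assumed from the question.
# Keep the heavy images folder blocked for crawlers in general.
User-agent: *
Disallow: /images/

# Let Google's dedicated image crawler in, so pictures can appear in image search.
User-agent: Googlebot-Image
Allow: /images/
```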
Intermediate & Advanced SEO | Modbargains
-
Images Returning 404 Error Codes. 301 Redirects?
We're working with a site that has gone through a lot of changes over the years - ownership, complete site redesigns, different platforms, etc. - and we are finding that there are both a lot of pages and individual images that are returning 404 error codes in the Moz crawls. We're doing 301 redirects for the pages, but what would the best course of action be for the images? The images obviously don't exist on the site anymore and are therefore returning the 404 error codes. Should we do a 301 redirect to another similar image that is on the site now or redirect the images to an actual page? Or is there another solution that I'm not considering (besides doing nothing)? We'll go through the site to make sure that there aren't any pages within the site that are still linking to those images, which is probably where the 404 errors are coming from. Based on feedback below it sounds like once we do that, leaving them alone is a good option.
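For illustration, a hedged sketch of the "301 the old images" option using Express middleware (the image paths and the framework choice are assumptions, since the actual platform isn't stated): requests for retired image URLs get a permanent redirect to a current equivalent instead of a 404.

```ts
// Sketch only - paths are invented and the framework is an assumption.
import express from "express";

const app = express();

// Map retired image URLs to their closest current replacements.
const retiredImages: Record<string, string> = {
  "/img/old-product-photo.jpg": "/img/new-product-photo.jpg",
  "/img/old-banner.png": "/img/home-banner.png",
};

app.use((req, res, next) => {
  const target = retiredImages[req.path];
  if (target) {
    return res.redirect(301, target); // 301 = moved permanently
  }
  next();
});

app.listen(3000);
```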
Intermediate & Advanced SEO | garrettkite
-
Creative Commons Images Good for SEO?
I've been looking at large image packages through iStock, Getty, Fotolia and 123RF, but before spending a bunch of money, I wanted to get some of your feedback on Creative Commons images. Should we be worried about using something found via Google Images > Search Tools > Usage Rights, or can it be used without issue or legal threats from the big image companies so long as it is appropriately referenced? AND will using these types of images and linking to the sources have any effect on SEO efforts, or make the blog/website look spammy in Google's eyes because we need to link to the source? How are you using Creative Commons images, and is there anything I should be aware of in the process of searching, saving, using, referencing, etc.? Patrick
Intermediate & Advanced SEO | WhiteboardCreations
-
Number of images on Google?
Hello here, in the past I was able to find out pretty easily how many images from my website were indexed by Google and included in the Google Images search index. But as of today, it looks like Google doesn't give you any numbers; it just lists the indexed images. I use the advanced image search, defining my domain name in the "site or domain" field (http://www.google.com/advanced_image_search), and Google returns all the images coming from my website. Is there any way to know the actual number of images indexed? Any ideas are very welcome! Thank you in advance.
Intermediate & Advanced SEO | fablau
-
Google Not Indexing XML Sitemap Images
Hi Mozzers, we are having an issue with our XML sitemap images not being indexed. The site has over 39,000 pages and 17,500 images submitted in GWT. If you take a look at the attached screenshot, 'GWT Images - Not Indexed', you can see that the majority of the pages are being indexed, but none of the images are.
The first thing you should know about the images is that they are hosted on a content delivery network (CDN) rather than on the site itself. However, Google's advice suggests hosting on a CDN is fine - see the second screenshot, 'Google CDN Advice'. That advice says to either (i) ensure the hosting site is verified in GWT or (ii) submit it in robots.txt. As we can't verify the hosting site in GWT, we opted to submit via robots.txt.
There are 3 sitemap indexes: 1) http://www.greenplantswap.co.uk/sitemap_index.xml, 2) http://www.greenplantswap.co.uk/sitemap/plant_genera/listings.xml and 3) http://www.greenplantswap.co.uk/sitemap/plant_genera/plants.xml. Each sitemap index is split up into often hundreds or thousands of smaller XML sitemaps. This is necessary due to the size of the site and how we have decided to pull URLs in; done another way, some of the sitemaps could be massive and take upwards of a minute to load. To give you an idea of what is being submitted to Google in one of the sitemaps, please see view-source:http://www.greenplantswap.co.uk/sitemap/plant_genera/4/listings.xml?page=1.
Originally the images were served over SSL, so we reverted to non-SSL URLs as that was an easy change, but over a week later that seems to have had no impact. The image URLs are ugly... but should this prevent them from being indexed? The strange thing is that a very small number of images have been indexed - see http://goo.gl/P8GMn. I don't know if this is an anomaly or whether it suggests there is no issue with how the images have been set up, in which case there may be another problem.
Sorry for the long message, but I would be extremely grateful for any insight into this. I have tried to offer as much information as I can, but please do let me know if this is not enough. Thank you for taking the time to read and help. Regards, Mark
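For reference, a minimal image-sitemap entry with a CDN-hosted image might look something like this (the page path and CDN hostname are invented; the key parts are the image namespace declaration and the image:loc pointing at the CDN):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://www.greenplantswap.co.uk/plant_genera/example-plant</loc>
    <image:image>
      <image:loc>http://cdn.example.com/images/example-plant.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```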
Intermediate & Advanced SEO | edlondon