Alt tag for src='blank.gif' on lazy load images
-
I didn't find an answer searching on this, so maybe someone here has faced it before.
I am loading 20 images that are in the viewport and a bit below. The next 80 images I want to lazy-load, so until they load the bot sees them all as the same blank.gif file. However, I would like to get some credit for them by giving each a description in its alt attribute. Is that a no-no? If not, do they all have to have the same alt description, since the src name is the same? I don't want to mess things up with Google by being too aggressive, but at the same time those are valid images once they are lazy-loaded, so I would like to get some credit for them.
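A sketch of the placeholder pattern being described (the filenames below are made-up examples, not from the thread): every deferred image shares src="blank.gif" but carries its own descriptive alt, with the real file parked in a data-src attribute for the lazy loader.

```javascript
// Hypothetical helper: builds the placeholder markup in question. Each
// deferred image shares src="blank.gif" but keeps a unique alt and its
// real file in data-src for the lazy loader to swap in later.
function lazyImageTag(realSrc, altText) {
  return `<img src="blank.gif" data-src="${realSrc}" alt="${altText}" class="lazy">`;
}

// Two placeholders: same src, different alt descriptions.
console.log(lazyImageTag("products/red-widget.jpg", "Red widget, front view"));
console.log(lazyImageTag("products/blue-widget.jpg", "Blue widget, front view"));
```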
Thanks! Ted
-
Martijn, thanks for your response. I could see Google saying that, on a given page, the same source image shouldn't have more than one alt description, and either penalizing that, picking just one of them, or ignoring them altogether. It's the penalty I'm concerned with, of course. But I can also see that, if they see a blank.gif for the source plus some code related to lazy loading, they may go ahead and give credit to each alt as though a real src was loaded, and maybe even tie it to the real src image name for image search. Just looking for a bit more feedback from real-world experience first.
Has anyone else worked with this and determined if it is a pro or con?
-
Hi Ted, to be honest we do this and I don't see any big issues with it. The placeholder image will probably get some more credit than usual. But the images we lazy load are loaded via JS, and since Google says it can understand JS, they should be able to work out how we use it (too many assumptions, I know, but I have more things to worry about ;-)).
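A minimal sketch of the swap step such a JS lazy loader performs; the element here is stubbed as a plain object so the logic is visible outside a browser. In a real page you would wire `activate` to a scroll handler or IntersectionObserver.

```javascript
// The activation step a lazy loader performs: copy the real URL out of
// data-src into src so the browser fetches the image. The alt text was
// on the element all along, so it never changes.
function activate(img) {
  if (img.dataset && img.dataset.src) {
    img.src = img.dataset.src;  // swap blank.gif for the real file
    delete img.dataset.src;     // mark as loaded
  }
  return img;
}

// Plain-object stub standing in for a DOM <img> element (browser-free demo).
const img = { src: "blank.gif", dataset: { src: "photos/cliff-walk.jpg" }, alt: "Cliff walk at sunset" };
activate(img);
console.log(img.src); // now the real file, no longer blank.gif
```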
Related Questions
-
Google images
Hi, I am working on a website with a large number (millions) of images. For the last five months I have been trying to get Google Images to crawl and index these images (example page: http://bit.ly/1ePQvyd). I believe I have followed best practice in the design of the pages, naming of images, etc. While crawling and indexing of the pages is going reasonably well with the standard crawler, the image bot has only crawled about half a million images and indexed only about 40,000. Can anyone suggest what I could do to increase this number 100-fold? Richard
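One standard lever for a crawl and indexation problem like this is an image XML sitemap, which hands Googlebot-Image the image URLs directly instead of waiting for it to discover them. A minimal sketch of the format (the URLs and caption are hypothetical placeholders, not taken from the site above):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://www.example.com/gallery/tablet-pc/</loc>
    <image:image>
      <image:loc>http://www.example.com/images/tablet-pc.jpg</image:loc>
      <image:caption>Tablet PC on a desk</image:caption>
    </image:image>
  </url>
</urlset>
```

With millions of images you would split this across multiple sitemap files under a sitemap index, since each file is capped at 50,000 URLs.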
Intermediate & Advanced SEO | RichardTay
-
Disadvantages of linking to uncompressed images?
Images are compressed and resized to fit into an article, but each image in the article links to the original file - which in some cases is around 5Mb. The large versions of the images are indexed in Google. Does this decrease the website's crawl budget due to the time spent downloading the large files? Does link equity disappear through the image links? Either way I don't think it's a very good user experience if people click on the article images to see the large images - there's no reason for the images to be so large. Any other thoughts? Thanks. 🙂
Intermediate & Advanced SEO | Alex-Harford
-
How to manage images
We have been using Google+ to load our images straight onto our site; we did this to make sure our site loaded fast. Google+ delivers them to the website at the size we specify, so even if the original is, say, 4000px x 3000px, we can ask for them at 100x100 and they are sent rescaled. We don't have to manage sizes, just the original images and their tagging. If we wanted to improve our SEO opportunities, should we be doing this another way? Our images show if you look in the image SERPs, but they don't appear in the main SERPs. How much of a difference would having the images on our own domain make, rather than having them on Google+? I am working through the recommended list below, and would love to hear from people who are doing well with images and have to manage 1000's of them. There are a number of ways to optimise your images to increase your visibility within Google image search, and the chance of being featured within the main search results (as seen in the 'tablet PC' example):
- Use a short descriptive piece of text featuring desired keywords within the image alt text attribute
- Save the image using a descriptive file name
- Create an Image XML sitemap
- Ensure your images directory isn't blocked by robots.txt
- Ideally host images on the same domain
- Surround the image with related text content to build a stronger page context/association
Intermediate & Advanced SEO | PottyScotty
-
Link to image (jpg) - Do I benefit? If not how can I?
Doing some research I found a .edu page linking directly to an image on my site. I can't see how this really benefits me, so I'm wondering how to point the link juice somewhere useful, like the page on which the image resides. Can this be done? One idea that just occurred to me would be to rename the image and set up a 301 in the .htaccess. Would that work?
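The rename-plus-301 idea would look something like this in .htaccess (the filenames and target path here are hypothetical, purely to show the shape):

```apache
# Hypothetical paths: the image was renamed, so requests for the old
# image URL are 301-redirected to the page on which the image resides.
Redirect 301 /images/old-image.jpg http://www.example.com/article-about-image/
```

Note the trade-off: the .edu link would then point at an HTML page, but the old image URL stops serving an image, which breaks any hotlinks and any image-search listings for that file.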
Intermediate & Advanced SEO | Cornwall
-
Are separate title tags and descriptions required?
Are separate title tags and meta descriptions required for pagination pages? We have a common title tag and meta description across all pagination pages; should we modify them?
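Unique titles and descriptions per paginated page are cheap to template by varying the page number. A sketch of what page 2's head might contain (site name, counts, and URL pattern are hypothetical):

```html
<!-- Page 2 of a paginated series: title and description vary by page number -->
<title>Widgets - Page 2 of 7 | Example Store</title>
<meta name="description" content="Browse our widget range, page 2 of 7.">
<link rel="prev" href="http://www.example.com/widgets?page=1">
<link rel="next" href="http://www.example.com/widgets?page=3">
```

The rel="prev"/"next" tags additionally signal to Google that the pages form one paginated series.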
Intermediate & Advanced SEO | Modi
-
Help! Why did Google remove my images from their index?
I've been scratching my head over this one for a while now and I can't seem to figure it out. I own a website with user-generated content: users submit images of graphic resources (for designers) that they have created to share with our community. I've noticed over the past few months that I'm getting completely dominated in Google Images. I used to get a ton of traffic from Google Images, but now I can't find my images anywhere. After diving into Analytics I found this: http://cl.ly/140L2d14040Q1R0W161e and realized that sometime about a year ago my image traffic took a dive. We've gone back through all the change logs and can't find where we made any changes to the site structure that could have caused this. We are stumped. Does anyone know of any historical Google updates that could have caused this, around the end of April 2010? Any help or insight would be greatly appreciated!
Intermediate & Advanced SEO | shawn81
-
Deferred javascript loading
Hi! This follows on from my last question. I'm trying to improve the page load speed for http://www.gear-zone.co.uk/. Currently, Google rates the page speed of the GZ site at 91/100, with the javascript being the only place where points are being deducted. The only problem is, the JS relates to the Trustpilot widget and the social links at the bottom of the page, neither of which works when deferred. Normally, we would add the defer attribute to the script tags, but by doing so the browser waits until the page is fully loaded before executing the scripts. As both pieces of JS I mentioned (reviews and buttons) use the document.write command, adding defer would write the code off the page and out of placement from where it should be. Anyone have any ideas?
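One common way around the document.write problem is to have the deferred script target a placeholder element that already exists in the markup, so placement no longer depends on when the script runs. A sketch of that swap, with the DOM stubbed as a plain object so the logic runs outside a browser (the element id and widget markup are hypothetical, not Trustpilot's actual API):

```javascript
// Instead of document.write (which misplaces content under defer), the
// deferred script injects its markup into a placeholder already in the page:
//   <div id="trustpilot-widget"></div>
function renderWidget(doc, html) {
  const target = doc.getElementById("trustpilot-widget");
  if (target) target.innerHTML = html; // lands in place, whenever the script runs
  return target;
}

// Plain-object stub standing in for the browser document.
const fakeDiv = { id: "trustpilot-widget", innerHTML: "" };
const fakeDoc = { getElementById: (id) => (id === fakeDiv.id ? fakeDiv : null) };
renderWidget(fakeDoc, "<span>4.8 stars</span>");
console.log(fakeDiv.innerHTML);
```

Third-party scripts you don't control are the catch: if the vendor's code itself calls document.write, you need an async-friendly embed from the vendor rather than this wrapper.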
Intermediate & Advanced SEO | neooptic
-
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place on our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc.). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back in the index, by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOmoz reading this morning, I came to wonder whether that approach may now be harming us...
http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions
Specifically, I'm concerned that a) we're blocking the flow of link juice, and b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, etc., but we have yet to find 'the fix'... Thoughts? Kurus
Intermediate & Advanced SEO | kurus
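For reference, the restriction described above looks roughly like this in robots.txt (paths are hypothetical, not from the site in question):

```
# Sketch of the current approach: blocking pagination and
# sort-order variants of the search results from being crawled
User-agent: *
Disallow: /search/*?page=
Disallow: /search/*?sort=
```

The often-discussed alternative for concern (a) is a meta robots noindex,follow tag on the variant pages themselves: the pages stay out of the index, but crawlers can still pass through them and follow their links, which a robots.txt Disallow prevents entirely.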