Google pagespeed / lazy image load
-
Hi,
we are using the Apache module of Google PageSpeed. It works really well and helps a lot. But today a question came to my mind:
Does the "lazy load" feature for images harm the ranking?
The module reworks the page so that images are loaded only when they become visible on the screen. Is this behavior also triggered for the Googlebot, or are the images invisible to Google?
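For anyone wondering what the rewrite conceptually does: the filter moves the real URL out of `src` so that nothing is downloaded up front, roughly like this Python sketch (the attribute and placeholder names here are illustrative, not the module's actual output):

```python
import re

def lazify(html):
    """Sketch of a lazy-load rewrite: move the real image URL out of
    src into a data attribute and serve a tiny placeholder instead.
    A client-side script restores src once the image scrolls into view."""
    return re.sub(r'<img\s+src="([^"]+)"',
                  r'<img src="/placeholder.gif" data-lazy-src="\1"',
                  html)

print(lazify('<img src="/photos/venue.jpg" alt="venue">'))
```

A robot that does not execute JavaScript only ever sees `/placeholder.gif`, which is exactly why the question matters.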
Does anyone have experience with that?
Best wishes,
Georg.
-
This does a pretty good job of explaining lazy load:
http://www.thesempost.com/lazy-loading-images-likely-will-indexed-google/
-
Hey, that was a fast response, I usually don't get that from Google, lol. Anyway, post an update, OK? I'd like to know the answer as well.
-
Yesterday I wrote a support mail to Bing Webmaster Tools. Surprisingly, I got a very comprehensive answer within hours! Thumbs up!
The answer: "Yes, you are right. Since this lazy load feature is a 3rd party application, as initial troubleshooting steps and to isolate the issue, please try to turn off this feature on your end."
Well, I'll try turning off lazy load for that specific page and see what happens.
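In case it helps anyone else: mod_pagespeed lets you switch off a single filter per path in the Apache config, so the rest of the optimizations stay active (the Location path below is just an example):

```apache
# Turn off only the lazy-load filter for one page; all other
# mod_pagespeed filters keep running.
<Location "/article/freitag-der-13">
    ModPagespeedDisableFilters lazyload_images
</Location>
```

For a quick one-off test, appending `?ModPagespeedFilters=-lazyload_images` to the URL should produce the same effect without touching the config.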
Best wishes,
Georg.
-
I think I already answered this question:
"What I know is that anything generated by JavaScript is unreadable by any search engine robot."
So that's probably the reason why it's not found in image search. Anyway, I'll wait for other answers too.
-
Hi,
a test of Google versus Bing:
I am searching for
site:schicksal.com Freitag, der 13.
Bing, organic: http://goo.gl/bfXAU0 - article found in 1st position
Bing, image search: http://goo.gl/EXDSdv - no search results
Google, organic: http://goo.gl/VIi5C6 - article found in 1st position
Google, image search: http://goo.gl/m5SRjA - main article image found in 1st position
I've done some other quick checks with Bing: the big images are NOT found in image search, only the teaser images from the overview pages.
So, can anybody confirm this behavior? Does Bing have a problem with the lazy load of Google PageSpeed?
Best wishes,
Georg.
-
I'm curious too. What I know is that anything generated by JavaScript is unreadable by any search engine robot; they just don't know that language, it's client-side. But the thing with lazy load is that the content is there, only the image is not loaded until it's shown on screen (I mean the tags wrapping the image). If the Webmaster Tools "Fetch as Googlebot" feature can fetch it, then you don't have to worry about anything. But still, I want to hear other opinions too.
-
Just tried the Google Webmaster Tools "Fetch as Googlebot" feature - the lazy-loaded images were shown in the screenshot.
But the question remains: is it possible that the Googlebot does not see the images for ranking purposes because they are loaded with JavaScript?
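A common safeguard, though as far as I can tell not something the module adds for you automatically, is to ship a `<noscript>` copy of each lazy image, so a robot that skips JavaScript still finds a plain `<img>` tag with the real URL. A minimal hand-rolled sketch (attribute names illustrative, not tied to mod_pagespeed):

```python
def lazy_img_with_fallback(url, alt=""):
    """Emit a lazy-load placeholder plus a <noscript> copy of the
    plain <img> tag, so non-JavaScript crawlers still find the real URL."""
    lazy = f'<img src="/placeholder.gif" data-lazy-src="{url}" alt="{alt}">'
    fallback = f'<noscript><img src="{url}" alt="{alt}"></noscript>'
    return lazy + fallback

print(lazy_img_with_fallback("/photos/freitag-der-13.jpg", "Freitag, der 13."))
```

The crawler-visible markup then always contains the real image URL, whether or not the loader script runs.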