Best practices for lazy loading (content)
-
Hi all,
We are working on a new website and want to know Google's best practices for lazy loading content.
The best example I've found is bloomberg.com; look at their homepage. Thank y'all!
-
Hi John! In order to get you an answer that directly relates to what you're trying to do, would you be able to give us more information about your goals with this? As in, what sorts of pages, specifically, you're intending to implement lazy loading on? And as Sergey asked, is it for the site to load faster for users? For overall user experience?
-
Hey Sergey,
I'm looking for a solution for lazy loading, not for pagination within lazy loading... thanks anyway.
-
Hi John,
First of all - the Google Webmaster Blog has written about infinite content (although not specifically lazy loading) here. Might be worth checking out.
Second, my question to you would be what is your goal with implementing lazy loading on your site? Is it for the site to load faster for users? For overall user experience?
Here is a Reddit thread discussing this situation; I think /Twoary explains it well. Here's a quote:
"As far as I have experimented with it, it seems like they can indeed not find scroll-based lazy loading (in webmaster tools). Another possibility is onload lazyloading (first load all the content above the fold, then load the content below the fold after the onload event has fired), I have to experiment more with that.
Right now I avoid lazy loading for SEO for articles and such. The fact is that google only cares about "time to first byte". Maybe soonish they will care about "time until above the fold page is loaded". But they do not penalize for the time it takes for all of the resources to be loaded. Apart from that, google mostly cares about user experience which they measure by actual dwell time of their users.
As for the user experience, lazy loading images doesn't add that much benefit either. The browser downloads images near the top of your page first, so the above the fold content isn't downloaded any faster with lazy load. (Possibly even slower because the browser won't be able to start prefetching lazy loaded images until javascript executes.)
The only benefit I see right now is for reducing bandwidth usage (for your site and for mobile users). However the disadvantage will be that your images probably won't rank as well (even if you use pagination/a sitemap.)
OTOH, lazy loading other heavy content such as videos, iframes and ads may be much more beneficial because those actively make the page more sluggish."
-
Yes, I'm using WordPress.
-
Are you using a CMS? There are some great plugins for various platforms.
Related Questions
-
Duplicate Content
I am trying to get a handle on how to fix and control a large amount of duplicate content that keeps showing up in my Moz reports. The main areas where this comes up are duplicate page content and duplicate title tags ... thousands of them. I partially understand the source of the problem. My site mixes free content with content that requires a login. I think if I were to change my crawl settings to eliminate the login and index the paid content, it would lower the quantity of duplicate pages and help me identify the true duplicates, because a large number of them occur at the site login. Unfortunately, it's not simple in my case, because last year I encountered a problem when migrating my archives into a new CMS. The app in the CMS that migrated the data caused a large amount of data truncation, which means that I am piecing together my archives of approximately 5,000 articles. It also means that much of the piecing-together process requires me to keep the former app that manages the articles, to find where certain articles were truncated, copy the text that followed the truncation, and complete the articles. So far, I have restored about half of the archives, which is time-consuming, tedious work. My question is: does anyone know a more efficient way of identifying and editing duplicate pages and title tags?
Technical SEO | | Prop650 -
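On the tooling side of that question: if the crawl can be exported as a list of URL/title pairs, grouping by title surfaces the true duplicates quickly. A hypothetical sketch; the `findDuplicateTitles` name and the input shape are assumptions, not part of any Moz export format:

```javascript
// Group crawled pages by title tag to surface duplicate titles.
// Input: an array of { url, title } records from a crawl export.
function findDuplicateTitles(pages) {
  const byTitle = new Map();
  for (const { url, title } of pages) {
    if (!byTitle.has(title)) byTitle.set(title, []);
    byTitle.get(title).push(url);
  }
  // Keep only titles shared by more than one URL.
  return [...byTitle.entries()].filter(([, urls]) => urls.length > 1);
}
```

Running this over a few thousand rows takes milliseconds, so it scales to the report sizes described above.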
Video & Graph That Lazy Loads
Hi, Product pages on our site have a couple of elements that are lazy loaded / loaded after user action. Apart from images, which is a widely discussed topic in lazy loading, in our case videos & price graphs are lazy loaded. For videos we do something that Amit Agarwal recommended here: http://labnol.org/internet/light-youtube-embeds/27941/ - we load a thumbnail with a play button over it. When a user clicks that play button, the video embed from YouTube loads. However, we are not sure if Google gets that, and since the whole thing is under an H3 tag, will we a) lose out on the benefit of putting a relevant video there, or b) send any negative signals for only loading an image thumbnail under an H3 tag? We also have a price graph that lazy loads and is not seen in a cached version of our page on Google. Are we losing credit (in Google's eyes) for that content on our page? Sample page with both the price history graph & video: http://pricebaba.com/mobile/apple-iphone-6s-16gb Appreciate your help! Thanks
Technical SEO | | Maratha0 -
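For reference, the light-embed technique described in that question (thumbnail plus deferred iframe) can be sketched like this. The `data-youtube-id` container markup is an assumption for illustration, not labnol's exact code:

```javascript
// Light YouTube embed: show only a thumbnail until the user clicks,
// then swap in the real iframe. The heavy YouTube player JS is never
// fetched for users who don't press play.
function youtubeThumbnailUrl(videoId) {
  return 'https://i.ytimg.com/vi/' + videoId + '/hqdefault.jpg';
}

function youtubeEmbedUrl(videoId) {
  return 'https://www.youtube.com/embed/' + videoId + '?autoplay=1';
}

if (typeof document !== 'undefined') {
  document.querySelectorAll('div[data-youtube-id]').forEach(container => {
    const id = container.dataset.youtubeId;
    const img = document.createElement('img');
    img.src = youtubeThumbnailUrl(id);
    container.appendChild(img);
    container.addEventListener('click', () => {
      const iframe = document.createElement('iframe');
      iframe.src = youtubeEmbedUrl(id);
      iframe.allow = 'autoplay';
      container.replaceChildren(iframe); // thumbnail replaced by live embed
    }, { once: true }); // the swap only needs to happen once
  });
}
```

Since the iframe only exists after a click, crawlers that don't interact with the page will see the thumbnail and surrounding markup, not the video itself, which is the crux of the question above.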
SEO for a static content website
Hi everyone, We would like to ask for suggestions on how to improve SEO for our static-content help website. With the release of each new version, our company releases a new "help" page, which is created by an authoring system. This is the latest page: http://kilgray.com/memoq/2015/help-en/ I have a couple of questions: 1- The page has an index with many links that open subpages with content for users. It is impossible to add title tags to these subpages, as everything is held together by the mother page, so it is really hard for users to find the subpage information when they are doing a Google search. 2- We have previous "help" pages, which usually rank better in Google search. They have the same structure (one page with a big index and many subpages) and no metadata. We obviously want the latest version to rank better; however, we are afraid to exclude the old ones from search bots because the new version is not easy to find. These are some of the previous pages: http://kilgray.com/memoq/2014R2/help-en/ http://kilgray.com/memoq/62/help-en/ I would really appreciate suggestions! Thanks
Technical SEO | | Kilgray0 -
Duplicate Content Reports
Hi, Duplicate content reports for a new client are showing very high numbers (8,000+). Most of them seem to be for sign-in, register, & login type pages; is this a scenario where the best course of action is likely to be the parameter handling tool in GWT? Cheers Dan
Technical SEO | | Dan-Lawrence0 -
Duplicate page content
Hello, The Pro dashboard crawler bot reports mydomain.com and mydomain.com/index.htm as duplicate pages. Is this a problem? If so, how do I fix it? Thanks Ian
Technical SEO | | jwdl0 -
Best TLD for china
In China there are two commonly used TLDs: .cn and .com.cn. We own both versions of a new domain. Does anyone know of research into which TLD is best "in the eyes" of the search engines Baidu and Google? Or is there a methodology for selecting the best one? Thanks!
Technical SEO | | Paul-G0 -
Best XML Sitemap generator
Do you have any suggestions for a good XML sitemap generator? Hopefully free, but if it's good I'd consider paying. I am using a Mac, so I would prefer an online or Mac version.
Technical SEO | | kevin48030 -
Duplicate content conundrum
Hey Mozzers- I have a tricky situation with one of my clients. They're a reputable organization and have been mentioned in several major news articles. They want to create a Press page on their site with links to each article, but they want viewers to remain within the site and not be redirected to the press sites themselves. The other issue is that some of the articles have been removed from the original press sites where they were first posted. I want to avoid duplicate content issues, but I don't see how to repost the articles within the client's site. I figure I have 3 options: 1. Create PDFs (w/ SEO-friendly URLs) with the articles embedded in them that open in a new window. 2. Post an image with a screenshot of the article on a unique URL w/ brief content. 3. Copy and paste the article to a unique URL. If anyone has experience with this issue or any suggestions, I would greatly appreciate it. Jaime Brown
Technical SEO | | JamesBSEO0