Lazy Loading of products on an E-Commerce Website - Options Needed
-
Hi Moz Fans.
We are in the process of re-designing our product pages and we need to improve the page load speed.
Our developers have suggested that we load the associated products on the page using lazy loading. While I understand this will certainly have a positive impact on the page load speed, I am concerned about the SEO impact.
We can have upwards of 50 associated products on a page, so we need a solution that scales.
So far I have found the following solution online, which uses lazy loading and escaped fragments. The concern here is about serving an alternate version of the page to search engines.
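For illustration, the kind of deferred loading the developers are proposing might be sketched like this. This is my own hypothetical sketch, not our actual implementation: the batching helper is pure logic, and the product names and batch size are made up.

```javascript
// Hypothetical sketch: render 50 associated products in small batches
// instead of all at once, loading more as the user scrolls.

// Pure helper: given the full product list and how many are already
// shown, return the next batch to render.
function nextBatch(products, shownCount, batchSize) {
  return products.slice(shownCount, shownCount + batchSize);
}

// In the browser this would be driven by a scroll or intersection
// event, e.g. (browser-only, not run here):
//   var observer = new IntersectionObserver(loadMoreProducts);
//   observer.observe(document.querySelector('#load-more-sentinel'));

var products = [];
for (var i = 1; i <= 50; i++) products.push('product-' + i);

console.log(nextBatch(products, 0, 10).length);  // 10 (first batch)
console.log(nextBatch(products, 45, 10).length); // 5 (only 5 remain)
```

The SEO question in this thread is precisely about the content that never renders until the user scrolls.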
The solution was developed by Google not only for lazy loading, but for indexing AJAX content in general.
Here's the official page: Making AJAX Applications Crawlable. The documentation is simple and clear, but in a few words the solution is to use slightly modified URL fragments.
A fragment is the last part of the URL, prefixed by #. Fragments are not propagated to the server; they are used only on the client side to tell the browser to show something, usually to move to an in-page bookmark.
If instead of using # as the prefix you use #!, this instructs Google to ask the server for a special version of your page using an "ugly" URL. When the server receives this ugly request, it is your responsibility to send back a static version of the page that renders an HTML snapshot (the image that is not indexed, in our case). It seems complicated, but it is not; let's use our gallery as an example.
- Every gallery thumbnail has to have a hyperlink like:
http://www.idea-r.it/...#!blogimage=<image-number>
- When the crawler finds this markup, it will change it to:
http://www.idea-r.it/...?_escaped_fragment_=blogimage=<image-number>
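To make the mapping concrete, here is a small sketch (mine, not from the Google documentation) of the rewrite the crawler performs when it sees a #! URL. It is simplified: the real scheme also percent-encodes %, #, & and + inside the fragment.

```javascript
// Sketch of the crawler-side rewrite described above:
//   http://example.com/page#!blogimage=5
// becomes
//   http://example.com/page?_escaped_fragment_=blogimage=5
function toEscapedFragmentUrl(url) {
  var idx = url.indexOf('#!');
  if (idx === -1) return url; // no hashbang: leave the URL untouched
  var base = url.substring(0, idx);
  var fragment = url.substring(idx + 2);
  // Append to an existing query string if one is present.
  var sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + fragment;
}

console.log(toEscapedFragmentUrl('http://example.com/page#!blogimage=5'));
// http://example.com/page?_escaped_fragment_=blogimage=5
```

The server never sees the #! URL; it only ever receives the ?_escaped_fragment_= form, which is why the server-side handler below keys off that query parameter.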
Let's take a look at what you have to answer on the server side to provide a valid HTML snapshot. My implementation uses ASP.NET, but any server technology will do:

var fragment = Request.QueryString["_escaped_fragment_"];
if (!String.IsNullOrEmpty(fragment))
{
    var escapedParams = fragment.Split(new[] { '=' });
    if (escapedParams.Length == 2)
    {
        var imageToDisplay = escapedParams[1];
        // Render the page with the gallery showing
        // the requested image (statically!)
        ...
    }
}
What's rendered is an HTML snapshot, that is, a static version of the gallery already positioned on the requested image (server side).
To make it perfect we have to give the user a chance to bookmark the current gallery image.
90% comes for free; we only have to parse the fragment on the client side and show the requested image:

if (window.location.hash)
{
    // NOTE: remove the initial #
    var fragmentParams = window.location.hash.substring(1).split('=');
    var imageToDisplay = fragmentParams[1];
    // Render the page with the gallery showing the requested image (dynamically!)
    ...
}
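The complementary piece, so the URL always reflects the image the user is currently viewing, is writing the fragment back when the user navigates the gallery. A minimal sketch (the blogimage parameter name matches the example above; the rest is assumed):

```javascript
// Build the hashbang fragment for the currently displayed image.
function buildImageFragment(imageNumber) {
  return '!blogimage=' + imageNumber;
}

// In the browser you would assign it on navigation (not run here):
//   window.location.hash = buildImageFragment(5);
// giving a bookmarkable URL like http://www.idea-r.it/...#!blogimage=5

console.log(buildImageFragment(5)); // !blogimage=5
```

Setting location.hash does not reload the page, so the gallery state and the bookmarkable URL stay in sync.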
The other option would be to use a recommendation engine to show a small selection of related products instead, cutting down the total number of related products. The concern with this option is that we would be removing a massive chunk of content from the existing pages; some of it is not the most relevant, but it is still content.
Any advice and discussion welcome
-
Ok, cool. To reiterate: with escaped_fragment you are just serving the same content in a tweaked format, and Google recommends it rather than frowns upon it. Good to be sure though.
See you at SearchLove!
-
Hi Tom, thank you for the response.
The concern about serving an alternate version is that it would be frowned upon from an SEO perspective and may lead to a form of penalty.
I agree that escaped_fragment would be the best approach and just wanted to satisfy my own concerns before I get them working on this.
Thank you and see you at Search Love
-
Hi,
I am not sure I follow your concerns around serving an alternative version of the page to search engines - is that because you worry it will be frowned upon, or because of technical concerns?
Using the escaped_fragment methodology would work for your purposes, and would be the best approach. If you have technical concerns around creating the HTML snapshots you could look at a service such as https://prerender.io/ which helps manage this process.
If that doesn't answer your question, please give more information so we can understand more specifically where your concerns are.