Does Lazy Loading Create Indexing Issues for Products?
-
I have a store with 5,000+ products in one category and I'm using lazy loading. Does this affect the indexing of these 5,000 products? Google says they index or read a maximum of 1,000 links on one page.
-
Hello Vinay,
Please see Mashable for an example:
http://mashable.com/2013/4/
They have pagination links at the bottom of the page and use lazy loading / infinite scroll.
Adam Sherk has a good post about this:
http://www.adamsherk.com/seo/seo-tips-for-infinite-scrolling/
-
Everett, I got your point: you mean AJAX for users and pagination for spiders. Can you show me one example? That would help me a lot.
Thanks,
Vinay
-
View the source of the cached page and look toward the bottom. Do all of your products/listings show in the source? If so, you're all good. If not, you may want to add pagination for the spiders, as mentioned above.
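To make that "view the source" check repeatable, one option is a small script that counts product links in the raw HTML, i.e. what a crawler sees before any JavaScript runs. This is a minimal sketch only: the `/product/` URL prefix and the inline sample markup are hypothetical stand-ins, so adjust the prefix to match your own store's link structure.

```python
from html.parser import HTMLParser

class ProductLinkCounter(HTMLParser):
    """Counts <a> tags whose href looks like a product URL."""
    def __init__(self, href_prefix):
        super().__init__()
        self.href_prefix = href_prefix
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if href.startswith(self.href_prefix):
                self.count += 1

def count_product_links(html, href_prefix="/product/"):
    parser = ProductLinkCounter(href_prefix)
    parser.feed(html)
    return parser.count

# In practice you would fetch the page (or Google's cached copy of it)
# and pass its raw HTML here; this inline sample stands in for that.
sample = """
<div class="grid">
  <a href="/product/widget-1">Widget 1</a>
  <a href="/product/widget-2">Widget 2</a>
  <a href="/about">About us</a>
</div>
"""
print(count_product_links(sample))  # prints 2
```

If the count in the raw HTML is far lower than the number of products the page shows in a browser, those extra listings are only reachable through JavaScript.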
-
Thanks, Everett Sizemore.
I just checked the cache as you suggested, and I found that lazy loading is also working on the cached page. Does that mean everything is OK?
-
Great suggestion, Everett: "If you're curious how much of the page they are rendering, just look at the cached version of the page from the search results."
-
Hello,
Where did you read that Google only indexes or reads a maximum of 1,000 links on a page? I think this is outdated information. Even so, it is best practice not to have that many links on a page, even if Google does crawl more than 1,000 per page.
So to answer your question: yes, if you're loading additional product listings via JavaScript after the user scrolls down, it could cause Google to render only part of the page. However, this often does not cause "indexation issues" for your product pages, because they have other paths into them: sub-categories, related product links, external links, sitemaps, etc.
If you're curious how much of the page they are rendering, just look at the cached version of the page from the search results. That should answer your question directly from Google.
I usually recommend adding pagination links to the page so that Google, and users without JavaScript enabled, have a path to the rest of the product listings. If you like, you can set those paginated category pages to noindex,follow so they do not get indexed, but Google can still crawl them to find deeper products.
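As a sketch of that noindex,follow pattern: the paginated category pages get plain `<a>` links a crawler can follow, and every page after the first carries a robots meta tag. The `?page=N` URL scheme and the category path are assumptions for illustration, not something from any specific platform.

```python
import math

def pagination_tags(category_url, total_products, page_size, current_page):
    """Return (robots_meta, links_html) for one paginated category page.

    Pages after the first are noindex,follow: kept out of the index,
    but still crawlable so spiders can reach deeper products.
    """
    total_pages = math.ceil(total_products / page_size)
    robots_meta = ""
    if current_page > 1:
        robots_meta = '<meta name="robots" content="noindex,follow">'
    links = []
    for page in range(1, total_pages + 1):
        if page == current_page:
            links.append(f"<span>{page}</span>")  # current page, no self-link
        else:
            links.append(f'<a href="{category_url}?page={page}">{page}</a>')
    return robots_meta, "\n".join(links)

# 5,000 products at 100 per page -> 50 crawlable category pages.
meta, links = pagination_tags("/category/widgets", 5000, 100, current_page=2)
print(meta)
print(links.splitlines()[0])  # first pagination link
```

The point of the sketch is only the structure: real anchor links in the HTML (not buttons wired to JavaScript), with the robots meta controlling which paginated pages end up in the index.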
-
I have never heard of any negative SEO effects from using lazy loading. Our shop has nearly 100,000 products and we use lazy loading as well. We recently raised the number of indexed pages from 5.5 million to 6.7 million. This is just one example, but to answer your question from my personal point of view: no.