Lazy Loading of Blog Posts and Crawl Depths
-
Hi Moz Fans,
We are looking at our blog and improving the content as much as we can for SEO purposes, but we have hit a bit of a blank when it comes to the implications of lazy loading and issues with crawl depth.
We initially introduced lazy loading on the blog home page to improve site speed, and it works well with infinite scroll, but we were wondering whether this would cause any SEO issues.
A lot of the resources online seem to conflict and some are very outdated, so some clarification on what is best in terms of lazy loading and crawl depth for blogs would be fantastic!
I hope someone can help and give us some up-to-date insights. If you need any more information, I'll reply ASAP.
-
This is fantastic - Thank you!
-
Lazy loading and infinite scroll are absolutely not the same thing as far as search crawlers are concerned.
Lazy-loaded content, if it exists in the DOM of the page, will be indexed, but its importance will likely be reduced (any content that requires user interaction to see carries less ranking value).
But because infinite scroll is unmanageable for the crawler (it's not going to stay on one page and keep crawling for hours as every blog post rolls into view), Google's John Mueller has said the crawler will simply stop at the bottom of the initial page load.
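To make the difference concrete, here's a rough sketch of the two patterns. The selectors, attributes and endpoint are assumptions for illustration, not your actual setup:

```typescript
// Sketch only: element selectors, attributes and the endpoint are assumed for illustration.

// Pattern 1: lazy loading. The post markup is already in the server-rendered HTML;
// only the images are deferred, so a crawler parsing the DOM still sees the content.
const lazyImages = document.querySelectorAll<HTMLImageElement>("img[data-src]");
const imageObserver = new IntersectionObserver((entries, observer) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const img = entry.target as HTMLImageElement;
    img.src = img.dataset.src ?? ""; // swap in the real image only when it scrolls into view
    observer.unobserve(img);
  }
});
lazyImages.forEach((img) => imageObserver.observe(img));

// Pattern 2: infinite scroll. Posts beyond page 1 don't exist in the HTML at all until
// a scroll event fetches them, so a crawler that never scrolls only sees the initial load.
let nextPage = 2;
const feed = document.querySelector("#blog-feed")!;      // assumed container
const sentinel = document.querySelector("#load-more")!;  // assumed sentinel element
new IntersectionObserver(async (entries) => {
  if (!entries.some((e) => e.isIntersecting)) return;
  const response = await fetch(`/blog/posts?page=${nextPage++}`); // assumed endpoint
  feed.insertAdjacentHTML("beforeend", await response.text());
}).observe(sentinel);
```

In the first pattern the post text is crawlable on the initial page load; in the second, anything beyond page one depends on the crawler triggering the fetch, which it won't keep doing indefinitely.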
This webinar/discussion on crawl and rendering from just last week included Google's John Mueller and a Google engineer, and it will give you exactly the info you're looking for, right from the horse's mouth, Victoria.
Something to consider, though: the blog's index page shouldn't be the primary source for the blog's content anyway. The individual permalinked post URLs are what should be crawled and ranking for the individual post content, and the XML sitemap should be the primary source for Google's discovery of those URLs. Obviously, linking from authoritative pages will help the posts, but those links are going to change every time the blog index page updates anyway. Also, did you know that you can submit the blog's RSS feed as a sitemap in addition to the XML sitemap? It's the fastest way I've found of getting new blog posts crawled/indexed.
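If you want a quick way to sanity-check that every permalinked post is actually listed in the sitemap, a small script along these lines would do it. The URLs are placeholders, and it assumes a simple single-file sitemap rather than a sitemap index:

```typescript
// Rough sketch, assuming Node 18+ (built-in fetch) and a simple single-file XML sitemap.
// The URLs are placeholders; in practice you'd pull the post list from your CMS or feed.
const SITEMAP_URL = "https://www.example.com/sitemap.xml";
const POST_URLS = [
  "https://www.example.com/blog/some-post/",
  "https://www.example.com/blog/another-post/",
];

async function checkSitemapCoverage(): Promise<void> {
  const xml = await (await fetch(SITEMAP_URL)).text();
  // Naive <loc> extraction; a sitemap index would need one more level of fetching.
  const listed = new Set(
    [...xml.matchAll(/<loc>(.*?)<\/loc>/g)].map((match) => match[1].trim())
  );
  for (const url of POST_URLS) {
    console.log(listed.has(url) ? `listed   ${url}` : `MISSING  ${url}`);
  }
}

checkSitemapCoverage().catch(console.error);
```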
Hope that helps!
Paul
-
I'm afraid I don't have insight into how Google crawls lazy-loaded content.
Which works better for your users, pagination or lazy loading? I wouldn't worry about lazy loading and Google. If you're worried about getting pages indexed, then I would make sure you've got a sitemap that works correctly.
-
Great, thank you
Do you have any insight into crawl depth too?
At what point would Google stop crawling the page with lazy loading? Is it best to use pagination as opposed to infinite scroll?
-
With lazy loading, the content can actually still be seen in the page source. That's what Google uses, so you should be fine using it, as it's becoming common practice now.
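If you want to verify that yourself, a quick check is to fetch the raw HTML (no JavaScript executed, roughly what a crawler's first pass sees) and confirm a post's title is already in it. A minimal sketch, with a placeholder URL and snippet:

```typescript
// Minimal check: does the post content exist in the server-rendered source,
// before any JavaScript (lazy loading / infinite scroll) runs?
// The URL and snippet are placeholders; use a real post title from your own blog.
const PAGE_URL = "https://www.example.com/blog/";
const EXPECTED_SNIPPET = "Title of a recent blog post";

async function isInRawSource(): Promise<boolean> {
  const html = await (await fetch(PAGE_URL)).text(); // raw HTML only, nothing rendered
  return html.includes(EXPECTED_SNIPPET);
}

isInRawSource().then((found) =>
  console.log(found ? "Found in the initial HTML, crawlable without JS" : "Only appears after JS runs")
);
```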
-
Yes, it's similar to the BBC page: content loads when it is needed by the user, so to speak.
It improved the site loading speed, but do you know at what point Google would stop indexing the content on our site?
How do we ensure that the posts are being crawled, and is pagination the best way to go?
-
I'd have to say I'm not too familiar with the method you are using, but I take it the idea is that elements of the page load as you scroll, like the BBC site?
If it decreases the load time of the site, that is good for both direct and indirect SEO. But the key thing is: can Google see the contents of the page or not? Use Google Search Console and fetch the page to see if it contains the content.
Also, Google will not hang around on your site; if it doesn't serve the content within a reasonable amount of time, the crawler will bounce off to the next page, or the next site to crawl. It's harsh, but it's a fact.
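If you want to approximate that fetch-and-render check outside of Search Console, comparing the raw HTML with the rendered page in a headless browser gives a decent picture. A rough sketch using Puppeteer; the URL and the assumption that posts are marked up as article elements are placeholders:

```typescript
// Sketch: compare how many posts exist in the raw HTML versus after the page has rendered.
// Assumes Puppeteer is installed and that posts are marked up as <article> elements;
// adjust the URL and selector to match the actual blog.
import puppeteer from "puppeteer";

const PAGE_URL = "https://www.example.com/blog/";

async function compareRawVsRendered(): Promise<void> {
  const rawHtml = await (await fetch(PAGE_URL)).text();
  const rawCount = (rawHtml.match(/<article\b/g) ?? []).length;

  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(PAGE_URL, { waitUntil: "networkidle0" }); // give lazy content a chance to settle
  const renderedCount = await page.$$eval("article", (els) => els.length);
  await browser.close();

  console.log(`Posts in raw HTML: ${rawCount}; after rendering: ${renderedCount}`);
  // A large gap points at content that only exists once scripts run, which is exactly
  // the content a crawler may never wait around for.
}

compareRawVsRendered().catch(console.error);
```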