How does Progressive Loading, aka what Facebook does, impact proper search indexation?
-
My client is planning on integrating progressive loading into their main product level pages (those pages most important to conversions and revenue).
I'm not familiar with "progressive loading" but was told this is what Facebook does. Currently, the site's pages are tabbed and use Ajax. Would changing this up to include progressive loading have any negative impact?
If anyone can help me understand what this is and how it might impact a site from an SEO perspective, please let me know.
thanks a ton!!
Janet
-
OK, long time in getting back to this, but here's what the client was actually referring to, and I found a good post on optimizing for the new approach: SEO Tips for Infinite Scrolling | Adam Sherk - an excellent read! I wasn't sure exactly what this programmer/client was referring to, but this was it. Thanks all for the help!
-
I think what your developer is talking about, and what Facebook does, is the idea of all of your content being on one page. Progressive loading, or "infinite scroll," is when you scroll to the bottom of a page (e.g. a category page on your blog) and more content loads on the page itself, as opposed to having to click through to page 2 of results to view more content.
The problem with doing this is that even though the content continues to load on the same URL, it's being pulled from somewhere else - everything beyond that first set of content is loaded by a JavaScript call. That means search engines can't index the content that loads as the user scrolls down, and users with JavaScript turned off won't be able to view the rest of your content either. This can be a big problem on main product-level pages like your client is considering, since any links to other products beyond that initial page load won't be crawled by search engines.
If you're going to go the infinite scrolling/progressive loading route, make sure that when JavaScript is disabled, there's a crawlable "next page" link to a new, static URL for the next page of results. Basically, make sure that there's a more old-school "previous page/next page" environment with static page URLs that search engines and users without JavaScript can browse, in addition to your progressive loading page.
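To make that progressive-enhancement setup concrete, here's a minimal sketch (the URLs, IDs, and markup are all hypothetical, not how Facebook or any particular site actually implements it): the server renders a plain, crawlable "next page" link pointing at a static paginated URL, and JavaScript, when available, intercepts that link and loads the next page's content in place instead.

```html
<!-- Hypothetical category page at /products?page=2 (URLs are illustrative).
     The "next" link is plain HTML, so crawlers and no-JS users can follow it. -->
<div id="product-list">
  <!-- first page of products, rendered server-side -->
</div>

<a id="next-link" href="/products?page=3" rel="next">Next page</a>

<script>
  // Progressive enhancement: with JS enabled, intercept the link and
  // append the next page's products to the current page instead of navigating.
  document.getElementById('next-link').addEventListener('click', function (e) {
    e.preventDefault();
    var link = this;
    fetch(link.href)
      .then(function (res) { return res.text(); })
      .then(function (html) {
        // Parse the fetched page and pull out its product list (sketch only).
        var doc = new DOMParser().parseFromString(html, 'text/html');
        document.getElementById('product-list').innerHTML +=
          doc.getElementById('product-list').innerHTML;
        // Advance the link to the following page so loading can continue,
        // or remove it when there are no more pages.
        var next = doc.getElementById('next-link');
        if (next) { link.href = next.href; } else { link.remove(); }
      });
  });
</script>
```

With this pattern, users with JavaScript get the infinite-scroll experience, while search engines and no-JS users still see a crawlable chain of static paginated URLs.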
Here's a link to a similar question from last year that has more information: http://www.seomoz.org/q/infinite-scrolling-vs-pagination-on-an-ecommerce-site
-
That doesn't make much sense to me.
Just look at how Facebook loads... All the text pops right up, and then the images filter in. It just doesn't make any sense to me how it could 'exclude' anything.
Is there a way you could implement it on a small scale just to test it in the beginning? Maybe a page or two, or just a section of the site to start with... Then you would at least have some data to look at and help you make an informed decision.
I haven't been in this part of the world for very long, but I know that progressive loading isn't something that has popped up much in my research/reading. Even when I looked around (briefly) I didn't find anything that connected to SEO.
-
Thank you Modulusman - I thought so too - but the way the programmer was talking made it seem like it was some major exclusion of content or something.
Thanks for your input!
-
I may be wrong here... but isn't progressive loading mostly for images?
If this is what you're talking about, I'm not sure it would make much of a difference in how things are indexed. It seems like "once upon a time" things had to be saved a certain way, but I'm not even sure that's the case anymore.
It may help with mobile conversion... depending on whether you're more focused on copy or media.
I know this isn't much, but maybe it will jog something for someone.