Infinite Scrolling on Publisher Sites - is VentureBeat's implementation really SEO-friendly?
-
I've just begun a new project auditing a news publisher's site. At some point in the past, in order to increase pageviews (and thus advertising revenue), they implemented functionality so that as many as 5 different articles load on each article page. All of the articles load at the same time, and judging by Google's cache and the errors flagged in Search Console, Google treats the page as one big mass of content rather than separate pages. It's also worth noting that as a user scrolls down, the URL does change when they reach the next article.
My initial thought was to remove this functionality and just load one article per page. However, I happened to notice that VentureBeat.com uses something similar.
They use infinite scrolling so that the other articles on the page (in a 'feed' style) only load when a user scrolls to the bottom of the first article. I checked Google's cached versions of the pages and it seems that Google also only reads the first article, which looks like an ideal solution. It has the added benefit of speeding up the page's load time too.
My question is: is VentureBeat's implementation actually that SEO-friendly or not?
VentureBeat have 'sort of' followed Google's guidelines on implementing infinite scrolling (https://webmasters.googleblog.com/2014/02/infinite-scroll-search-friendly.html) by using prev and next tags for pagination (https://support.google.com/webmasters/answer/1663744?hl=en). However, isn't the point of pagination to list multiple pages in a series (i.e. page 2, page 3, page 4, etc.) rather than just other related articles?
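For context, the pattern Google's blog post describes pairs the infinite-scroll feed with crawlable, paginated component pages, each declaring its neighbours in the series. A minimal sketch of that markup (the URLs here are hypothetical, not VentureBeat's actual structure):

```html
<!-- Component page 2 of a paginated series; the same content is also
     appended to the infinite-scroll feed for users who keep scrolling -->
<head>
  <link rel="canonical" href="https://example.com/news/page/2">
  <link rel="prev" href="https://example.com/news/page/1">
  <link rel="next" href="https://example.com/news/page/3">
</head>
```

The oddity you're describing is that VentureBeat points rel-next at a related article rather than at "page 2" of the same series, which is not what this pattern was designed for.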
Here's an example - http://venturebeat.com/2016/11/11/facebooks-cto-explains-social-networks-10-year-mission-global-connectivity-ai-vr/
It would be interesting to hear from anyone who has dealt with this first-hand, or who just has an opinion.
Thanks in advance!
Daniel
-
Totally agreed, Daniel! I'd also say it's our job to set expectations and be clear about when something is a test vs when something will more than likely work. Consulting is all about setting expectations!
-
Thanks a lot for your thoughts on this John. Really appreciate you taking the time to look into it.
You make a great point about not always copying competitors without testing first. Because the functionality is rolled out on such a wide scale, it will always be a hard case to put to the client, knowing that they'll lose out on advertising revenue in the short term. Regardless, I think it's our job as SEOs to first and foremost propose the most SEO-friendly implementation possible.
-
This is actually a really interesting question. I looked at their category pages (e.g. http://venturebeat.com/tag/ar-vr-weekly/) and those seem to be set up correctly to handle infinite scroll, as they point search engines to the next page.
I've not come across this with infinite scroll on articles, though. I'm sure they've tested it extensively to figure out the best way to point search engines to further articles, but who really knows whether it's effective. If it's still in place, I'd assume they've seen positive signs, but it's definitely a non-standard implementation of rel-next/prev!
This does bring up a good point about copying/not copying a competitor's strategy. They have this implemented, but would it work for your own site/business? Maybe, but maybe not. We can't be sure until we test it ourselves (or speak with someone at VentureBeat who wants to share their learnings :-)). If you know when it was rolled out you could benchmark there and look at SEMrush or another tool to see their organic visibility and from there draw at least some correlation, if not causation.
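To make that benchmarking idea concrete, here's a rough sketch of the before/after comparison. All of the visibility scores and the rollout date below are made up for illustration; in practice they would come from an export out of SEMrush or a similar visibility tool:

```python
from datetime import date
from statistics import mean

def before_after_visibility(series, rollout):
    """Compare average organic visibility before and after a rollout date.

    series: list of (date, visibility_score) pairs, e.g. monthly exports
    rollout: the date the infinite-scroll change went live
    """
    before = [score for d, score in series if d < rollout]
    after = [score for d, score in series if d >= rollout]
    return mean(before), mean(after)

# Hypothetical monthly visibility scores around a made-up rollout date
series = [
    (date(2016, 6, 1), 40.0),
    (date(2016, 7, 1), 42.0),
    (date(2016, 8, 1), 41.0),
    (date(2016, 9, 1), 47.0),  # rollout month
    (date(2016, 10, 1), 50.0),
    (date(2016, 11, 1), 53.0),
]
before_avg, after_avg = before_after_visibility(series, date(2016, 9, 1))
print(before_avg, after_avg)  # 41.0 vs 50.0
```

As noted above, a jump like this is correlation at best, not causation; algorithm updates, seasonality, and other site changes can all move the same numbers.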
Thanks for flagging this up! It's cool to see.
-
It depends on the application and other design aspects.
I have seen websites that implement the same thing and, like morons, keep a never-accessible footer there as well... you have no idea how impossible it was to get to the social bar/links at the bottom.
To be honest, you have to think of the user experience: while there may be good technical reasons for such a design, in the end you must consider what the user goes through and what they want to get out of the page. A/B testing these kinds of things wouldn't hurt either.
But honestly, only 'feeds' should behave this way: a Facebook feed, a Twitter feed, a news feed. Even then, each application should be considered with care.
Disclosure: I personally hate this behavior by default... basically the only places I find it acceptable are Facebook and Twitter.