How to handle large numbers of comments?
-
First, the good news: one site that I've been working on has seen an increase in traffic from 2k/month to 80k!
As well as lots of visitors, the site is also getting lots of comments, with one page getting more than 70 comments/day and showing no sign of slowing down! There are approximately 3,000 comments in total, and growing!
What is the best approach for handling this? I'm not talking about review/approval/response workflows, but about the way these comments are presented on the website, taking both SEO and usability into account.
Does anyone have any particular recommendations? Options I've considered are:
- Just show the most recent x comments and ignore the rest. (Nobody is going to read 3000 comments!)
- Paginate comments (risk of duplicate content? Using Ajax could hide long-tail phrases in comments?)
- Show all comments (page load speed is suffering and this is likely to be causing problems for mobile visitors)
How do active comments on a page contribute to an article's freshness?
Any thoughts would be greatly appreciated.
-
Hi Paul. On many CMSs you'll find that additional comments don't change the page's Last-Modified HTTP header, or indeed the posted date in the body of the article. The comments are so far down the page that their perceived importance is going to be pretty low.
That said, active comments do show that there's significant visitor engagement, which has got to be a good thing!
Interesting question about running a poll regarding the order of comments. I think however the order of the comments can work either way depending on the content/context.
For example, "news"-type articles with a relatively short shelf-life tend to work better with comments in chronological order. There tend to be fewer comments (which dry up as the article ages), so the ability to follow discussions in the comments is greatly improved.
For "evergreen" content it doesn't work so well. It can be jarring to come to the comments and be presented with one from 5 years ago!
The other SEO issue related to comments (especially out of the box on many CMSs) is the use of links (followed or nofollowed).
If I've got a VERY popular page that's earning lots of real links, having all those links in the comments is going to eat into the page equity that's available to other pages I'm linking to on my own site. Paginating comments might be one way to address this?
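If nofollowing comment links isn't something the CMS does out of the box, it can be bolted on when rendering comment HTML. A rough sketch (the regex is only illustrative; a real implementation should use a proper HTML parser):

```javascript
// Hedged sketch: add rel="nofollow" to any links inside comment HTML
// so they stop drawing on the page's link equity. Anchors that already
// carry a rel attribute are left untouched.
function nofollowCommentLinks(commentHtml) {
  return commentHtml.replace(/<a\s+(?![^>]*\brel=)/gi, '<a rel="nofollow" ');
}

console.log(nofollowCommentLinks('<a href="https://example.com">link</a>'));
// <a rel="nofollow" href="https://example.com">link</a>
```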
I'm hoping to get some time to make the changes to the page in question - it'll be interesting to see what (if anything) changes!
Thanks!
-
My understanding of the freshness aspect of the algorithm is that just adding or changing content on a page won't help it look more "recent" to search engines. So new comments aren't really a benefit there.
As a user, I prefer comments that appear in chronological order, but I know many who prefer reverse chrono. That would be a really good question for an interactive poll on the site. If visitors are that engaged with comments, you'd likely get a large enough response to be statistically significant.
The big SEO issue I encounter from large numbers of comments is that all the extra content can dilute the original keyword focus of the page as you created it. Sure, there may be long-tail phrases introduced, but if they start to override the terms you were originally trying to focus on & rank for, things can get messy. Not suggesting dropping comments, obviously, but paginating them with a canonical back to the original post might at least partly help.
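That pagination-with-canonical idea could be sketched like this (the `?comment-page=` parameter name is just a hypothetical URL pattern, not any particular CMS's convention):

```javascript
// Sketch: build paginated comment-page URLs that all declare a canonical
// back to the original post, so the paginated copies aren't treated as
// duplicate content.
function buildCommentPages(postUrl, totalComments, perPage) {
  const pageCount = Math.ceil(totalComments / perPage);
  const pages = [];
  for (let page = 1; page <= pageCount; page++) {
    pages.push({
      url: page === 1 ? postUrl : `${postUrl}?comment-page=${page}`,
      // Every comment page points back at the original article.
      canonical: postUrl,
    });
  }
  return pages;
}

// Example: 3,000 comments at 50 per page → 60 pages, all canonicalized
// to the original article URL.
const pages = buildCommentPages("https://example.com/article", 3000, 50);
console.log(pages.length);       // 60
console.log(pages[1].url);       // https://example.com/article?comment-page=2
console.log(pages[1].canonical); // https://example.com/article
```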
I'm also curious whether, if the comments all repeat the target key phrases too frequently, the page could look keyword-stuffed. I have no proof of that, unfortunately, just the suspicion.
And yeah, whatever you decide will definitely have to address the page-speed issue for visitors.
Paul
-
Thanks Greg, I'd not considered "lazy loading", although while this is going to help with loading times I'm still a little concerned about page size! At least with user controlled pagination it's their choice to load more comments...
-
Thanks EGOL. I totally understand your point about respecting visitors who take the time to leave a comment. What makes it harder is that effort is being spent answering questions and engaging visitors in the comments, which gets lost if we arbitrarily cut off comments.
-
Thank you!
I see that now. That looks great. Visitors can get to all comments, but page-load time is saved.
-
EGOL, just to clarify...
With lazy loading and displaying only 20 comments, more comments are displayed as you scroll down, rather than the page loading all 3,000 comments at once.
In other words, the comments won't be hidden, just tucked away and loaded as needed as the user scrolls down the page.
http://whatis.techtarget.com/definition/lazy-loading-dynamic-function-loading
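The batching behind that can be as simple as slicing the comment list; the scroll trigger itself (e.g. an IntersectionObserver watching a sentinel element at the bottom of the comments) runs in the browser, so this sketch models only the logic:

```javascript
// Sketch of the "show 20, load more on scroll" idea: each scroll-to-bottom
// event fetches and appends the next batch of comments.
function nextCommentBatch(allComments, alreadyShown, batchSize = 20) {
  return allComments.slice(alreadyShown, alreadyShown + batchSize);
}

// 3,000 comments, 20 at a time: the first paint renders one batch,
// and each subsequent trigger appends the next.
const comments = Array.from({ length: 3000 }, (_, i) => `comment ${i + 1}`);
console.log(nextCommentBatch(comments, 0).length);    // 20
console.log(nextCommentBatch(comments, 2980).length); // 20
console.log(nextCommentBatch(comments, 3000).length); // 0 (nothing left)
```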
Greg
-
I would paginate.
People who leave comments may come back a couple days later to see the comments left after theirs. I think that it would be disrespectful of these dedicated visitors to show only some of the comments.
Take care of these people. They are your most important asset.
-
I would go with your first point.
The more content on the page the better. Even better is user-generated content!
Perhaps, for user experience, display only 20 comments and wrap the rest under "lazy loading" (a suggestion from the developer sitting next to me).
In other words, let the bots see all 3,000 comments on the same page, but so the page doesn't take days to load for users, incorporate the "lazy loading" feature...
GREG