How to handle large numbers of comments?
-
First the good news. One site that I've been working on has seen an increase in traffic from 2k/month to 80k!
As well as lots of visitors, the site is also getting lots of comments, with one page getting more than 70 comments/day and showing no sign of slowing down! Approximately 3,000 comments in total and growing!
What is the best approach for handling this? I'm not talking about the review/approval/response workflow, but just about the way these comments are presented on the website, taking both SEO and usability into account.
Does anyone have any particular recommendations? Options I've considered are:
- Just show the most recent x comments and ignore the rest. (Nobody is going to read 3000 comments!)
- Paginate comments (risk of duplicate content? Using Ajax could hide long-tail phrases in comments?)
- Show all comments (page load speed is suffering and this is likely to be causing problems for mobile visitors)
How do active comments on a page contribute to an article's freshness?
Any thoughts would be greatly appreciated.
-
Hi Paul. On many CMSs you'll find that additional comments don't change the page's Last-Modified HTTP header, or indeed the posted date in the body of the article. The comments are so far down the page that their perceived importance is going to be pretty low.
That said, active comments do show that there's significant visitor engagement, which has got to be a good thing!
Interesting question about running a poll regarding the order of comments. I think however the order of the comments can work either way depending on the content/context.
For example, "news" type articles with a relatively short shelf-life tend to work better with comments in chronological order. There tend to be fewer comments (which dry up as the article ages), so the ability to follow discussions in the comments is greatly improved.
For "evergreen" content it doesn't work so well. It can be jarring to come to the comments and be presented with one from 5 years ago!
The other SEO issue related to comments (especially out of the box on many CMSs) is the use of links (followed or nofollowed).
If I've got a VERY popular page that's earning lots of real links, having all those links in the comments is going to be eating into the page equity that's available to the other pages I'm linking to on my own site. Paginating comments might be one way to mitigate this?
I'm hoping to get some time to make the changes to the page in question - it'll be interesting to see what (if anything) changes!
Thanks!
-
My understanding of the freshness aspect of the algorithm is that just adding or changing content on a page won't help it look more "recent" to the SE's. So new comments aren't really a benefit there.
As a user, I prefer comments that appear in chronological order, but I know many who prefer reverse chrono. That would be a really good question for an interactive poll on the site. If visitors are that engaged with comments, you'd likely get a large enough response to be statistically significant.
The big SEO issue I encounter from large numbers of comments is that all the extra content can dilute the original keyword focus of the page as you created it. Sure, there may be long-tail phrases introduced, but if they start to override the terms you were originally trying to focus on & rank for, things can get messy. Not suggesting dropping comments, obviously, but paginating them with a canonical back to the original post might at least partly help.
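To make the pagination-plus-canonical idea concrete, here's a rough sketch of how the comment pages and their canonical targets might be laid out. The `?cpage=N` URL scheme and all the names here are mine for illustration, not from any particular CMS:

```typescript
// Sketch: split comments into pages, with every comment page's canonical
// pointing back at the original article. URL scheme (?cpage=N) is
// hypothetical -- adapt to whatever your CMS actually emits.
interface CommentPage {
  url: string;        // URL this page of comments lives at
  canonical: string;  // canonical target: the original article
  comments: number[]; // indices of the comments shown on this page
}

function paginateComments(
  articleUrl: string,
  totalComments: number,
  pageSize: number
): CommentPage[] {
  const pages: CommentPage[] = [];
  const pageCount = Math.max(1, Math.ceil(totalComments / pageSize));
  for (let p = 0; p < pageCount; p++) {
    const start = p * pageSize;
    const end = Math.min(start + pageSize, totalComments);
    pages.push({
      // the first page of comments lives on the article URL itself
      url: p === 0 ? articleUrl : `${articleUrl}?cpage=${p + 1}`,
      canonical: articleUrl,
      comments: Array.from({ length: end - start }, (_, i) => start + i),
    });
  }
  return pages;
}

const pages = paginateComments("https://example.com/post", 3000, 50);
console.log(pages.length); // 60 pages of 50 comments each
```

Since every page canonicals back to the article, any long-tail phrases in later comment pages consolidate to the original post rather than competing with it.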
I'm also curious whether, if the comments all repeat the target key phrases too frequently, the page could look keyword stuffed. I have no proof of that, unfortunately, just the suspicion.
And yeah, whatever you decide will definitely have to address the page speed issue for visitors.
Paul
-
Thanks Greg, I'd not considered "lazy loading". While it's going to help with loading times, I'm still a little concerned about total page size! At least with user-controlled pagination it's the visitor's choice to load more comments...
-
Thanks EGOL. I totally understand your point about respecting visitors who take the time to leave a comment. What makes it harder is that effort is being spent answering questions and engaging visitors in the comments, which gets lost if we arbitrarily cut off comments.
-
Thank you!
I see that now. That looks great. Visitors can get to all comments but pageload time is saved.
-
EGOL, just to clarify...
With Lazy Loading and displaying only 20 comments, more comments get displayed when you scroll down, rather than having the page load all 3000 comments at once.
In other words, the comments won't be hidden, just tucked away and loaded as needed when scrolling down the page.
http://whatis.techtarget.com/definition/lazy-loading-dynamic-function-loading
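The batching logic behind this is simple enough to sketch. A minimal, illustrative version (class and method names are mine, not from any library; in the page this would be driven by a scroll listener or an IntersectionObserver on a sentinel element at the bottom of the comment list):

```typescript
// Sketch of lazy-loading batching: render the first 20 comments, then
// append another batch each time the visitor nears the bottom. The DOM
// wiring is left out so the loading logic itself stays self-contained.
class CommentLoader {
  private rendered = 0;

  constructor(private total: number, private batchSize = 20) {}

  // indices of the next batch to fetch and render; empty when exhausted
  nextBatch(): number[] {
    const start = this.rendered;
    const end = Math.min(start + this.batchSize, this.total);
    this.rendered = end;
    return Array.from({ length: end - start }, (_, i) => start + i);
  }

  get done(): boolean {
    return this.rendered >= this.total;
  }
}

// In the page, roughly:
//   observer = new IntersectionObserver(() => render(loader.nextBatch()));
const loader = new CommentLoader(3000);
console.log(loader.nextBatch().length); // 20 comments on first paint
```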
Greg
-
I would paginate.
People who leave comments may come back a couple days later to see the comments left after theirs. I think that it would be disrespectful of these dedicated visitors to show only some of the comments.
Take care of these people. They are your most important asset.
-
I would go with your first point.
The more content on the page the better, and user-generated content is even better!
Perhaps for user experience, display only 20 comments and wrap the rest under "lazy loading" (suggestion from the developer sitting next to me).
In other words, let the bots see all 3000 comments on the same page, but so the page doesn't take days to load for users, incorporate the "lazy loading" feature...
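One way to read this suggestion is as progressive enhancement: the server renders all comments into the HTML (so crawlers with no JavaScript see everything), and a script hides everything past the first 20 until the visitor scrolls. A tiny sketch of the visibility logic, with hypothetical helper names (in a real page the flags would map to a CSS class toggled on scroll):

```typescript
// Which of the server-rendered comments are currently visible?
function visibleFlags(total: number, revealed: number): boolean[] {
  return Array.from({ length: total }, (_, i) => i < revealed);
}

// Each scroll-near-bottom event reveals another batch.
function reveal(current: number, total: number, step = 20): number {
  return Math.min(current + step, total);
}

let shown = 20; // initial paint
console.log(visibleFlags(3000, shown).filter(Boolean).length); // 20
shown = reveal(shown, 3000); // 40 visible after the first scroll
```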
GREG