How to handle large numbers of comments?
-
First the good news: one site that I've been working on has seen an increase in traffic from 2k visits/month to 80k!
As well as lots of visitors, the site is also getting lots of comments, with one page getting more than 70 comments/day and showing no sign of a slowdown! Approximately 3,000 comments in total and growing!
What is the best approach for handling this? I'm not talking about review/approval/response, just the way these comments are presented on the website, taking both SEO and usability into account.
Does anyone have any particular recommendations? Options I've considered are:
- Just show the most recent x comments and ignore the rest. (Nobody is going to read 3000 comments!)
- Paginate comments (risk of duplicate content? Using Ajax could hide long-tail phrases in comments?)
- Show all comments (page load speed is suffering and this is likely to be causing problems for mobile visitors)
How do active comments on a page contribute to an article's freshness?
Any thoughts would be greatly appreciated.
-
Hi Paul. On many CMSs you'll find that additional comments don't change the page's Last-Modified HTTP header, or indeed the posted date in the body of the article. The comments are so far down the page that their perceived importance is going to be pretty low.
That said, active comments do show that there's significant visitor engagement, which has got to be a good thing!
Interesting question about running a poll regarding the order of comments. I think, however, the order of comments can work either way depending on the content/context.
For example, "news" type articles with a relatively short shelf-life tend to work better with comments in chronological order. There tend to be fewer comments (which dry up as the article ages), so the ability to follow discussions in the comments is greatly improved.
For "evergreen" content it doesn't work so well. It can be jarring to come to the comments and be presented with one from 5 years ago!
The other SEO issue related to comments (especially out of the box on many CMSs) is the use of links (followed or nofollowed).
If I've got a VERY popular page that's earning lots of real links, having all those links in the comments is going to eat into the page equity that's available to other pages I'm linking to on my own site. Paginating comments might be one way to mitigate this?
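To make that concrete, here's a rough sketch of nofollowing comment links (everything here is hypothetical — the function name is mine, and a real implementation would use the CMS's template layer or a proper HTML parser rather than a string replace):

```javascript
// Hypothetical sketch: add rel="nofollow" to links in a block of
// comment HTML so they don't eat into the page's link equity.
// Links that already carry a rel attribute are left untouched.
function nofollowCommentLinks(html) {
  return html.replace(/<a\s+(?![^>]*\brel=)/gi, '<a rel="nofollow" ');
}

console.log(nofollowCommentLinks('<a href="https://example.com">spam?</a>'));
// <a rel="nofollow" href="https://example.com">spam?</a>
```

Many CMSs do this out of the box for comment bodies; the point is just that link attributes in the comment area are worth checking rather than assuming.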
I'm hoping to get some time to make the changes to the page in question - it'll be interesting to see what (if anything) changes!
Thanks!
-
My understanding of the freshness aspect of the algorithm is that just adding or changing content on a page won't help it look more "recent" to the search engines. So new comments aren't really a benefit there.
As a user, I prefer comments that appear in chronological order, but I know many who prefer reverse chrono. That would be a really good question for an interactive poll on the site. If visitors are that engaged with comments, you'd likely get a large enough response to be statistically significant.
The big SEO issue I encounter from large numbers of comments is that all the extra content can dilute the original keyword focus of the page as you created it. Sure, there may be long-tail phrases introduced, but if they start to override the terms you were originally trying to focus on & rank for, things can get messy. Not suggesting dropping comments, obviously, but paginating them with a canonical back to the original post might at least partly help.
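As a sketch of that pagination-plus-canonical idea (the function and URL scheme here are hypothetical, not from any particular CMS), each comment page would carry a canonical pointing back at the original post:

```javascript
// Hypothetical sketch: split comments into pages, each of which
// canonicalizes back to the original post so the paginated comment
// URLs don't compete with it as duplicate content.
function paginateComments(comments, perPage, postUrl) {
  const pages = [];
  for (let i = 0; i < comments.length; i += perPage) {
    pages.push({
      url: `${postUrl}?comment-page=${pages.length + 1}`,
      canonical: postUrl, // every comment page points back to the post
      comments: comments.slice(i, i + perPage),
    });
  }
  return pages;
}

// Example: 3000 comments at 20 per page -> 150 comment pages
const pages = paginateComments(
  Array.from({ length: 3000 }, (_, i) => `comment ${i + 1}`),
  20,
  "https://example.com/popular-post"
);
console.log(pages.length);       // 150
console.log(pages[0].canonical); // https://example.com/popular-post
```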
I'm also curious whether, if the comments all repeat the target key phrases too frequently, the page could look keyword-stuffed. I have no proof of that, unfortunately, just the suspicion.
And yeah, whatever you decide will definitely have to address the page-speed issue for visitors.
Paul
-
Thanks Greg, I'd not considered "lazy loading". While this is going to help with loading times, I'm still a little concerned about page size! At least with user-controlled pagination it's the visitor's choice to load more comments...
-
Thanks EGOL. Totally understand your point about respecting visitors who take the time to leave a comment. What makes it harder is that effort is being spent answering questions/engaging visitors in the comments, which gets lost if we arbitrarily cut off comments.
-
Thank you!
I see that now. That looks great. Visitors can get to all comments but page-load time is saved.
-
EGOL, just to clarify...
With Lazy Loading and displaying only 20 comments, more comments get displayed when you scroll down, rather than having the page load all 3000 comments at once.
In other words, the comments won't be hidden, just tucked away and loaded as needed as the visitor scrolls down the page.
http://whatis.techtarget.com/definition/lazy-loading-dynamic-function-loading
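A rough sketch of the batching logic (the names and the batch size of 20 are my own invention, not a Squarespace or CMS API): the page renders the first batch on load, then appends the next one each time the visitor nears the bottom of the comment list:

```javascript
// Hypothetical sketch of lazy-loaded comment batches: all 3000
// comments exist in the page's data, but only one batch of 20 is
// rendered at a time as the visitor scrolls.
function nextBatch(allComments, alreadyShown, batchSize = 20) {
  return allComments.slice(alreadyShown, alreadyShown + batchSize);
}

// In the browser this would be wired to scrolling, e.g. with an
// IntersectionObserver watching a sentinel element after the last
// rendered comment:
//
//   new IntersectionObserver(([entry]) => {
//     if (entry.isIntersecting) renderComments(nextBatch(comments, shown));
//   }).observe(document.querySelector("#comments-sentinel"));

const comments = Array.from({ length: 3000 }, (_, i) => `comment ${i + 1}`);
console.log(nextBatch(comments, 0).length);    // 20 rendered on load
console.log(nextBatch(comments, 2990).length); // final partial batch: 10
```

Whether the unrendered comments still count for SEO depends on how they're delivered (in the initial HTML vs. fetched on scroll), which is worth testing on the site in question.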
Greg
-
I would paginate.
People who leave comments may come back a couple of days later to see the comments left after theirs. I think it would be disrespectful to these dedicated visitors to show only some of the comments.
Take care of these people. They are your most important asset.
-
I would go with your first point.
The more content on the page the better. Even better if it's user-generated content!
Perhaps for user experience, display only 20 comments and load the rest via "lazy loading" (a suggestion from the developer sitting next to me).
In other words, let the bots see all 3,000 comments on the same page, but for user experience, so the page doesn't take days to load, incorporate the "lazy loading" feature...
GREG