User generated content (Comments) - What impact do they have?
-
Hello MOZ stars!
I have a question regarding user comments on article pages. I know that user generated content is good for SEO, but how much impact does it really have?
For your information:
1 - All comments appear in the source code and are crawled by spiders.
2 - A visitor can comment on a page for up to 60 days.
3 - The number of comments depends on the topic; we usually get between 3 and 40.

My questions:
1 - If we were to remove comments completely, what impact would it have from an SEO perspective? (I know you can't be certain, but please make an educated guess if possible.)
2 - If it has a negative and/or positive impact, please specify why!

If anything is unclear or you want more information, don't hesitate to ask and I'll try to specify.
Best regards,
Danne -
Not what you asked, but beyond SEO I would say comments do have an effect. I have heard advertisers say they were looking for sites with comments. Their thinking was that they wanted popular sites with followers, and that is how they judged it.
-
I do think that negative comments hurt UX and, eventually, the bottom line. No one wants to work with a company that has a ton of negative feedback. Which is exactly why user generated content is so important to searchers: it is a candid review of a company or product. There can be middle-of-the-road reviews, like a 3-star rating because customer service was great but the product stinks. I think those kinds of comments and reviews are necessary and overall good for UX.
In my opinion as a consumer, I want to see the bad comments. I always use the example of shoes and clothes. I don't want to find out when I get a pair of shoes in the mail that the sizes run a little small. If I see that in the comments or reviews ahead of time I will know to buy a size bigger and save myself the trouble of returning the product. These kinds of "negative" reviews are useful to a searcher and I wouldn't remove them.
-
In addition to what David said, I would still consider leaving the comments option open (as long as it isn't being over-used).
Another factor to consider (especially in Barry's case) is what kind of comments people post. Do they have a positive or a negative connotation? Are they on-topic or not?
If you have a community like Moz's, where I see a lot of good, complimentary comments and responses to each of the posts, I'd consider indexing the comments.
What do you think? David, Monica?
-
I also read that article. Barry seemed to think that the comments were hurting the site rather than helping. Comments can get off topic or stray away from the original article. If I remember correctly, Barry made the comments viewable to users, but not crawlable by Google, as a result.
For return traffic, I think comments are great. After seeing the results that Barry shared, I'm not sure if it is still a good idea to have them included in the page crawl.
Here is the article where he spoke about this: https://www.seroundtable.com/google-panda-ser-poll-19675.html
IMO, I would leave the comments on the pages, but block them from being indexed, or use JavaScript to load the comments if possible.
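One way to implement the "leave comments visible, keep them out of the crawl" idea is to render the article without the comment text and let a client-side script fetch the comments from a separate endpoint. Below is a minimal, hypothetical sketch of the server side (names and data are made up, not from this thread; note that Google can execute JavaScript these days, so this is a soft barrier rather than a guarantee):

```python
import json

# Hypothetical article and comment data; names are illustrative only.
article = {"title": "Example article", "body": "Main article text."}
comments = [{"author": "reader1", "text": "Great post!"}]

def render_initial_html(article):
    """Server-rendered HTML with no comment text in it, only an empty
    placeholder that a client-side script fills in after page load."""
    return (
        "<html><body>"
        f"<h1>{article['title']}</h1>"
        f"<p>{article['body']}</p>"
        '<div id="comments"></div>'  # populated by a fetch() in the browser
        "</body></html>"
    )

def comments_json(comments):
    """Payload a separate /comments endpoint would return to that script."""
    return json.dumps(comments)

html = render_initial_html(article)
assert "Great post!" not in html  # comment text absent from crawlable source
assert "Great post!" in comments_json(comments)  # but still served to users
```

The comments still render for visitors, while the initial HTML that most crawlers index contains only the placeholder.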
-
As I mentioned in my response, that is one case.
But I must agree with Monica: you should place the value on searchers and user experience.
-
User generated content, in my opinion, is extremely useful. It is unique, it is informative most of the time, and it is valuable to future searchers. In this instance I would be more concerned about the value to searchers and to user experience than about the SEO effects.
-
Hi Danne,
I remember reading a post about this from Barry Schwartz on seroundtable.com: https://www.seroundtable.com/google-panda-ser-hurt-comments-19652.html
Read it through; it describes the effect of user generated content (especially comments) quite well.
This is one specific case, though; I am sure it is not a general rule.
Gr., Keszi
Related Questions
-
Removing .html from URLs - impact on rankings?
Good evening Mozzers. A couple of questions which I hope you can help with. Here's the first: are we likely to see ranking changes if we remove the .html from the site's URLs? For example, website.com/category/sub-category.html would change to website.com/category/sub-category/. We will of course 301 redirect to the new, user-friendly URLs, but I am wondering if anyone has previous experience of implementing this change and how it has affected rankings. By having the .html in the URLs, does this stop link juice flowing back to the root category?

Second question: if one page can be loaded with and without a forward slash "/" at the end, is this a duplicate page, or would Google consider it the same page? I would like to eliminate duplicate content issues if this is the case. For example: website.com/category/ and website.com/category - duplicate content/pages?
Intermediate & Advanced SEO | Jseddon920
-
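For the first question above, the core of the migration is a 1:1 map in which every legacy URL 301s to exactly one canonical form; the same normalization also settles the trailing-slash question, since one variant simply redirects to the other. A rough sketch (paths are the hypothetical examples from the question; the actual redirects would still need to be configured in the web server):

```python
def canonical_url(path):
    """Map a legacy path to its clean, trailing-slash canonical form."""
    if path.endswith(".html"):
        path = path[: -len(".html")]  # strip the extension
    if not path.endswith("/"):
        path += "/"  # pick one trailing-slash form for every page
    return path

# Old URL -> target of its 301 redirect; both the .html variant and the
# slashless variant collapse to a single canonical page.
redirect_map = {old: canonical_url(old) for old in [
    "/category/sub-category.html",
    "/category/sub-category",
]}
```

Because both variants map to the same target, Google sees one page instead of two or three near-duplicates.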
Noindexing Thin News Content for Panda
We've been suffering under a Panda penalty since Oct 2014. We've completely revamped the site, but with this new "slow roll out" nonsense it's incredibly hard to know at what point you have to accept that you haven't done enough yet. We have thousands of news stories going back to 2001, some of which are probably thin and some of which are probably close to other news stories on the internet, being articles based on press releases. I'm considering noindexing everything older than a year just in case; however, that seems like overkill.

The question is: if I mine the log files and only deindex stuff that Google sends no further traffic to after a year, could this be seen as trying to game the algorithm or similar? Also, if the articles are noindexed but still exist, is that enough to escape a Panda penalty, or does the page need to be physically gone?
Intermediate & Advanced SEO | AlfredPennyworth0
-
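The log-mining idea in the question above can be sketched very simply: select articles older than a year that Google sends no traffic to. All data and field names below are hypothetical, and whether noindexing alone is enough to escape Panda is a separate judgment call:

```python
from datetime import date, timedelta

# Hypothetical publish dates and Google-referral counts mined from log files.
articles = {
    "/news/old-press-release": {"published": date(2002, 5, 1), "google_visits": 0},
    "/news/evergreen-guide":   {"published": date(2003, 1, 10), "google_visits": 412},
    "/news/recent-story":      {"published": date(2015, 3, 2), "google_visits": 0},
}

def noindex_candidates(articles, today, min_age_days=365):
    """Return articles older than min_age_days that got no Google traffic."""
    cutoff = today - timedelta(days=min_age_days)
    return sorted(
        path for path, info in articles.items()
        if info["published"] < cutoff and info["google_visits"] == 0
    )

candidates = noindex_candidates(articles, today=date(2015, 6, 1))
# → ['/news/old-press-release']  (the recent story is too new to judge yet)
```

Filtering on both age and traffic, rather than age alone, avoids the "overkill" of noindexing everything older than a year.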
Duplicate Content... Really?
Hi all, My site is www.actronics.eu. Moz reports virtually every product page as duplicate content, flagged as HIGH PRIORITY! I know why: Moz classes a page as duplicate if it is >95% similar in content/code. There's very little I can do about this; although our products are different, the content is very similar, apart from a few part numbers and vehicle make/model. Here's an example:

http://www.actronics.eu/en/shop/audi-a4-8d-b5-1994-2000-abs-ecu-en/bosch-5-3
http://www.actronics.eu/en/shop/bmw-3-series-e36-1990-1998-abs-ecu-en/ate-34-51

Now, multiply this by ~2,000 products across 7 different languages and you'll see we have a big duplicate content issue (according to Moz's Crawl Diagnostics report). I say "according to Moz" as I do not know if this is actually an issue for Google; 90% of our product pages rank, albeit some much better than others. So what is the solution? We're not trying to deceive Google in any way, so it would seem unfair to be hit with a duplicate content penalty; this is a legitimate dilemma where our products differ by as little as a part number. One ugly solution would be to remove the header / sidebar / footer on our product pages, as I've demonstrated here - http://woodberry.me.uk/test-page2-minimal-v2.html - since this removes a lot of page bloat (code) and would bring the page difference down to 80% duplicate. (This is the tool I'm using for checking: http://www.webconfs.com/similar-page-checker.php) Other "prettier" solutions would be greatly appreciated. I look forward to hearing your thoughts. Thanks,
Woody 🙂
Intermediate & Advanced SEO | seowoody
-
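The ">95% similar" test described above can be approximated with a plain sequence matcher, which is roughly what the linked similar-page checker does. A sketch using two made-up product descriptions (not the real page content):

```python
from difflib import SequenceMatcher

# Two hypothetical product descriptions differing only in make, model,
# years, and part number - the situation described in the question.
page_a = "ABS ECU for Audi A4 8D B5 1994-2000, Bosch 5.3. Remanufactured, plug and play."
page_b = "ABS ECU for BMW 3 Series E36 1990-1998, ATE 34.51. Remanufactured, plug and play."

def similarity(a, b):
    """Rough duplicate-content ratio between two text blocks (0.0 to 1.0)."""
    return SequenceMatcher(None, a, b).ratio()

print(round(similarity(page_a, page_b), 2))
```

Running a check like this on the text alone, rather than the full HTML, shows how much of the reported similarity comes from shared header/sidebar/footer code versus the product copy itself.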
Any Good XML Sitemaps Generator?
I was wondering if everyone could recommend which XML sitemap generators they use. I've been using XML-Sitemap, and it's been a little hit and miss for me. On some sites it works great; on others it has serious problems indexing pages. I've also used Google's, but unfortunately it's not very flexible to use. Any recommendations would be much appreciated.
Intermediate & Advanced SEO | alrockn0
-
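For what it's worth, if none of the hosted generators fit, a minimal sitemap in the sitemaps.org format is easy to roll from a URL list with only the standard library (the URLs below are placeholders):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) from a URL list."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap(["https://example.com/", "https://example.com/about/"])
```

Feeding it the site's own URL list (e.g. from a crawl or a database export) sidesteps the indexing problems of third-party crawler-based generators.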
Is hidden content bad for SEO?
I am using this plugin to enable Facebook comments on my blog:
https://wordpress.org/plugins/fatpanda-facebook-comments/

This shows the comments in a Facebook iframe. The plugin author claims it's SEO-friendly, because the comments are also stored in the WordPress database: they are included in the post but hidden. Is that bad for SEO?
Intermediate & Advanced SEO | soralsokal
Unnatural links to your site—impacts links
I got this message in my Google Webmaster Tools: "Unnatural links to your site—impacts links". Does anyone know the difference between "Unnatural links to your site—impacts links" and "Unnatural links to your site"? Thank you, Sina
Intermediate & Advanced SEO | SinaKashani0
-
Duplicate Content
http://www.pensacolarealestate.com/JAABA/jsp/HomeAdvice/answers.jsp?TopicId=Buy&SubtopicId=Affordability&Subtopicname=What%20You%20Can%20Afford
http://www.pensacolarealestate.com/content/answers.html?Topic=Buy&Subtopic=Affordability

I have no idea how the first address exists at all. I ran the SEOmoz tool and got roughly 600 DUPLICATE CONTENT errors! I have errors on content/titles, etc. How do I get rid of all the content being generated from this JAABA/JSP "jibberish"? Please ask questions that will help you help me. I have always been first on Google local, and I have a business that is starting to hurt very seriously from being number three 😞
Intermediate & Advanced SEO | JML11790
-
Virtual Domains and Duplicate Content
So I work for an organization that uses virtual domains. Basically, we have all our sites on one domain, and these sites can also be shown at a different URL. Example:

sub.agencysite.com/store
sub.brandsite.com/store

The problem comes up often: when we move a site to a brand's URL versus hosting it on our URL, we end up with duplicate content. Now, for God knows what damn reason, I currently cannot get my dev team to implement 301s, but they will implement 302s. (Don't ask.) I am also unable to change the robots.txt file for our site. They say that if we allowed people to go in and change this stuff it would be too messy, and somebody would accidentally block a site that was not supposed to be blocked on our domain. (We are apparently incapable toddlers.)

Now I have an old site, sub.agencysite.com/store, ranking for my terms while the new site is not showing up. So I am left with this question: if I want to get the new site ranking, what is the best methodology? I am thinking of doing a 1:1 mapping of all pages, setting up 302 redirects from the old to the new, and then making the canonical tags on the old reflect the new. My only question is how Google will actually view this setup. On one hand I am saying "Hey, Googs, this is just a temp thing," and on the other I am saying "Hey, Googs, give all the weight to this page, got it? Graci!" So, with my limited abilities, can anybody provide me a best-case scenario?
Intermediate & Advanced SEO | DRSearchEngOpt
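The plan above (1:1 mapping, 302s, canonicals on the old pages) can be sketched as below. The URLs are the examples from the question, and note, as the poster himself senses, that a 302 ("temporary") plus a cross-site canonical ("the other page is the real one") send mixed signals, so this is a workaround under constraints, not a recommendation:

```python
# 1:1 mapping of agency-hosted pages to their brand-hosted equivalents
# (example pair from the question; a real site would have one entry per page).
page_map = {
    "http://sub.agencysite.com/store": "http://sub.brandsite.com/store",
}

def canonical_tag(old_url):
    """Canonical tag to embed on the old page, pointing at the new home."""
    return f'<link rel="canonical" href="{page_map[old_url]}" />'

def redirect_response(old_url):
    """Status code and Location header for the per-page 302 redirect."""
    return 302, {"Location": page_map[old_url]}

tag = canonical_tag("http://sub.agencysite.com/store")
status, headers = redirect_response("http://sub.agencysite.com/store")
```

Driving both the redirects and the canonical tags from the same map at least guarantees the two signals never point at different targets for the same page.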