User-generated content (comments) - What impact do they have?
-
Hello MOZ stars!
I have a question regarding user comments on article pages. I know that user-generated content is good for SEO, but how much impact does it really have?
For your information:
1 - All comments appear in the source code and are crawled by spiders.
2 - A visitor can comment on a page for up to 60 days.
3 - The number of comments depends on the topic; we usually get between 3 and 40 comments.

My question:
1 - If we were to remove comments completely, what impact would it have from an SEO perspective? (I know you can't be certain - but please make an educated guess if possible.)
2 - If it has a negative and/or positive impact, please specify why!

If anything is unclear or you want more information, don't hesitate to ask and I'll try to specify.
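For context on point 1 above: a quick, hypothetical way to verify that comments really are in the crawlable source is to fetch the raw HTML (as a crawler sees it before any JavaScript runs) and check for a known comment snippet. This is just an illustrative sketch, not any real Moz or CMS API:

```javascript
// Sketch: given a page's raw HTML (e.g. fetched with curl or "view source"),
// check whether a known comment snippet is present in the server-rendered
// markup - i.e. in what a crawler sees before any JavaScript executes.
function commentsInSource(html, commentSnippet) {
  return html.includes(commentSnippet);
}

// Example: a page whose comments are server-rendered vs. one whose
// comments are injected client-side (only an empty placeholder ships).
const serverRendered = '<article>...</article><div id="comments"><p>Great post!</p></div>';
const clientRendered = '<article>...</article><div id="comments"></div>';
console.log(commentsInSource(serverRendered, "Great post!")); // true
console.log(commentsInSource(clientRendered, "Great post!")); // false
```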
Best regards,
Danne -
Not what you asked, but beyond SEO I would say comments do have an effect. I have heard advertisers say they were looking for sites with comments. Their thinking was that they wanted popular sites with followers, and that is how they judged it.
-
I do think that negative comments hurt UX and, eventually, the bottom line. No one wants to work with a company that has a ton of negative feedback. That is exactly why user-generated content is so important to searchers: it is a candid review of a company or product. There can also be middle-of-the-road reviews, like a 3-star rating because customer service was great but the product stinks. I think those kinds of comments and reviews are necessary and, overall, good for UX.
In my opinion as a consumer, I want to see the bad comments. I always use the example of shoes and clothes. I don't want to find out when I get a pair of shoes in the mail that the sizes run a little small. If I see that in the comments or reviews ahead of time I will know to buy a size bigger and save myself the trouble of returning the product. These kinds of "negative" reviews are useful to a searcher and I wouldn't remove them.
-
In addition to what David said, I would still consider leaving the comments option open (as long as it isn't being over-used).
Another factor to consider (especially in Barry's case) is what kind of comments people post. Do they have a positive or a negative connotation? Are they on topic or not?
If you have a community like Moz's, where IMO you see a lot of good comments and responses that complement each of the posts, I'd consider indexing the comments.
What do you think? David, Monica?
-
I also read that article. Barry seemed to think that the comments were hurting the site rather than helping it. Comments can get off topic or stray away from the original article. If I remember correctly, Barry made the comments viewable to users but not readable by Google as a result.
For return traffic, I think comments are great. After seeing the results that Barry shared, though, I'm not sure it is still a good idea to have them included in the page crawl.
Here is the article where he spoke about this: https://www.seroundtable.com/google-panda-ser-poll-19675.html
IMO, I would leave the comments on the pages, but block them from being indexed or use JavaScript to load the comments, if possible.
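To illustrate that last suggestion: a minimal, hypothetical sketch of loading comments with JavaScript after the page renders, so they stay visible to users while staying out of the initial HTML that crawlers index first. The endpoint name and markup here are assumptions, not any specific platform's API:

```javascript
// Render a list of comments into HTML on the client. Comments are fetched
// as JSON after page load, so they never appear in the server-rendered source.
function renderComments(comments) {
  return comments
    .map((c) => '<div class="comment"><strong>' + escapeHtml(c.author) + "</strong>: " + escapeHtml(c.text) + "</div>")
    .join("");
}

// Minimal escaping so user-generated text can't inject markup.
function escapeHtml(s) {
  return s.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
}

// In the browser (sketch - the /api/comments endpoint is an assumption):
// document.addEventListener("DOMContentLoaded", async () => {
//   const res = await fetch("/api/comments?page=" + encodeURIComponent(location.pathname));
//   document.getElementById("comments").innerHTML = renderComments(await res.json());
// });
```

Note that Google can execute JavaScript when it renders pages, so this hides comments from the initial crawl rather than guaranteeing they are never indexed.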
-
As I mentioned in my response, that is just one case.
But I must agree with Monica: you should place the value on the searchers and the user experience.
-
User-generated content, in my opinion, is extremely useful. It is unique, it is informative most of the time, and it is valuable to future searchers. In this instance I would be more concerned about the value to the searchers and to the user experience than about the SEO effects.
-
Hi Danne,
I remember reading a post about this from Barry Schwartz on seroundtable.com: https://www.seroundtable.com/google-panda-ser-hurt-comments-19652.html
Read it through - it describes the effect of user-generated content (especially comments) quite well.
This is one specific case, though; I am sure it is not a general rule.
Gr., Keszi