User-generated content (comments) - what impact does it have?
-
Hello MOZ stars!
I have a question regarding user comments on article pages. I know that user-generated content is good for SEO, but how much impact does it really have?
For your information:
1 - All comments appear in the source code and are crawled by spiders.
2 - A visitor can comment on a page for up to 60 days.
3 - The number of comments depends on the topic; we usually get between 3 and 40 comments.
My questions:
1 - If we were to remove comments completely, what impact would it have from an SEO perspective? (I know you can't be certain, but please make an educated guess if possible.)
2 - If it has a negative and/or positive impact, please specify why!
If anything is unclear or you need more information, don't hesitate to ask and I'll try to specify.
Best regards,
Danne -
Not what you asked, but beyond SEO I would say comments do have an effect. I have heard advertisers say they were looking for sites with comments. Their thinking was that they wanted popular sites with followers, and that is how they judged it.
-
I do think that negative comments hurt UX and, eventually, the bottom line. No one wants to work with a company that has a ton of negative feedback. Which is exactly why user-generated content is so important to searchers: it is a candid review of a company or product. There can be middle-of-the-road reviews, like a 3-star rating because customer service was great but the product stinks. I think those kinds of comments and reviews are necessary and, overall, good for UX.
In my opinion as a consumer, I want to see the bad comments. I always use the example of shoes and clothes. I don't want to find out when I get a pair of shoes in the mail that the sizes run a little small. If I see that in the comments or reviews ahead of time I will know to buy a size bigger and save myself the trouble of returning the product. These kinds of "negative" reviews are useful to a searcher and I wouldn't remove them.
-
In addition to what David said, I would still consider leaving the comments option open (as long as it isn't over-used).
Another factor to consider (especially in Barry's case) is what kind of comments people post. Do they have a positive or a negative connotation? Are they on topic or not?
If you have a community like Moz's, where I see a lot of good, complimentary comments and responses to each of the posts, I'd consider indexing the comments.
What do you think? David, Monica?
-
I also read that article. Barry seemed to think that the comments were hurting the site rather than helping it. Comments can get off topic, or stray away from the original article. If I remember correctly, Barry made the comments visible to users but not crawlable by Google as a result.
For return traffic, I think comments are great. After seeing the results that Barry shared, I'm not sure if it is still a good idea to have them included in the page crawl.
Here is the article where he spoke about this: https://www.seroundtable.com/google-panda-ser-poll-19675.html
IMO, I would leave the comments on the pages but block them from being indexed, or load them with JavaScript if possible.
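To illustrate the JavaScript option: a minimal sketch of rendering comments client-side so they are not part of the initial HTML payload. The endpoint URL and data shape here are hypothetical, and note that modern Googlebot does render JavaScript, so this is not a guaranteed way to keep content out of the index; blocking the comments endpoint in robots.txt is a more reliable complement.

```javascript
// Build comment markup from data (e.g. fetched from a JSON endpoint),
// instead of baking the comments into the server-rendered HTML.
function renderComments(comments) {
  return comments
    .map(
      (c) =>
        `<div class="comment"><strong>${c.author}</strong><p>${c.text}</p></div>`
    )
    .join("\n");
}

// In the browser you would fetch the data after page load and inject it
// (endpoint and element id are assumptions for this sketch):
//
// fetch("/api/comments?post=123")
//   .then((res) => res.json())
//   .then((data) => {
//     document.getElementById("comments").innerHTML = renderComments(data);
//   });
```

Since the comment HTML only exists after the fetch completes, a crawler that doesn't execute JavaScript sees the page without it.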
-
Like I mentioned in my response, that is one case.
But I must agree with Monica: you should place the value on the searchers and the user experience.
-
User-generated content, in my opinion, is extremely useful. It is unique, it is informative most of the time, and it is valuable to future searchers. In this instance I would be more concerned about the value to the searchers and to user experience than about the SEO effects.
-
Hi Danne,
I remember reading a post about this from Barry Schwartz on seroundtable.com: https://www.seroundtable.com/google-panda-ser-hurt-comments-19652.html
Read it through; it describes the effect of user-generated content (especially comments) quite well.
This is one specific case, though; I am sure it is not a general rule.
Gr., Keszi