Can horrific grammar and spelling in comments hurt the value of an otherwise great page?
-
I've got a website whose pages get lots of comments. Tons of activity, which I would think Google would like (and seems to like). However -- I just can't put this nicely -- most commenters are not very bright. Their grammar and spelling are horrific. These are not foreigners who lack English skills; they are just about all native English speakers, and the site is 99% US traffic. It's a low-income segment of the population.
So, I've been wondering recently if Google will mark down the value of the page due to the bad grammar and spelling in the comments, even if the page's content is otherwise very good and lengthy. I have read that they take grammar and spelling into consideration when looking at a page, but would that include comments, or would they know they are comments and not judge the page on that?
It would be a pain, but maybe I should run all the comments at least through a spell checker? And manually fix their grammar? Problem is, I get about 40 comments a day.
And when I say bad grammar and spelling, I mean REALLY bad. Embarrassing.
-
I would argue that if this is the type of person who could be your customer, you can keep it as is and let Google index it. As you said, it is helping you in the short term. It would also seem to pass any manual review easily, since it is clearly not auto-generated spam or content produced with the intention of manipulating PageRank or search results.
Could it get you filtered in the future? Maybe, but probably not. Is it helping bring in more of the same kind of people? I think it is. Either way, I don't think you have a reason to shy away from such legitimate engagement.
-
I just thought I'd show you all what a typical comment looks like. As bad as this is, it's only about average. They can be much, much worse. And yes, it came in all caps, as many do.
I JUST GOTTEN THIS BUGET PHONE A COUPLE OF WEEKS AGO AND I WANT ,,,, TO USE IT BUT I CN'T CONNECT WHAT SHOULD , I DO I NEED A PHONE REEL BAD CAUSE I'M THU THE GOVERMENT PROGRAM AND I DO HAVE MEDICAL-CAL AND I NEED A PHONE TO CONECT PEOPLE AND I'M ON A PROGRAM WITH THE COUNTY OF LOS ANGELES .AND IF I NEED TO B CONTACT I CAN'T SO WHAT I DO . I BEEN CALLING THE PHONE BUT THERE NO ANSWERS I'M VERY CONFUSE...
-
Good point. Some of the comments are unintelligible, so I might want to manually edit them so they sound like at least a third grader wrote them.
The good thing about our not-so-bright readers is that they get confused and click lots of ads.
-
Flavour - schmavour!
The only thing in my world that hurts conversions is the inability to communicate! If the bad grammar does that, then I'd find a way to change it to work for you -- rather than against you...
-
Leave the comments as is and do not worry about spelling. Google understands spelling mistakes and I don't think you'll receive any kind of penalty for it.
In fact, if the people commenting are the type of people you want on the site, then the misspellings will work in your favor. That same demographic is typing those same horrific misspelled words into Google search, and you have them right on the page.
-
If the comments are useful (contextually relevant) and don't look spammy, leave them as is.
Will fixing the misspellings hurt you? Most likely no. A Google patent states: "content deemed to be unimportant if updated/changed, such as...comments...may be given relatively little weight or even ignored altogether when determining UA"
However, I would probably leave them as is.
-
I don't think so. I have a feeling the grammar/spelling adds to the flavor of your search results and that they help to bring in more folks who are likely to engage in similar ways. However, if the people commenting are not your target audience, that would be a different issue.