Is bad English detected by Google?
-
Hi,
I am based in the UK and in a very competitive market - van leasing - and I am thinking about using an Indian SEO company for my ongoing SEO.
They have sent me some sample articles that they have written for link building, and the English is not good.
Do you think that Google can tell the difference between a well-written article and a poorly written article? Will the fact that the articles are poorly written mean we will lose potential value from the link?
Any input would be much appreciated.
Regards
John J
-
Thanks for the responses. I think I will stay away from the Indian SEO companies.
It really was for link building and not on-site work, but it still does not seem like the best way forward.
Regards
John
-
Matt Cutts has stated in the past that poorly translating pages into another language (i.e., dumping out a raw machine translation) could get you devalued. He's talking primarily about duplicate content, but it seems he's hinting that poor grammar could also play a role in evaluations. At a bare minimum, it could affect your bounce rate, a known SEO factor.
Let's put aside the SEO angle for a second. I'm a customer who just found your site, written by your Indian firm. The grammar looks worse than my daughter's (she's in first grade) and is a chore to read, let alone understand. Am I going to stay and listen to/buy anything else on your site? Nope. I'll go to your competitor or I'll just give up. And you can forget any tertiary SEO benefit of my linking to your article, except to ridicule it. From a business standpoint it doesn't make sense. It's sloppy, and people hate sloppy (unless you're selling really good hamburgers, which you're not).
If you still don't think it's important, check out Engrish. I hope you don't wind up there!
-
I agree w/ @kevinb. Google and Bing track signals like high user engagement, low bounce rates, etc.
If these articles aren't useful to users, Google will notice.
-
Awkward syntax and poor or incorrect use of idiom erect roadblocks to the flow of a narrative, degrading the user experience.
It's been my experience that when a writer attempts to replicate a particular cultural context that is not natural to him or her, the user will recognize its artificiality—even if only on a subconscious level. An analogy would be a motion picture with dubbed—rather than subtitled—dialog: There's something that's just off.
According to Google, user experience trumps all, doesn't it? (See, I used an idiom right there!) So, for what it's worth, my advice would be to stay away.
-
Even if Google can't detect poor English now, it will be working towards it.
Surely your money is better spent elsewhere. Invest in the long term.
If the articles they are writing for you are low quality, you can bet the sites they are able to get them onto are low quality too.
Keep away from them and work on quality. Nothing is quick and easy and that's how it should be. If people could so easily buy their way to the top, the search results wouldn't be worth using.
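To that point: Google's actual quality signals aren't public, but readability is very much measurable by software, which is why "it will be working towards it" is a safe bet. Here is a minimal sketch (an illustration, not anything Google has published) using the classic Flesch reading-ease formula; the syllable counter is a rough heuristic, and real systems would use far more sophisticated language models.

```python
import re

def count_syllables(word):
    # Rough heuristic: count groups of consecutive vowels, with a
    # common-case correction for a silent trailing "e".
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_reading_ease(text):
    # Flesch reading ease:
    #   206.835 - 1.015 * (words/sentences) - 84.6 * (syllables/words)
    # Higher scores mean easier text; ~60-70 is plain English.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

# Short, simple sentences score high on the scale.
print(flesch_reading_ease("We lease vans. Our prices are low. Call us today."))
```

This is a crude 1940s-era formula, yet even it separates clear copy from tortured prose; a search engine with web-scale language data can do far better, which is why betting on bad English going unnoticed is a losing bet.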
-
Do yourself a favour: stay away from this outdated and damaging technique!
Create some amazing content on your own site/blog. Examples could be how to reduce insurance costs when leasing a van, or the best vans to hire for home removals, etc.
Make your content the go-to source for that particular problem, then start contacting webmasters of similar (non-competitor) sites to share/link so their readers benefit!
The game has changed a lot from when you could buy 50 articles from Indian SEO firms for less than £20 and churn them out for links from low-quality sites!
-
Wesley & Jesse hit the nail on the head. Don't do it. Even if Google can't detect it directly, it can spot it indirectly by means of user experience signals.
Is price the only reason you are using this team?
-
I'm not sure if Google is able to tell the difference between good and bad English at this moment.
But I do know that well-written content is one of the criteria Google wants a website to meet in order to rank, as described in this document about Google Panda: http://googlewebmastercentral.blogspot.nl/2011/05/more-guidance-on-building-high-quality.html
This method is not permitted, though, and while you may benefit from it in the short term, I can tell you that it won't be long before you get a penalty for this technique. Link building is not about buying links in any form. It's about creating awesome content that people want to share just because they think it is awesome.
Of course, reaching out to people is also part of the process. But the key is always to create a site that people **want** to link to, either because it is awesome or because your website offers great value to their visitors, so their website gets better from linking to it.
Always keep this in mind.
-
What Google definitely does recognize is exactly the kind of service you are considering. Google's webspam team developed Penguin specifically to target sites that have subbed out SEO to black-hat organizations. What you are describing is exactly what they are targeting.
Don't do it! You WILL be sorry.