Is bad English detected by Google?
-
Hi,
I am based in the UK and in a very competitive market - van leasing - and I am thinking about using an Indian SEO company for my ongoing SEO.
They have sent me some sample articles that they have written for link building, and the English is not good.
Do you think that Google can tell the difference between a well-written article and a poorly written article? Will the fact that the articles are poorly written mean we lose potential value from the link?
Any input would be much appreciated.
Regards
John J
-
Thanks for the responses. I think I will stay away from the Indian SEO companies.
It really was for link building and not onsite stuff but it still does not seem like the best way forward.
Regards
John
-
Matt Cutts has stated in the past that poorly translating pages into another language (i.e. dumping out a raw translation) could get you devalued. Now, he's talking primarily about duplicate content but it seems that he's hinting that poor grammar could also play a role in evaluations. At the bare minimum, it could affect your bounce rate, a known SEO factor.
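None of us knows Google's internals, but the underlying idea here — that text quality can be scored mechanically — is easy to illustrate. Below is a toy Python sketch (my own illustration, nothing to do with Google's actual systems) using two crude signals: average sentence length and type-token ratio, the latter of which tends to collapse on spun or keyword-stuffed copy.

```python
# Toy sketch (NOT Google's method) of scoring text quality algorithmically.
# Real systems use far more sophisticated language models; this only shows
# that the idea is mechanically simple.
import re

def quality_signals(text):
    """Return a few crude quality signals for a block of English text."""
    words = re.findall(r"[A-Za-z']+", text.lower())
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    if not words or not sentences:
        return {"avg_sentence_len": 0.0, "type_token_ratio": 0.0}
    return {
        # Very long run-ons and very short fragments both read badly.
        "avg_sentence_len": len(words) / len(sentences),
        # Spun or keyword-stuffed text repeats the same words heavily,
        # so its ratio of unique words to total words drops.
        "type_token_ratio": len(set(words)) / len(words),
    }

good = "The quick brown fox jumps over the lazy dog. It runs fast."
spun = "best van lease best van lease best van lease best van lease"
print(quality_signals(good)["type_token_ratio"]
      > quality_signals(spun)["type_token_ratio"])  # → True
```

The point is only that degenerate writing leaves measurable fingerprints, so it is plausible a search engine uses signals like these among many others.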
Let's put aside the SEO role for a second. I'm a customer who just found your site, written by your Indian firm. The grammar looks worse than my daughter's (she's in first grade) and is a chore to read, let alone understand. Am I going to stay and listen to/buy anything else on your site? Nope. I'll go to your competitor or I'll just give up. And you can forget any tertiary SEO benefit of my linking to your article, except to ridicule it. From a business standpoint it doesn't make sense. It's sloppy, and people hate sloppy (unless you're selling really good hamburgers, which you're not).
If you still don't think it's important, check out Engrish. I hope you don't wind up there!
-
I agree w/ @kevinb. Google & Bing track results like high user engagement, low bounce rates, etc. Check out the infographic below.
If these articles aren't useful to users, Google will notice.
-
Awkward syntax and poor or incorrect use of idiom erect roadblocks to the flow of a narrative, degrading the user experience.
It's been my experience that when a writer attempts to replicate a particular cultural context that is not natural to him or her, the user will recognize its artificiality—even if only on a subconscious level. An analogy would be a motion picture with dubbed—rather than subtitled—dialog: There's something that's just off.
According to Google, user experience trumps everything, doesn't it? (See, I used an idiom right there!) So, for what it's worth, my advice would be to stay away.
-
Even if Google can't detect poor English now, it will be working towards it.
Surely your money is better spent elsewhere. Invest in the long term.
If the articles they are writing for you are low quality, you can bet the sites they are able to get them on are low quality too.
Keep away from them and work on quality. Nothing is quick and easy and that's how it should be. If people could so easily buy their way to the top, the search results wouldn't be worth using.
-
Do yourself a favour and stay away from this outdated and damaging technique!
Create some amazing content on your own site/blog. Examples could be how to reduce insurance costs when leasing a van, or the best vans to hire for home removals, etc.
Make your content the go-to source for that particular problem, then start contacting webmasters of similar (non-competitor) sites to share/link so their readers benefit!
The game has changed a lot from the days when you could buy 50 articles from Indian SEO firms for less than £20 and churn them out for links from low-quality sites!
-
Wesley & Jesse hit the nail on the head. Don't do it. Even if Google can't detect it directly, they can spot it indirectly by means of user experience.
Is price the only reason you are using this team?
-
I'm not sure if Google is able to tell the difference between good and bad English at this moment.
But I do know that well-written content is one of the criteria Google wants a website to meet in order to rank, as described in this document about Google Panda: http://googlewebmastercentral.blogspot.nl/2011/05/more-guidance-on-building-high-quality.html
This method is not permitted, and while you might benefit from it in the short term, I can tell you it won't be long before you get a penalty for this technique. Link building is not about buying links in any form. It's about creating awesome content that people want to share simply because they think it is awesome.
Of course, reaching out to people is also part of the process. But the key is always to create a site that people **want** to link to, either because it is awesome or because your site offers such value to their visitors that linking to it makes their own site better.
Always keep this in mind.
-
What Google definitely does recognize is the exact services you are considering. Google's webspam team developed Penguin specifically to target sites that have subbed out SEO to blackhat organizations. What you are describing is exactly what they are targeting.
Don't do it! You WILL be sorry.