Is using Mechanical Turk to increase our number of Google +1's Black Hat?
-
Or will it affect our ranking positively or negatively at all?
-
And it's not allowed by Mechanical Turk, either.
https://www.mturk.com/mturk/help?helpPage=policies
What are some specific examples of HITs that violate Amazon Mechanical Turk policies? [excerpted]
- HITs that directly or indirectly promote a site, service, or opinion
- HITs that violate the terms and conditions of an activity or website (for instance asking Workers to vote for something)
- HITs that generate "referred" site visits or click-through traffic
- HITs that ask Workers to take action to manipulate a website's behavior or results
-
Black hat in my book... I'd be interested to hear how you get on if you do try this strategy, though.
-
Um, short answer... yes. A darker shade of gray.
-
Better to produce great content on the site so that people +1 you naturally. If the site isn't up to scratch, then you won't get many conversions anyway.
-
If black hat is defined as using techniques to fool the search engines, then yes. If not, it's at least grey.
Related Questions
-
Cloaking for better user experience and deeper indexing - grey or black?
I'm working on a directory that has around 800 image-rich results in the top-level view. This will likely grow over time, so it needs to support thousands. The main issue is that it is built in AJAX, so paginated pages are dynamically generated and look like duplicate content to search engines. If we limit the results, then not all of the individual directory listing pages can be found. I have an idea that serves users and search engines what they want but uses cloaking. Is it grey or black? I've read http://moz.com/blog/white-hat-cloaking-it-exists-its-permitted-its-useful and none of the examples quite apply. To allow users to browse through the results (without a single page that has a slow load time), we include pagination links, but these are not shown to search engines. This is a positive user experience. For search engines, we display all results on a single page (since there is no limit on the number of links so long as they are not spammy). This requires cloaking, but it is ultimately serving the same content in slightly different ways.
1. Where on the scale of white to black is this?
2. Would you do this for a client's site?
3. Would you do it for your own site?
White Hat / Black Hat SEO | | ServiceCrowd_AU
-
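For what it's worth, the scheme the question above describes boils down to branching on the request's User-Agent header. Here is a minimal, hypothetical sketch of that mechanism (the crawler token list and function name are mine, not from the question); note that serving different markup by user agent is precisely what Google's guidelines treat as cloaking, so this illustrates the idea being debated, not a recommendation:

```python
# Hypothetical sketch of the user-agent branching the questioner
# describes: crawlers get one full page of results, humans get
# the AJAX-paginated view. This only illustrates the mechanism
# under discussion.

# Token list is an assumption, not an official registry of crawlers.
CRAWLER_TOKENS = ("googlebot", "bingbot", "slurp")

def select_variant(user_agent: str) -> str:
    """Return which page variant the described scheme would serve."""
    ua = (user_agent or "").lower()
    if any(token in ua for token in CRAWLER_TOKENS):
        return "all-results"  # one page linking every listing
    return "paginated"        # paginated AJAX view for visitors
```

In practice, a Googlebot user-agent string would get `"all-results"` while an ordinary browser string would get `"paginated"`, which is the same user-agent split the IT team in the next question proposes.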
Is there such thing as white hat cloaking?
We are near the end of a site redesign, only to find out it's built in JavaScript and not search-engine friendly. Our IT team's fix for this is to show crawlable content to Googlebot and others based on the user agent. I told them this is cloaking and I'm not comfortable with it. They said that, after doing research, if the content is pretty much the same, it is an acceptable way to cloak. About 90% of the content will be the same between the "regular user" version and the content served to Googlebot. Does anyone have any experience with this? Are there any recent articles or best practices on this? Thanks!
White Hat / Black Hat SEO | | CHECOM
-
Will my association's network of sites get penalized for link farming?
Before beginning, I found these similar topics here: http://www.seomoz.org/q/multiple-domains-on-same-ip-address-same-niche-but-different-locations and http://www.seomoz.org/q/multiple-domains-on-1-ip-address
We manage over two dozen dental sites that are individually owned throughout the US. All these dentists are in a dental association which we also run and are featured on (http://www.acedentalresource.com/). Part of the dental association's core is sharing information to make them better dentists and to help their patients, which, in addition to their education, is why they are considered to be some of the best dentists in the world. As such, we build links from what we consider to be valuable content between the sites. Some sites are on different IPs and C-blocks, some are not. Given that each site promotes only the dentist at that brick-and-mortar location but also has "follow" links to other dentists' content in the network, we fear that we are in the grey area of link-building practices. Questions are:
1. Is there an effective way to utilize the power of the network if quality content is being shared?
2. What risks are we facing given our network?
3. Should each site be on a different IP?
4. Would having some of our sites on different servers make our backlinks more valuable than having all of our sites under the same server?
5. If it is decided that having unique IPs is best practice, would it be obvious that we made the switch?
Keep in mind that ALL sites are involved in the association, so naturally they would be linking to each other and to the main resource website mentioned above. Thanks for your input!
White Hat / Black Hat SEO | | DigitalElevator
-
How to get rid of black hat links?
I have recently discovered that one of my clients has either been sabotaged or has done this himself. In the case that he didn't do anything, how do you go about getting rid of bad links? There are now over 1,000 bad links pointing to his site. Do I report them as spam, or what is the best way to fix this?
White Hat / Black Hat SEO | | StrategicEdgePartners
-
Is Google stupid?
Why does buying links still work? I don't mean approaching an individual webmaster and cutting a deal, that seems to be nearly impossible to detect. But the huge link brokers, like Text Link Ads, Build my Rank or Linkvine, Google has to be aware of them, right? Can't they just create accounts to see the whole network, and ban the sites? Why wouldn't they just do that?
White Hat / Black Hat SEO | | menachemp
-
Why is Google not punishing paid links as it says it will?
I've recently started working with a travel company, and I'm finding the general link-building side of the business quite difficult. I had a call from an SEO firm the other day offering their services, stating that they had worked with a competitor of ours and delivered some very good results. I checked the competitor's rankings, PR, and link profile, and indeed the results were quite impressive. However, the link profile pointed to one thing that was incredibly obvious: they had purchased a large number of sidebar text links from powerful blogs in the travel sector. It's painfully obvious what has happened, yet they still rank very highly for a lot of key terms. Why doesn't Google do something about this? They aren't the only company in this sector doing this, but it just seems pointless for white hats trying to do things properly when those with the dollar in their pockets just buy success in the SERPs. Thanks
White Hat / Black Hat SEO | | neilpage123
-
Is this proposal white hat or likely to harm me in the long run?
Hi, I'm considering outsourcing some SEO to a company I got a first-month trial sweetener deal with. I've not done this before and am a little unsure about what they propose doing; I'm not sure if I'm being a bit paranoid or too controlling. Details of what they propose: send them 10 keywords we're interested in ranking for. Work they will perform:
-Submit site to all major search engines
-Submit 20 social bookmarks for the site
-Produce 1 article + 19 spun variations of the article, submitted to 30 directory sites and 10 press release sites and distribution networks
-Business submitted to 5 business directories
-5 social networks created
-Work and ranking report highlighting what has been done at the end of the month
Most of the stuff I've done already or can do myself. The elements that make me a bit suspicious are: 1 article plus 19 spun variations? 5 social networks created? What does that even mean? I did get this for about £20 for the first month with no commitment afterwards, so I am tempted to let them try. But should I be a bit wary that it might do more harm than good in the long run? Any advice/opinions would be much appreciated.
White Hat / Black Hat SEO | | shabbychicoriginals
-
Google Penalising Pages?
We run an e-commerce website that has been online since 2004. For some of our older brands we are getting good rankings for the brand category pages and also for their model numbers. For newer brands, the category pages aren't getting rankings and neither are the products; even when we search for specific unique content on a page, Google does not return results containing our pages. The real kicker is that the pages are clearly indexed: searching for the page itself by URL, or restricting the same search using the site: modifier, the page appears straight away! Sometimes the home page will appear on page 3 or 4 of the rankings for a keyword even though there is a much more relevant page in Google's index from our site - AND THEY KNOW IT, as once again restricting the same keywords with a site: modifier shows the obviously relevant page first, and loads of other pages before, say, the home page or the page that actually shows. This leads me to the conclusion that something on certain pages is flagging up Google's algorithms or, worse, that there has been manual intervention by somebody. There are literally thousands of products affected. We worry about duplicate content, but we have rich product reviews and videos all over these pages that aren't showing anywhere; they look very much singled out. Has anybody experienced a situation like this before and managed to turn it around? Link - removed Try a page in, for instance, the D&G section and you will find it easily on Google most of the time. Try a page in the Diesel section and you probably won't; applying -removed and you will. Thanks, Scott
White Hat / Black Hat SEO | | scottlucas