Excluding Googlebot From A/B Test - Acceptable Sample Size To Negate Cloaking Risk?
-
My company uses a proprietary A/B testing platform. We are testing an entirely new experience on our product pages, but it is not optimized for SEO, so the testing framework will not show the challenger recipe to search bots. That said, to avoid any risk of cloaking, what is an acceptable sample size (or percentage) of traffic to funnel into this test?
-
Here are Google's official recommendations for website testing. According to them, no amount of cloaking is okay. Try using one of the other methods they suggest.
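Google's guidance boils down to not varying what you serve based on whether the visitor is Googlebot, so the safe pattern is to bucket every visitor (crawlers included) by a stable identifier rather than excluding bots. A minimal sketch of deterministic bucketing, assuming a cookie- or login-based visitor ID (the function name, bucket names, and 10% split are illustrative, not part of any platform's API):

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str, challenger_pct: float = 0.10) -> str:
    """Deterministically bucket a visitor into 'control' or 'challenger'.

    The user agent is deliberately not an input: Googlebot gets bucketed
    exactly like any human visitor, which is what keeps the test on the
    right side of Google's cloaking guidelines.
    """
    # Hash the visitor id together with the experiment name so the same
    # visitor can land in different buckets across different experiments.
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    # Map the first 8 hex digits to a score in [0, 1).
    score = int(digest[:8], 16) / 0x100000000
    return "challenger" if score < challenger_pct else "control"

# The same visitor always gets the same variant on repeat visits.
print(assign_variant("visitor-123", "new-pdp-experience"))
```

Because the assignment is a pure function of the visitor ID, a crawler that carries no cookie can simply be given the control recipe by default without any user-agent sniffing.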
Related Questions
-
Wanna see Negative SEO?
One of my clients got hit with negative SEO in the past few days. Check it out in Ahrefs. The site is www.thesandiegocriminallawyer.com. Any advice on what, if anything, I should do? A Google disavow? Thanks.
White Hat / Black Hat SEO | | mrodriguez14401 -
Soliciting Product Reviews with Free Samples?
I have been looking at my competitors' links and I have discovered that a competitor with top positions in the SERPs has been gaining links by offering free product samples to bloggers in exchange for links back to their site within the review. My question is, does Google frown on this? Can it invoke a penalty? To me it seems tantamount to buying links, yet his results speak for themselves. It is something I intend to start doing myself if I can be sure it won't result in a penalty. Thanks.
White Hat / Black Hat SEO | | RocketBanner0 -
Googlebot stopped crawling
Hi All, One of my websites stopped showing in the SERPs. After analysing it in Webmaster Tools, I found that Googlebot is not able to crawl it, although it was working fine a few days back. I tried to investigate a penalty, but found no notification. I checked robots.txt for nofollow directives etc., but all seems to be OK. I resubmitted the sitemap in Webmaster Tools; it crawled 250 of the 500 pages, but the site is still not showing in Google's SERPs (in Bing it is fine). Please suggest the best possible solutions to try. Thanks
White Hat / Black Hat SEO | | 1akal0 -
Negative SEO from Spammers Killing Client Rankings
Hi - I have identified a client website which was: a) hacked, with several fraudulent pages added, e.g. www.xxx.com/images/uggaustralia.html, which 301 redirect to other fraudulent websites; and b) hit with an auto-generated backlink campaign (over 12k backlinks at present) with anchor text targeted at "cheap ugg boots", "ugg sale", etc. I've removed the dodgy redirect pages and also undertaken a link audit using Google WMT, OSE and Majestic SEO, and have disavowed all the spammy links at domain level. Consequently my client has dropped from the top three for the key phrase to #9. Google WMT now shows "ugg boots uk", "ugg boots sale" etc. as some of the most popular anchor text for the site, even though it's blatantly obvious that the site has nothing to do with Ugg boots. No manual webspam penalties are in place; however, the auto-generated anchor-text campaign is still ongoing and is generating more spammy links to non-existent web pages, which Google still appears to be picking up. Question is: how long do you reckon it will take for the links to disappear, and is there anything I can do to speed Google along, as this issue is not of my making? p.s. For the record, I've found at least 500 sites that have been targeted by this same campaign.
White Hat / Black Hat SEO | | Door4seo0 -
Separate Servers for Humans vs. Bots with Same Content Considered Cloaking?
Hi, We are considering using separate servers for when a Bot vs. a Human lands on our site to prevent overloading our servers. Just wondering if this is considered cloaking if the content remains exactly the same to both the Bot & Human, but on different servers. And if this isn't considered cloaking, will this affect the way our site is crawled? Or hurt rankings? Thanks
White Hat / Black Hat SEO | | Desiree-CP0 -
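Routing crawlers and humans to different server pools while serving byte-identical content is an infrastructure decision, not a content decision, which is the distinction that matters for cloaking. A sketch of the routing logic (the pool hostnames and bot list are hypothetical, not a real configuration):

```python
# Substrings that identify the major crawlers we want on the dedicated pool.
BOT_SIGNATURES = ("googlebot", "bingbot", "yandexbot", "duckduckbot")

def pick_backend(user_agent: str) -> str:
    """Route known crawlers to a dedicated pool; everyone else to the default.

    Both pools must serve exactly the same content -- only the hardware
    differs, which is what keeps this load balancing rather than cloaking.
    """
    ua = user_agent.lower()
    if any(sig in ua for sig in BOT_SIGNATURES):
        return "bot-pool.internal"
    return "human-pool.internal"

print(pick_backend("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))
```

A real deployment would also verify Googlebot via reverse DNS rather than trusting the user-agent string, and would monitor both pools for response-time parity, since a markedly slower bot pool could itself affect crawling.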
Is there such thing as white hat cloaking?
We are near the end of a site redesign and have come to find out it's built in JavaScript and not engine-friendly. Our IT team's fix is to show crawlable content to Googlebot and other search bots based on user agent. I told them this is cloaking and that I'm not comfortable with it. They said that, after doing research, if the content is pretty much the same, it is an acceptable way to cloak. About 90% of the content will be the same between the "regular user" version and the content served to Googlebot. Does anyone have experience with this? Are there any recent articles or best practices on it? Thanks!
White Hat / Black Hat SEO | | CHECOM0 -
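The "90% the same" claim above is exactly the thing worth measuring before shipping a setup like this: the defensibility of serving a prerendered snapshot to bots rests on content parity with what users see. A rough parity spot-check, assuming you can capture both HTML versions (a real audit would use a proper HTML parser and compare the rendered DOM, not regex-stripped text):

```python
import re

def visible_text(html: str) -> str:
    # Crude text extraction: strip tags, collapse whitespace.
    return re.sub(r"\s+", " ", re.sub(r"<[^>]+>", " ", html)).strip()

def content_parity(rendered_for_users: str, served_to_bots: str) -> float:
    """Jaccard overlap of the word sets in the two versions (1.0 = identical)."""
    a = set(visible_text(rendered_for_users).lower().split())
    b = set(visible_text(served_to_bots).lower().split())
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

print(content_parity("<p>hello world</p>", "<div>hello world</div>"))
```

Running a check like this across a sample of URLs gives you a concrete number to put against the IT team's "pretty much the same" claim, instead of arguing in the abstract.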
Negative SEO on my website with paid +1's
Hi guys, I need a piece of advice. Some scumbag played me quite well with paid +1's on my two articles and now I'm in a problem.
http://sr.stateofseo.com/seo-vesti/google-implementacija-ssl-protokola-not-provided-problem/
http://sr.stateofseo.com/napredni-seo/najnovije-promene-google-panda-algoritma/
They are both translated articles (written originally by me on the same website). I noticed those +1's (476 across both articles) when my website received a penalty for the "SEO" keyword on Google.rs (Serbian Google), and I'm now on the 11th page. Other keywords still rank just fine. Not cool, right? Now, I think there could be two solutions: the first is to remove my internal link that points to my homepage with the "SEO" anchor, and hope for the best; the second is to completely remove those two articles and wait for Google to reindex the website and hopefully lift the penalty. Do you guys have any other ideas how I can fix this, remove/disavow those +1's, or somehow explain to the Google crew/algorithm that I'm just a humble SEO without any evil thoughts? 🙂 Thank you in advance.
White Hat / Black Hat SEO | | Fastbridge0 -
Disqus integration and cloaking
Hey everyone, I have a fairly specific question on cloaking and whether our integration with Disqus might be viewed as cloaking. Here is the setup. We have a site that runs on Drupal and we would like to convert comment handling to Disqus for ease of use. However, when JavaScript is disabled, the nice comment system and all of the comments from Disqus disappear. This obviously isn't good for SEO; however, the user experience with Disqus is way better than with the native comment system. So here is how we are addressing the problem. With Drupal we can sync comments between the native comment system and Disqus. When a user has JavaScript enabled, the containing div for the native comment system is set to display:none, hiding the submission form and all of its content, and the comments are instead displayed through the Disqus interface. However, when JavaScript is not enabled, the native comment form and the comments are available to the user. Could this be considered cloaking by Google? I know they do not like hidden divs, but it should be almost exactly the same content being displayed to the user (depending on when the last sync was run). Thanks for your thoughts, and if anyone has familiarity with a better way to integrate Drupal and Disqus, I am all ears. Josh
White Hat / Black Hat SEO | | prima-2535090
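Since the defense in the setup above is that the hidden native div and the Disqus widget hold "almost exactly the same content", it is worth verifying that after each sync rather than assuming it. A sketch of a drift check, assuming you can pull the comment bodies from both systems as plain strings (the function and parameter names are illustrative, not part of the Drupal or Disqus APIs):

```python
from collections import Counter

def normalize(comment: str) -> str:
    # Collapse whitespace and case so cosmetic differences don't count as drift.
    return " ".join(comment.split()).lower()

def comments_in_sync(native_comments, disqus_comments) -> bool:
    """True when both systems hold the same multiset of comment bodies.

    Running this after each sync job gives evidence that the hidden
    native div and the Disqus widget really do show the same content.
    """
    return Counter(map(normalize, native_comments)) == Counter(map(normalize, disqus_comments))

print(comments_in_sync(["Great post!"], ["great  post!"]))  # True
```

If the check ever fails, the safest move is to re-run the sync before the page is crawled again, so the hidden content never diverges from what users see.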