Excluding Googlebot From AB Test - Acceptable Sample Size To Negate Cloaking Risk?
-
My company uses a proprietary AB testing platform. We are testing an entirely new experience on our product pages, but it is not optimized for SEO, so the testing framework will not show the challenger recipe to search bots. With that in mind, to avoid any risk of cloaking, what is an acceptable sample size (or percentage) of traffic to funnel into this test?
-
Here are Google's official recommendations for website testing. According to them, no amount of cloaking is okay, so there is no sample size small enough to make it safe; try one of the other methods they suggest instead.
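For reference, Google's testing guidance comes down to a few points: don't decide what to serve based on user agent or IP (that is cloaking at any traffic percentage), point each variant URL back at the original with rel="canonical", use temporary 302 redirects rather than 301s when sending visitors to a variant, and run the experiment only as long as you need the data. Below is a minimal sketch of that pattern, assuming a Flask app; the routes, cookie name, and VARIANT_SHARE value are hypothetical, not part of any real platform.

```python
# Minimal sketch of a non-cloaking A/B split: bucket visitors randomly
# (never by user agent), 302 the variant bucket to the variant URL, and
# have the variant page declare the original as canonical.
import random
from flask import Flask, make_response, redirect, request

app = Flask(__name__)
VARIANT_SHARE = 0.10  # fraction of ALL visitors (bots included) in the test

@app.route("/product/<product_id>")
def product(product_id):
    bucket = request.cookies.get("ab_bucket")
    if bucket is None:
        # Assign once and persist, so a visitor sees a consistent experience.
        bucket = "variant" if random.random() < VARIANT_SHARE else "control"
    if bucket == "variant":
        # Temporary (302) redirect signals the variant URL is not permanent.
        resp = make_response(redirect(f"/product/{product_id}/variant", code=302))
    else:
        resp = make_response(render_control(product_id))
    resp.set_cookie("ab_bucket", bucket)
    return resp

@app.route("/product/<product_id>/variant")
def product_variant(product_id):
    # The variant page points rel="canonical" at the original URL so it does
    # not compete with, or replace, the control page in the index.
    canonical = f'<link rel="canonical" href="https://www.example.com/product/{product_id}">'
    return f"<html><head>{canonical}</head><body>New experience for {product_id}</body></html>"

def render_control(product_id):
    return f"<html><body>Current experience for {product_id}</body></html>"
```

The key design point is that assignment is random rather than user-agent based: Googlebot will occasionally land in the variant bucket, and that is exactly what keeps the test from being cloaking.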
Related Questions
-
Inbound links with malicious anchor text. Negative seo attack
Hi, what should I do with more than 300 links whose malicious anchor text has nothing to do with my content? I have been disavowing those links for the last 5 years. Some of them point to URLs that were changed more than 8 years ago. How can I block this malicious behavior? Thanks in advance.
White Hat / Black Hat SEO | Arlinaite470 -
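On the disavow mechanics mentioned in the question above: the file Google Search Console accepts is plain text, one entry per line, where a domain: prefix disavows every link from that host, a bare URL disavows a single page, and lines starting with # are comments. Here is a minimal sketch of assembling such a file; the domain names are placeholders and do not refer to the asker's actual links.

```python
# Sketch of generating a disavow file for upload to Google Search Console.
# All domains and URLs below are placeholders.
bad_domains = ["spammy-anchors-example.com", "link-farm-example.net"]
bad_urls = ["https://another-example.org/malicious-anchor-page.html"]

lines = ["# Links with malicious anchor text (placeholder comment)"]
lines += [f"domain:{d}" for d in bad_domains]  # disavow every link from the host
lines += bad_urls                              # disavow individual URLs

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```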
Whitehat site suffering from drastic & negative Keyword/Phrase Shifts out of the blue!
I am the developer for a fairly active website in the education sector that offers around 30 courses, publishes to its blog a few times a week, and maintains social profiles. The blog doesn't have comments enabled, and the typical visitor is looking for lessons or a course. Over the past year we have put active development into keeping the site up to date, fast, and in line with modern best practices: SSL certificates, quality content, relevant and high-powered backlinks, etc. Around a month ago we were hit by quite a large drop in our ranked keywords/phrases, which shocked us somewhat; we attributed it to Google's algorithm change muddying the waters, as things did settle a couple of weeks later. However, this week we have been hit again by another large change, dropping almost 100 keywords, some by very large positions. My question is quite simple (I wish): what gives? I don't expect to see drops this large when we haven't done anything negative, and I'm not sure it's an algorithm change, as my other clients tracked in Moz don't seem to have suffered, so it's either isolated to this target area or it's an issue with something occurring to or on the site.
White Hat / Black Hat SEO | snowflake740 -
What is your opinion on link farm risks and how do I explain this to a client?
Hi All, I have a new monthly retainer client who still has a $600/month "linkbuilding" contract with a large national advertising/directory organization (I won't name them, but I'm sure you can guess). I just got a "linking" report and it's filled with garbage: comment spam (on Huffington Post), fake G+ account links, and links from multiple sites with a Domain Authority of 1 (http://encirclehealth.net/, http://livingstreamhealth.co/, etc.) that have no "about" sections, no ads, no products, just blatant link farms. I've told the client that these links pose a risk with Google, that he should get them removed, and that he should request a refund. Their rep is pushing back hard and saying there's absolutely nothing to worry about. Am I overestimating how bad/dangerous these are? How would you explain the risks to the client? I've already shared a report and my recommendations with the client, but am really just looking for some affirmation of my position that these MUST be removed. Any advice much appreciated!
White Hat / Black Hat SEO | PlusROI0 -
Would reviews being served to a search engine user agent through a noscript tag (but not shown for other user types) be considered cloaking?
This one is tough, and I've asked it once here, http://www.quora.com/Search-Engine-Optimization-SEO/Is-having-rich-snippets-placed-below-a-review-that-is-pulled-via-javascript-considered-bad-grey-hat-SEO, but I feel the response sided with the company. As an SEO or digital marketer, it seems that if we are pulling in our reviews via an iframe for our users, but serving them through a noscript tag when the user agent is a search engine, this could be considered cloaking. I understand that the "intent" may be to show the bots the same thing the user sees, but if you look at the view source, you'll never see the reviews, because they would only be delivered to the search engine bot. What do you think?
White Hat / Black Hat SEO | eTundra0 -
Creating duplicate site for testing purpose. Can it hurt original site
Hello, we are soon going to upgrade the CMS to the latest version along with new functionality; the process may take anywhere from 4 to 6 weeks, and we need to work on a live server. What we have planned: take an exact replica of the site and move it to a test domain, but on a live server; block Google, Bing and Yahoo in robots.txt (User-agent: Google / Disallow: /, User-agent: Bing / Disallow: /, User-agent: Yahoo / Disallow: /); upgrade the CMS and add functionality; test the entire structure and check URLs using Screaming Frog or Xenu; then move on to configuring the site on the original domain. The upgrade and new tools may take 1 to 1.5 months. The concern is that, despite blocking Google, Bing and Yahoo through the user-agent disallow rules, the URLs could still be crawled by the search engines. If so, could it hurt the original site, since the copy will read as an entire duplicate? Or is there an alternative way around this? Many thanks.
White Hat / Black Hat SEO | Modi1 -
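Two notes on the staging-site question above. First, the user-agent tokens the major crawlers match in robots.txt are Googlebot, Bingbot and Slurp (Yahoo), not "Google", "Bing" and "Yahoo". Second, a Disallow only stops crawling; a URL that picks up links elsewhere can still end up indexed without its content, so a test copy is generally safer behind HTTP authentication or a noindex X-Robots-Tag header (which only works if crawlers are allowed to fetch the pages). A minimal sketch of the header approach, assuming the test domain is served by a Flask app; the app and route are hypothetical.

```python
# Sketch: send "X-Robots-Tag: noindex, nofollow" on every response from the
# staging copy. This only works if crawlers can fetch the pages, so do not
# also block them in robots.txt if you rely on this header.
from flask import Flask

app = Flask(__name__)

@app.after_request
def block_indexing(response):
    response.headers["X-Robots-Tag"] = "noindex, nofollow"
    return response

@app.route("/")
def home():
    return "Staging copy of the site (placeholder)."
```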
Does posting a source to the original content avoid duplicate content risk?
A site I work with allows registered users to post blog posts (longer articles). Often, the blog posts have been published earlier on the writer's own blog. Is posting a link to the original source a sufficient precaution against getting dinged for duplicate content? Thanks!
White Hat / Black Hat SEO | 945010 -
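A link back to the original helps readers, but the more direct signal for syndicated posts is usually a cross-domain rel="canonical" on the republished copy pointing at the writer's original URL, so ranking credit consolidates there. A minimal sketch, assuming posts are stored with their original URL and rendered through a template; the data structure, field names and URLs are hypothetical.

```python
# Sketch: emit a cross-domain canonical on a republished post so search
# engines attribute the content to the writer's original article.
from flask import Flask, render_template_string

app = Flask(__name__)

POSTS = {
    "guest-post-slug": {
        "title": "Guest post (placeholder)",
        "body": "Republished article body...",
        "original_url": "https://writers-own-blog.example.com/original-post",
    }
}

TEMPLATE = """<html>
  <head>
    <title>{{ post.title }}</title>
    {% if post.original_url %}
    <link rel="canonical" href="{{ post.original_url }}">
    {% endif %}
  </head>
  <body>{{ post.body }}</body>
</html>"""

@app.route("/blog/<slug>")
def blog_post(slug):
    return render_template_string(TEMPLATE, post=POSTS[slug])
```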
Negative SEO - Case Studies Prove Results. De-rank your competitors
Reading these two articles made me feel sick. People are actually offering a service to de-rank a website. I could have sworn I heard Matt Cutts say this was not possible, but the results are in. This really opens up a whole new can of worms for Google. http://trafficplanet.com/topic/2369-case-study-negative-seo-results/ http://trafficplanet.com/topic/2372-successful-negative-seo-case-study/ This is only going to get worse, as news like this will spread like wildfire. In one sense, it's good these people have done this to prove it to Google; it's just a pity they did it on real businesses that rely on traffic.
White Hat / Black Hat SEO | dean19860 -
Showing pre-loaded content cloaking?
Hi everyone, another quick question. We have a number of different resources available for our users that load dynamically as the user scrolls down the page (like Facebook's Timeline), with the aim of improving page load time. Would it be considered cloaking if we had Googlebot index a version of the page containing all the content a user would see if he/she scrolled to the bottom?
White Hat / Black Hat SEO | CuriosityMedia0