Is A/B Testing allowed by Google?
Am I going to get penalized in Google Search results because of cloaking?
This article guides you through the definition of cloaking, provides some cloaking examples and then instructs you how to avoid Google's cloaking penalties by creating your experiments in Convert Experiments.
What is Cloaking?
Google warns that if it detects cloaking on your site, your site may be removed entirely from the Google index.
So what does cloaking mean? Simply put, cloaking is displaying different content to search engine bots than to human visitors, with the purpose of manipulating your search engine rankings. Most cloaking scripts identify the visitor's IP address and, based on a predefined list of known search engine IPs, guess whether the visitor is a crawler or a human. Other scripts use "traps" to identify robots. Based on who is visiting, the web server is set up to serve the manipulative content to the search engine and the normal-looking content to the human.
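To make the mechanism concrete, here is a minimal sketch of what a typical user-agent-based cloaking script looks like — shown only so you can recognize the pattern, not use it. The file names and bot list are hypothetical; real scripts often also match crawler IP ranges:

```python
# Sketch of a typical cloaking check (this is what Google penalizes).
# BOT_SIGNATURES and the file names below are made-up illustrations.
BOT_SIGNATURES = ("googlebot", "bingbot", "duckduckbot")

def is_search_engine(user_agent: str) -> bool:
    """Crude bot detection based on the User-Agent header alone."""
    ua = user_agent.lower()
    return any(bot in ua for bot in BOT_SIGNATURES)

def choose_page(user_agent: str) -> str:
    # Serving crawlers different content than humans is cloaking.
    if is_search_engine(user_agent):
        return "keyword-stuffed.html"   # shown only to crawlers
    return "normal.html"                # shown to human visitors
```

Any logic of this shape — branching the served content on "is this a crawler?" — is what the techniques below help you avoid.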
Some historical information on cloaking if you are interested:
Cloaking dates back to the early days of search engines (the 1990s), when SEO was all about keyword stuffing. Clever SEOs would display a page full of keywords whenever a search engine bot visited to crawl and index the page, while showing human visitors the default (normal) version. This keyword-stuffing strategy worked wonders, so naturally search engines started devising clever ways to detect and penalize such cloaking. But thanks to Google's PageRank algorithm, keyword stuffing no longer works.
Examples of Cloaking
Some examples of cloaking include:
- Serving a page of HTML text to search engines, while showing a page of images or Flash to users
- Inserting text or keywords into a page only when the User-agent requesting the page is a search engine, not a human visitor
Use Convert Experiments to avoid Cloaking
Convert Experiments is designed specifically to avoid any issues with cloaking, as long as you keep a few things in mind.
Do not target the Googlebot User-Agent
As long as you are not basing your traffic allocation on a search-engine-versus-human distinction (for example, by targeting the Googlebot user-agent in the Audiences feature), you will not be penalized for cloaking. Google doesn't care whether its bot sees one variation or another; it only cares that its bot gets the same user experience as a random human visitor.
Use rel=canonical on alternate URLs
If an A/B test uses multiple URLs, place a rel="canonical" link element on all of your alternate pages, pointing to your original page — for example, `<link rel="canonical" href="https://example.com/original-page/">` in the `<head>` of each variation (the URL here is illustrative). This helps bots indexing your website treat the original page as the primary one. Experiments involving redirects should be fine as long as they don't redirect to unexpected or unrelated content.
Use 302s for redirects
If your experiment redirects visitors from the original URL to a variation URL, use a 302 (temporary) redirect rather than a 301 (permanent) one. The 302 tells search engines that the redirect is temporary and that they should keep the original URL in their index.
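A minimal sketch of what a 302 test redirect looks like at the HTTP level, written as a bare WSGI application (the variation URL is a made-up example):

```python
# Minimal WSGI app that sends a visitor to a test variation with a
# 302 (temporary) redirect. The target URL is a hypothetical example.
def redirect_to_variation(environ, start_response):
    # "302 Found" marks the move as temporary, so crawlers keep the
    # original URL indexed; a 301 would transfer indexing signals
    # to the variation URL, which is not what you want for a test.
    start_response("302 Found",
                   [("Location", "https://example.com/variant-b/")])
    return [b""]
```

In practice your A/B testing tool or web server issues this redirect for you; the point is simply that the status code should be 302, not 301.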
Only run the experiment "as long as necessary"
Run an experiment only if there is sufficient traffic to justify it in the first place, and stop it either immediately after it reaches a conclusion or when it fails to reach one within your estimated timeframe.