Execution Time: 30min-2h
Ideal Outcome: You’ve planned, executed, and analyzed an A/B test on your website and can determine, based on actual data, whether a variant should be permanently implemented on your website or not.
Pre-requisites or requirements:
  • You need to have Google Optimize set up on your website.
  • You should have Google Analytics conversion goals set up on your website.
Why this is important: Without a proper A/B testing process in place, you cannot confidently make business or UI/UX decisions.
Where this is done: In Google Optimize;
When this is done: Every time you want to test a new hypothesis on your website;
Who does this: The person responsible for Conversion Rate Optimization;

Included Resource in this SOP

ClickMinded - A/B Test Documentation Spreadsheet

Determining what you are going to test

  1. Select the element on the landing page that you would like to test:
    1. Most common testing elements are:
      1. Headline / Sub-headline;
          • E.g: Replace “Just another WordPress site” with “5x your ROI today through custom emojis”
      2. Form positioning / Form fields;
          • E.g: Move the lead generation form from the bottom to the top of the page;
      3. Media on the page (Images and Videos);
          • E.g: Replacing the background image in your hero section;
      4. CTA (Call-to-action) text, color, and shape;
          • E.g: Changing the CTA text from “Submit” to “Get your free emoji cheat-sheet now”;
      5. Sales copy;
          • E.g: Testing a completely new sales letter, or a specific part of it;
      6. Authority / Trust elements;
          • E.g: Adding reviews to your landing page, or testing different customer reviews;
      7. Pricing;
          • E.g: Changing your product pricing from $97 to $79.
    2. Find testing elements specific to your business:
      1. Recent changes that you suspect might have impacted conversion rates (for better or worse): confirm whether that is the case and, if so, quantify the impact.
          • E.g: The “Pricing” page link was removed from the landing page, and the conversion rate seems to have dropped after that;
Note: Although changes to your website often cause a conversion rate drop or jump, that is frequently not the case. Changes in conversion rate may be caused by unrelated events that happened during the same period (seasonality of your product/service, traffic quality, competition, weather, etc.).
      2. Customer feedback and common customer questions.
          • E.g: Go through your support tickets or survey feedback. If you find that customers keep asking questions such as “How does feature X work?”, you might want to try a version of your landing page that includes a section explaining how that specific feature works.
      3. Heatmap & clickmap tools:
Note: If you haven’t implemented heatmap/clickmap tools yet, follow SOP049 (web version).
        1. If you have already implemented the ClickMinded - Scrollmap & Clickmap Insights Log covered in SOP 049 (web version), go through each of your entries, look at your “Next Step” column, and select one that you would like to create an A/B test for.
        2. If you have not implemented it yet, go through the ClickMinded - Clickmap & Scrollmap Diagnosis Cheat Sheet to diagnose potential issues you might be having with your landing page;
      4. Session recordings:
Note: If you have followed SOP049 (web version) to implement heatmaps on your website, you can use Hotjar to record user sessions on your website (outside the scope of this SOP).
        1. Watch at least 10 session recordings where a conversion happened.
        2. Watch at least 10 session recordings where a conversion did not happen.
        3. Are your users not converting due to UI issues?
          • E.g: Typing their email addresses in the “Name” field, or not being able to generate a password that meets your current requirements. You may want to create a variation that aims to fix those issues.
        4. Identify commonalities among converting and non-converting users (specific sections of the page they look at, specific conversion paths, etc.) and hypothesize changes that could move non-converting users down the same path that converting users take.
          • E.g: If you are offering plumbing services and pushing users to book straight from the landing page, and you identify that most converting users go through your contact and reviews pages before reaching your checkout page, you might want to create a version of your page that includes your contact information and reviews on the landing page.

Defining how you are going to test it

  1. Open the ClickMinded A/B Test documentation spreadsheet;
      • Note: Although this spreadsheet is specifically designed to be used with Google Optimize, it should work with most of the A/B testing tools available. If your A/B testing tool does not offer a specific feature (e.g: targeting specific audiences), you can always remove or edit that column to fit your specific tool.
  2. Fill out the spreadsheet:
      • Test #: An incremental number, used internally. It is useful when communicating with your designer, programmer, or copywriter, or whenever you want to reference a specific A/B test in a discussion or in a project management tool.
          1. E.g: 001
      • Start Date / End Date: Add the Start Date whenever you start running your experiment, and update the End Date once the experiment is over. This will allow you to quickly see which experiments are still running.
          1. Note: The spreadsheet will automatically update the “Running Days” column and set it to “Still Running” if no end date has been added yet;
          2. Note 2: If the experiment ran for less than 14 days, the “Running Days” cell will turn red to warn you that the test might not have run long enough for your results to be meaningful (although this ultimately depends on how many people were exposed to your experiment during that period).
      • Created by: The person responsible for this experiment;
      • Running days: Leave empty; this cell contains a formula that calculates how many days your A/B test ran and flags the A/B tests that are still ongoing;
      • Purpose: Clearly define the purpose of this test. The purpose should identify what you are going to test and why. You can use this template to fill out that cell if you don’t have any other ideas:
          1. To test if [INSERT CHANGE HERE] has a positive impact on [INSERT METRIC HERE];
              1. E.g: “To test if personalizing the headline with the user’s location has a positive impact on signup conversion rate.”
          2. Important: Do not run A/B tests without having a clear purpose in mind. You can only run a limited number of A/B tests in a given period (since you are limited by how much traffic you have to experiment on). There is also a big chance that a random A/B test without a purpose will end up underperforming the control, and therefore temporarily decrease your revenue/signups or your business performance in general.
      • Testing Element: Define which element on your page you are going to test.
          1. E.g: “Hero Headline”
          2. Important: You should only test one element at a time. Testing multiple elements at once (e.g: changing the headline, the sub-headline, the copy, and the form location) will leave you wondering which of the changes had a positive impact and which had a negative one, since your results will only show the aggregated data. If you want to change multiple elements at once you should run a Multivariate Test (MVT) (outside the scope of this SOP).
      • Audience: Define which audience you are going to target in your experiment. You can run experiments for a specific group of people only. Depending on which tool you are using, you might be able to target specific devices, countries, traffic referrals and traffic sources, browsers, etc.
          1. E.g: All US Visitors
      • Metric #1, #2, #3: Define which metrics you want to use to evaluate the success or failure of the experiment. Add the metrics in order of importance to the given experiment (the most important metric first, the least important last).
          1. E.g:
              1. Metric #1: Signup Conversion Rate
              2. Metric #2: Revenue
              3. Metric #3: Bounce Rate
      • Version A, Version B: Insert a URL with a screenshot of your control version (Version A), and a URL with a screenshot of your test version (Version B).
          1. If you don’t have a tool to screenshot your page yet, you can use the Awesome Screenshot Chrome extension; it’s free and lets you screenshot the entire page in a single click.
      • Results: Once your A/B test has ended, log your results here so that in a few months you can look back and understand how your previous experiments went, or share them with your team so everyone is aware of them.
          1. E.g: “Personalizing the headline with the user's location increased signups by 36% in the US. It also increased revenue by 12%, and Bounce Rate decreased by 10%. The test ran for 3 weeks and a statistically significant result was reached, with Version B having a probability to be best of 95%+ on all metrics.”
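If you want to sanity-check the numbers that feed the “Running Days” and “Results” columns, the spreadsheet’s logic can be sketched roughly as follows (a minimal illustration only; the function names and example figures are our own, not part of the spreadsheet):

```python
from datetime import date
from typing import Optional

def running_days(start: date, end: Optional[date]) -> str:
    """Mimics the "Running Days" column: flags ongoing and too-short tests."""
    if end is None:
        return "Still Running"
    days = (end - start).days
    if days < 14:
        # Mirrors the red-cell warning for tests under 14 days
        return f"{days} (warning: under 14 days, results may not be meaningful)"
    return str(days)

def relative_lift(control_rate: float, variant_rate: float) -> float:
    """Relative improvement of the variant over the control."""
    return (variant_rate - control_rate) / control_rate

print(running_days(date(2024, 3, 1), None))               # Still Running
print(running_days(date(2024, 3, 1), date(2024, 3, 22)))  # 21
print(f"{relative_lift(0.025, 0.034):+.0%}")              # +36%
```

For example, a control converting at 2.5% against a variant converting at 3.4% is a +36% relative lift, which is how a result like the one in the E.g above would be reported.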

Starting an experiment using Google Optimize

Note: If you haven’t set up Google Optimize yet, you can do it now by following SOP059 (web version).
  1. Using Google Chrome, navigate to https://optimize.google.com/ and log in with your Google Account.
    1. Note: Make sure you have the Google Optimize Chrome extension installed in your browser, otherwise you won’t be able to create variants.
    2. Note 2: Make sure you are not using any privacy extension (such as the Google Analytics Opt-out Browser Add-on), ad blocker, or any other extension that may block the Google Analytics or Google Tag Manager code, otherwise you won’t be able to create variants.
  2. Click “Create an experiment” in the top-right corner:
  3. In the experiment name field, type the “Test #” you’ve already added to the A/B Test documentation spreadsheet plus a descriptive name for your experiment;
    1. E.g: “Exp001-US-Headline Geo-personalization”
  4. In the URL field, insert the URL of the testing page you defined in the previous chapter.
    1. E.g: Your homepage’s URL
  5. Select A/B Test from the list → Click “Create”;
  6. Link your Google Analytics view:
  7. Select a view from your Google Analytics account:
  8. Click “Finished”;
  9. Add your experiment objectives (Metric #1, #2, and #3 from your A/B Test Spreadsheet):
  10. Select “Choose from list” if your metric is already set up as a goal in your Google Analytics view.
  11. Choose your objective:
  12. Select “Create custom” if your metric is not a Google Analytics goal yet. This option allows you to use the Google Analytics events that you have set up on your website, as well as specific pages, as experiment objectives.
    1. Note: If you haven’t already, you can implement Google Analytics events by following the chapter “Setting up Google Analytics Events using Google Tag Manager” in SOP021 (web version).
    2. Note 2: This SOP assumes you already have your main objectives set up as goals in your Google Analytics account. Settings for this option will vary depending on your event and page structure. You can implement your events and goals in your view by following SOP021.
  13. If you have additional objectives, add them now (just repeat the objective steps above for each):
    1. Note: You can have up to 3 objectives on Google Optimize’s free plan.
  14. Copy the “Purpose” column from your A/B Test documentation spreadsheet into the “Description and hypothesis” box:
  15. If you want this A/B test to run only on a specific targeted segment, click the “Targeting” tab:
  16. Select the percentage of visitors to target (typically 100%);
  17. Select the weighting of visitors for each version (typically 50% / 50%);
  18. Under “Additional conditions” click “And”:
  19. Select the type of rule that you need and configure it accordingly. E.g. for location targeting:
  20. Hit “Save” in the top-right corner:
  21. A new message will appear in the sidebar prompting you to run a diagnostic of your setup. Click “Run Diagnostics”, and after a few seconds you should see a success message:
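The “weighting” setting above determines how traffic is split between the original and the variant. Google Optimize handles this assignment internally, but as a rough illustration of the idea, a deterministic 50/50 split (so the same visitor always sees the same version) could be sketched like this — the function name and IDs are hypothetical, not Optimize’s actual mechanism:

```python
import hashlib

def assign_version(visitor_id: str, experiment_id: str, weight_a: float = 0.5) -> str:
    """Deterministically bucket a visitor: the same visitor always gets the
    same version for a given experiment (illustrative sketch only)."""
    digest = hashlib.sha256(f"{experiment_id}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # roughly uniform value in [0, 1]
    return "A" if bucket < weight_a else "B"

# The split is stable per visitor, and roughly 50/50 across many visitors:
print(assign_version("visitor-42", "Exp001"))
print(assign_version("visitor-42", "Exp001"))  # same result as above
```

The key property is determinism: without it, a returning visitor could bounce between versions, contaminating the experiment’s data.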

Creating a variant on Google Optimize

  1. Inside your Google Optimize experiment, click “Create Variant”:
  2. Name your variant and click “Add”:
  3. Your variant will appear in the list with a red “0 changes” label. Click that variant:
  4. You will be taken to the page editor for the page you defined in the previous chapter:
  5. Select the element you want to edit by right-clicking it, and choose whether you want to remove it or edit its text.
    1. Note: There are 3 more options that you can choose from (Edit HTML, Insert HTML, and Run JavaScript), but they all require you to use code to some extent and are therefore not covered by this SOP.
    2. Note 2: You can also drag and drop elements in some cases, although depending on your page and what you are trying to do, the results might not always be what you expect. If you are having trouble with the drag & drop feature, ask a web designer to make the change by editing the code instead.
  6. Edit your element:
  7. Once you’re done editing your element, click “Finished” in the bottom-right corner:
  8. Repeat steps 5-7 as many times as needed until you’ve created the variant you had envisioned.
    1. Remember: You should not change multiple elements at the same time, otherwise you will not know which of the elements impacted your results positively or negatively.
  9. Once you’re done editing your page, click “Save” in the top-right corner:
  10. Test how your variant looks on smaller screens by clicking the device dropdown at the top:
  11. If your variant looks good on other screen sizes, click “Finished” in the top-right corner:
  12. You will be taken to your experiment dashboard. Click the devices icon on your variant, click “Share preview”, then copy that link and send it to your team so they can also preview your variant on different devices.
    1. Note: It’s recommended that you test your variant at least on desktop Google Chrome, on Safari for iOS, and on Chrome for Android before publishing it.
  13. Once you’ve made sure your variant works on most devices, you’re ready to start running your experiment. Click “Start Experiment”:
  14. Now just click “Start” and that’s it! You are now running an A/B test on your website.

How to analyze your results on Google Optimize

Note: Ideally, you will only want to analyze your results after 2 weeks have gone by, so that Google Optimize has enough data to assess how your experiment performed. However, it is possible to analyze the results while the experiment is still running, before the 2 weeks have passed.
  1. Inside your Google Optimize experiment, click the “Reporting” tab:
  2. You will be shown the current stats of your experiment. In the table you will find how well your variant is performing compared to the baseline. Ideally your variant will perform better than the baseline on all metrics, but even if it only performs better on one specific metric it might still be worth implementing (in this example the experiment is still running):
  3. By clicking on an objective, you will be able to analyze how your experiment is performing for that objective specifically.
Definitions:
  1. Improvement: The difference between the modeled conversion rate of the variant and the baseline, for a given objective. This is the likely range in which your conversion rates will fall. (as defined by Google)
  2. Probability to be best: The probability that a given variant performs better than all other variants. (as defined by Google)
  3. Probability to beat baseline: The probability that a given variant will result in a conversion rate better than the original's conversion rate. Note that with an original and one variant, the variant's Probability to beat baseline starts at 50 percent (which is just chance). (as defined by Google)
  4. Conversion Rate: A box plot of how the collected data looks for that variant compared to the baseline. The further apart they are, the higher the likelihood that your variant will have a meaningful impact on your chosen metric (for better or worse).
  5. Conversions: The number of conversions attributed to that variant during the experiment period.
  4. That’s it! When your experiment has run for more than 2 weeks, has collected enough data, and has reached meaningful statistical results, Google Optimize will end your experiment and declare a winner.
    1. Note: You can also end your experiment early. Before doing so, though, it is recommended that you let it run for at least 2 weeks (to set aside possible seasonality and novelty effects) and have a probability to beat baseline greater than 95%.
    2. Note 2: If your experiment has been running for 90 days and no winner has been declared, the experiment will be terminated; most likely it would never have yielded meaningful statistical results.
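Google Optimize computes “probability to beat baseline” with Bayesian inference; its exact model is not public, but a common approximation — modeling each version’s conversion rate as a Beta posterior and comparing samples — can be sketched like this (illustrative only; the conversion counts below are made up):

```python
import random

def prob_to_beat_baseline(conv_a, visitors_a, conv_b, visitors_b,
                          samples=100_000, seed=42):
    """Estimate P(variant rate > baseline rate) by sampling from
    Beta(conversions + 1, non-conversions + 1) posteriors."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(samples):
        rate_a = rng.betavariate(conv_a + 1, visitors_a - conv_a + 1)  # baseline
        rate_b = rng.betavariate(conv_b + 1, visitors_b - conv_b + 1)  # variant
        wins += rate_b > rate_a
    return wins / samples

# E.g. baseline: 120 conversions / 4,000 visitors; variant: 155 / 4,000.
print(prob_to_beat_baseline(120, 4000, 155, 4000))
```

With identical data for both versions the estimate hovers around 50% (pure chance), consistent with the definition above; values above roughly 95% are the usual threshold for calling a winner.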
 