Mobile UX Marathon: Introduction to A/B Testing

April 27, 2024
Mete Karagozlu

Transcript:

Hello and welcome to the Mobile UX Marathon, a series of weekly webinars focused on improving user experience and conversion rates on mobile websites. Today's webinar, titled "Introduction to A/B Testing," will cover the essentials of A/B testing.

As you watch, I encourage you to take notes and jot down any questions that arise. You can share your questions with us on the UX Marathon website, and later join the live stream to get answers. The website also contains more information about upcoming live streams, resources, and decks. You can visit the site by clicking the link below or by checking the description.

Before we begin, let me introduce myself. My name is Mete, and I've been working on the Mobile UX team at Google for the past year. Before that, I was a Product Manager and UX Lead, which gives me five years of experience in A/B testing, during which I've run over 500 tests. Today, I'll be covering the following points: why A/B testing is important, what A/B testing is, and how to prioritise your tests.

As I mentioned earlier, there will be live stream sessions where we’ll discuss A/B testing case studies using Google Optimise and answer your submitted questions. If you're ready, let's begin.

This is an introductory webinar on A/B testing, and I also recommend checking out the content delivered by my colleague, Mya Bilic, who will focus on running tests and personalization using Google Optimise.

Throughout the year, you might run various marketing campaigns, such as those for Valentine's Day, back-to-school season, Black Friday, or Cyber Monday, and in between we are always striving to improve our websites. For instance, if you launch a new feature or make a small change to your website in August, you'll need a benchmark to measure its performance against. Should you compare it to July's commercial performance?

I doubt it, because during the festival season we were running specific campaigns on our webpage, which makes it difficult to draw an accurate comparison. Should we compare the performance to the previous year's August? That's also tricky, because we've made numerous changes over the past year, so naturally we'd expect different results. This is the dilemma: we can't accurately measure and compare the performance of our new features, because we'd be comparing different periods with different variables.

To compare performance effectively and objectively, we need certain conditions: users visiting our webpage should see the exact same hero banners, products on sale, and promotions. Even factors like weather conditions should be identical, as a sunny day could yield different results than a rainy day. The way we can achieve these consistent conditions for our users is through A/B testing.

A/B testing allows us to compare the performance of our website or specific sections under identical circumstances. The GIF here illustrates this well: we make a small change to a section of our webpage and compare its performance to the original. This is the most common A/B testing method, known as A/B(n) testing, where only one section is altered, as shown with blue and red images.

However, there are two other types of A/B testing. The second is split or redirect testing, where we change the entire structure of the webpage and direct users to a completely different URL and UX experience. This method is often used for campaign-specific landing pages and is popular among marketers.
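
To make the redirect idea concrete, here is a minimal sketch of how it might look on the client once a visitor has already been assigned to the variant; the cookie name and URL are illustrative assumptions, not taken from any specific tool.

```typescript
// Sketch of a split/redirect test: visitors already assigned to the variant
// (here via a hypothetical "exp_variant" cookie) are sent to a separate URL
// with its own layout, instead of the current page being modified in place.
function getCookie(name: string): string | undefined {
  return document.cookie
    .split("; ")
    .find((row) => row.startsWith(name + "="))
    ?.split("=")[1];
}

if (getCookie("exp_variant") === "B") {
  // Keep the original query string so campaign parameters are not lost.
  window.location.replace("https://example.com/landing-b" + window.location.search);
}
```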

The third and most complex type is multivariate testing. It might initially seem similar to A/B(n) testing, where we compare different versions of a single element, but in multivariate testing, we simultaneously test different variations across multiple sections. For example, we might change an image while also testing different wording for headings. As the number of variants increases, the complexity of the test grows, so I recommend running multivariate tests only when you're ready to handle the increased complexity.
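
To see how quickly that complexity grows, here is a small sketch that enumerates every combination a multivariate test would need to serve; the section names and variations are made up for illustration.

```typescript
// Sketch: enumerate all variant combinations in a multivariate test.
// With 3 hero images and 3 headings, you already have 9 combinations,
// each of which receives only a fraction of your traffic.
const sections: Record<string, string[]> = {
  heroImage: ["original", "blue", "red"],
  heading: ["original", "short-copy", "benefit-led"],
};

function combinations(options: string[][]): string[][] {
  return options.reduce<string[][]>(
    (acc, values) => acc.flatMap((combo) => values.map((v) => [...combo, v])),
    [[]]
  );
}

const combos = combinations(Object.values(sections));
console.log(`${combos.length} combinations to test`); // "9 combinations to test"
combos.forEach((c) => console.log(c.join(" + ")));
```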

If you have enough traffic to support multivariate testing, you'll be dividing your traffic into smaller segments. We've covered the basics of A/B testing, so now let's look at a real case. Imagine a test where we're changing the colour of the CTA button: on the left, it's orange, and on the right, it's blue. Take a moment to think about which colour you think performed better.
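
As a rough sketch of what that traffic division can look like, the snippet below hashes a stable user ID so a returning visitor always falls into the same segment; the weights and hashing scheme are simplifying assumptions, not how any particular tool implements it.

```typescript
// Sketch: deterministic traffic splitting. Hashing a stable user ID means
// the same visitor always sees the same variant across sessions.
function hashToUnitInterval(userId: string): number {
  let hash = 0;
  for (const ch of userId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // unsigned 32-bit rolling hash
  }
  return hash / 0xffffffff; // map to [0, 1]
}

function assignVariant(userId: string, weights: Record<string, number>): string {
  const point = hashToUnitInterval(userId);
  let cumulative = 0;
  for (const [variant, weight] of Object.entries(weights)) {
    cumulative += weight;
    if (point < cumulative) return variant;
  }
  return Object.keys(weights)[0]; // fallback for rounding edge cases
}

// A 50/50 A/B split; a multivariate test would list each combination instead.
console.log(assignVariant("user-123", { control: 0.5, orangeCta: 0.5 }));
```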

The winner is the orange CTA. If you guessed correctly, don't get too confident—usually, when I ask this question, the responses are split 50/50 or 60/40. This illustrates the importance of A/B testing because outcomes are often unpredictable, and A/B testing helps eliminate our biases. Statistically, only 30% of A/B tests are successful. Without A/B testing, you risk a 70% chance of implementing changes that have no positive impact, or even a negative impact, on your revenue. That's why A/B testing is so critical.

In the previous test, we just changed the colour. But is there a way to improve the success rate of A/B tests? The answer is yes. By understanding colour contrast, with the help of a colour wheel for instance, you can make more educated guesses. For example, since the primary colour on our webpage is green, using a contrasting colour such as orange makes the CTA stand out more, increasing the chances of success.

A/B testing is part of a larger loop. To run successful tests, start with a strong hypothesis, which can be based on quantitative or qualitative research, heuristic evaluations, or even your instincts. My colleague Louis Berry’s webinar on the Zero Maturity Model framework dives deeper into different research methods.

To create a solid hypothesis, follow this framework: start with a "what-if" question. For example, "If I change the CTA colour to orange, I predict the click-through rate will increase, because the contrasting colour will attract more attention." The reason should be based on data or research, not just personal preference.
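
One way to keep yourself honest with this framework is to write the hypothesis down as structured data before the test starts; the shape below is just one possible sketch, not a standard format.

```typescript
// Sketch: a hypothesis written down before the test runs. Field names are
// illustrative; the point is that the change, the predicted effect, the
// rationale and the success metric are all stated up front.
interface Hypothesis {
  change: string;          // the "if I change X" part
  prediction: string;      // the expected effect
  rationale: string;       // data or research backing it, not personal taste
  successMetric: "clickThroughRate" | "conversionRate" | "averageOrderValue";
}

const ctaColourTest: Hypothesis = {
  change: "Change the CTA colour from green to orange",
  prediction: "Click-through rate on the CTA will increase",
  rationale: "Orange contrasts with the page's primary green, so the CTA stands out",
  successMetric: "clickThroughRate",
};
```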

Another key point is to define success beforehand. In our case, success was measured by click-through rate, but it could be conversion rate or average order value in other tests. If you don’t define success criteria upfront, you risk misleading yourself by focusing on irrelevant metrics. Always stay focused on the key performance indicator (KPI) you set for the test's success.
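
When the test ends, the decision should rest on that predefined KPI. As a rough illustration, the sketch below compares the click-through rates of two variants with a two-proportion z-test; the traffic and click numbers are invented.

```typescript
// Sketch: deciding the test on the KPI chosen up front (click-through rate),
// using a two-proportion z-test. Numbers are made up for illustration.
function twoProportionZ(clicksA: number, visitorsA: number,
                        clicksB: number, visitorsB: number): number {
  const pA = clicksA / visitorsA;
  const pB = clicksB / visitorsB;
  const pooled = (clicksA + clicksB) / (visitorsA + visitorsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB));
  return (pB - pA) / se;
}

// Hypothetical results: control (blue) vs variant (orange) CTA.
const z = twoProportionZ(480, 10_000, 560, 10_000);
// |z| > 1.96 roughly corresponds to 95% confidence for a two-sided test.
console.log(z.toFixed(2), Math.abs(z) > 1.96 ? "significant" : "not significant");
```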

This is crucial because even with strong hypotheses, some tests will still fail—that's just the nature of A/B testing. However, when a test doesn’t show a positive outcome, ask yourself: Did the hypothesis itself have flaws, or was there an issue with the test execution? For instance, the hypothesis about using a contrasting colour might have been correct, but if there was a flickering effect or the A/B testing code wasn't prioritised, causing a delay in the colour change, the test results might have been skewed.
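
Flicker typically happens when the variant is applied after the original page has already rendered. A common mitigation, sketched loosely below, is to hide the page briefly until the testing code has applied the change or a fail-safe timeout fires; the class name, timeout, and applyVariant() helper are assumptions, not a specific tool's snippet.

```typescript
// Sketch of an anti-flicker approach: hide the page while the experiment
// script applies the variant, and always unhide after a short timeout so a
// slow or failed script never leaves the page blank.
// Assumes CSS such as: .experiment-loading { opacity: 0 !important; }
const HIDE_CLASS = "experiment-loading";
const MAX_HIDE_MS = 800; // hard cap on how long visitors can be kept waiting

document.documentElement.classList.add(HIDE_CLASS);

const reveal = () => document.documentElement.classList.remove(HIDE_CLASS);
const failSafe = window.setTimeout(reveal, MAX_HIDE_MS);

// applyVariant() stands in for whatever your testing tool does, e.g. swapping
// the CTA colour to orange before the visitor ever sees the green original.
function applyVariant(): void {
  document.querySelectorAll<HTMLElement>(".cta-button")
    .forEach((el) => (el.style.backgroundColor = "#e8710a"));
}

applyVariant();
window.clearTimeout(failSafe);
reveal();
```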

Always question both the hypothesis and the execution when running A/B tests. This approach helps ensure that any failure is a learning opportunity.

That concludes the content I wanted to cover. Here are some resources you might find useful:

  • A presentation on "Seven Things I Wish I Knew Before Starting A/B Testing," where I discussed hypothesis creation, time frames, and tracking the right KPIs.
  • Mobile-specific resources, including a dedicated section on A/B testing with great video content.
  • The Optimise Help page, where you can find answers to any questions about running A/B tests with Google Optimise.

Remember, we’ll be hosting live streams featuring Google Optimise case studies on A/B testing. If you have any questions, please submit them on the UX Marathon website, and we’ll address them during the live sessions.

Thank you very much for your time, and this concludes today's webinar, "Introduction to A/B Testing."
