Guest Blogs

When to Use Personalization vs. A/B Testing


Personalization is one of the most admired techniques in the martech arena today. Whelan Boyd from Optimizely discusses when to use it, why, and how.

Personalization is talked about a lot. Users expect brands to deliver a tailored, hyper-relevant experience that fulfills their needs quickly. That may seem like a worthy brand aspiration; however, teams shouldn’t operate under the assumption that the more personalized an experience, the better.

Personalization should be treated as an experience-optimization technique and tool rather than as an objective or end state.

It must be developed using data to be effective, and it has to deliver data that informs the business to be truly useful. This means customer-centric organizations should measure personalization campaigns beyond conversions to reap their full potential. After all, the data gleaned from engaging with customers around their experiences is in many ways as important to businesses as the transactions completed.

A lot of brands today use both personalization and A/B testing as tools to improve customer experience and conversion rates. Increasingly, brands wonder when to use one over the other. Which will provide the best return on solving customers’ problems and deliver the best experience? As you may expect, there is no one right answer, so let’s dive into how they compare, how to be successful using both, and when to use one over the other.

Understanding Personalization and A/B Tests: Similar Tools in the Optimization Tool Chest

Let’s start by defining A/B testing, or “experimentation”. An experiment is a procedure carried out under controlled conditions in order to discover an unknown effect. This definition can also be applied to experimenting on products and experiences. All experience-optimization techniques are essentially experiments because they are all measured to inform further activity and the changes needed to drive the intended action. Personalization is simply a type of experiment targeted at an individual or specific persona, using existing insights about that individual or persona to drive an intended action. Both personalization and A/B testing support iterative approaches: teams shouldn’t turn them on and walk away. They are most valuable when continually analyzed and honed.
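To make the “controlled conditions” idea concrete, here is a minimal sketch of an A/B experiment’s mechanics: deterministic bucketing, exposure logging, and a measured conversion rate per bucket. The user IDs, bucket names, and single conversion metric are illustrative assumptions, not any particular vendor’s API.

```python
import hashlib
from collections import defaultdict

def assign_variant(user_id: str) -> str:
    """Deterministically bucket a user into 'control' or 'variant'."""
    # A stable hash keeps each user in the same bucket on every visit.
    digest = int(hashlib.sha256(user_id.encode()).hexdigest(), 16)
    return "variant" if digest % 2 else "control"

exposures = defaultdict(int)    # how many users saw each bucket
conversions = defaultdict(int)  # how many of them converted

def record_exposure(user_id: str) -> str:
    bucket = assign_variant(user_id)
    exposures[bucket] += 1
    return bucket

def record_conversion(user_id: str) -> None:
    conversions[assign_variant(user_id)] += 1

def conversion_rate(bucket: str) -> float:
    """Conversions per exposure for one bucket (0.0 if unexposed)."""
    return conversions[bucket] / exposures[bucket] if exposures[bucket] else 0.0
```

Because every visit is measured against the same metric, the experiment produces the data that informs the next iteration, which is the point the paragraph above makes.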

Using Personalization and A/B Tests Successfully: Start with a Problem Statement

Whether running a personalization campaign or setting up an A/B test, teams should approach each as an experiment. Each should frame an idea or hypothesis and set clear success criteria against metrics that matter to the business. And yes, that means each starts with a problem statement. Example problem statements include:

  • Retail clothing: Our past purchasers consistently order in the same price range and have a lower product view rate when we show products not related to their past purchases.
  • B2B: Our ad campaign traffic has a 10 percent higher bounce rate than our other primary channels and converts to a lead in only 2 percent of visits.
  • Online grocery: Our loyalty members have shared in customer surveys that they expect us to remember their most purchased brands in their searches.

Next, turn your problem statement into hypotheses. For example:

  • Retail clothing: If we display a homepage banner with a CTA directing users to the price category related to their past purchase value, then we will increase product view rate and, in turn, purchases.
  • B2B: If the landing pages where ad campaign traffic arrives mirrors the same value propositions, offers, and CTAs viewers saw in the specific ad, then we will decrease the bounce rate and improve the lead-generation rate.
  • Online grocery: If we pre-populate the homepage search ‘Origin’ field for loyalty members based on their most recent brands searched, then we will increase search rate and, in turn, bookings.
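The online grocery hypothesis above could be sketched as a simple personalization rule: loyalty members get a pre-filled search field based on past brand activity, everyone else gets the generic experience. The profile shape (`loyalty_member`, `recent_brands`) is an illustrative assumption, not a real data model or API.

```python
from typing import Optional

def default_search_brand(profile: dict) -> Optional[str]:
    """Return a brand to pre-fill in search, or None for the generic box."""
    if not profile.get("loyalty_member"):
        return None  # non-members see the unpersonalized experience
    recent_brands = profile.get("recent_brands", [])
    # Assumed convention: the most recently searched brand comes first.
    return recent_brands[0] if recent_brands else None
```

Measuring search rate for members who received the pre-fill against those who did not is what turns this rule from a hunch into an experiment.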

Choosing What to Use: Personalization or A/B Test?

If personalization and A/B tests are so similar, how do you know when to use one vs. the other? First, the process of turning problem statements into hypotheses can provide an initial indication of whether personalization or an A/B test will be the more effective tool. Analyzing the experiment setup and related variables can further indicate that a personalization campaign is the most effective technique to use. At the end of the day, however, running the personalization campaign in question and then measuring it is the best way to determine whether it can drive the intended outcome, or whether another technique such as an A/B test will be more effective.

For example, the following scenarios suggest a personalization campaign is the right course of action vs. using an A/B test:

  • The problem statements are focused on the same part of the experience, but the supporting data behind each differs across particular audience segments.
  • The solutions for the problem statement are different based on particular audience segments.
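The second scenario above can be sketched as a segment-to-experience mapping: when the right solution differs by audience segment, a personalization campaign serves each segment its own experience rather than splitting all traffic across one pair of variants. The segment names and experience labels here are hypothetical, drawn loosely from the earlier problem statements.

```python
# Hypothetical mapping from audience segment to the experience that
# addresses that segment's specific problem statement.
SEGMENT_EXPERIENCES = {
    "past_purchaser": "related-price-range-banner",
    "ad_campaign_visitor": "mirrored-offer-landing-page",
    "loyalty_member": "prefilled-brand-search",
}

def choose_experience(segment: str) -> str:
    """Pick the personalized experience for a segment, with a fallback."""
    # Visitors outside any known segment see the default experience.
    return SEGMENT_EXPERIENCES.get(segment, "default-experience")
```

If, instead, one candidate solution were expected to work for all visitors, a single A/B test across the whole audience would be the simpler tool.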

The good news for teams using personalization and A/B tests is that these techniques both help foster a culture of experimentation and embrace iterative improvements to your business and customer experience.

There is no one right technique to use, however, consistently measuring experiments—whether they are conducted using personalization or A/B tests—is a must to ensure teams can glean business insights, adjust their activities when needed, and continually improve the customer experience. With this in mind, consider inviting both personalization and A/B test-driven ideas during your next business review or hypothesis workshop.


Whelan Boyd
Optimizely Group Product Manager
Whelan is a Group Product Manager at Optimizely, where he has focused on both customer-facing applications and the infrastructure they’re built on, as well as the company’s machine learning and automation initiatives. He oversaw the launch of Performance Edge, a client-side A/B testing platform that prioritizes webpage performance by pushing key execution to serverless edge workers and reducing the size of the JavaScript returned to the browser.
