A/B Testing
There’s simply no two ways about it — experimentation is the cornerstone on which high-performing companies are built.
From tech giants like Google, Netflix, and Microsoft to retail mainstays like Amazon, Home Depot, and Walmart, the companies that experiment win their markets.
This is especially true in an era where sticking to tired, old formulas has led many businesses to middling results.
You’ve probably heard this one a million times:
“That’s just how we’ve been doing it for years.”
The result: Lower revenue, stagnant conversion numbers, and overall poor performance metrics.
Whether you’re looking to improve your website’s performance or see if your mobile app’s new feature jells with your customers, you’ll want to experiment to get data-rich insights you can trust and use for future changes.
In other words, you want to know what’s helping your revenue/traffic grow vs. what’s making users click out of your website or stop using your app altogether.
That’s where A/B testing comes into the mix.
In this primer, we’ll cover:
What exactly is A/B testing?
Which types of companies benefit the most from A/B testing?
Benefits of A/B testing
How can you implement A/B testing?
A/B testing (or split testing) is a straightforward, scientific way to determine what works best among two or more options — whether that's exposing a feature to users, tweaking UI elements, or rewording the copy on a page.
By pitting different variations against each other — each shown to a random slice of your audience — you can uncover which one has the biggest impact on key performance metrics, from engagement and conversion rates to real business outcomes like revenue and retention.
This means you don’t need to take a chance on an idea you have for improving your performance, conversion rates, retention rates, or any other metric that matters. Instead, you can run a rigorous test and use the data to decide whether or not your change was worthwhile.
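To make that concrete, here is a minimal sketch in Python (not tied to any particular tool) of the two core mechanics: deterministically assigning each user to a variant, and comparing a conversion metric between the groups. All IDs and numbers are illustrative.

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants=("control", "treatment")) -> str:
    """Deterministically bucket a user into a variant by hashing their ID,
    so the same user always sees the same experience."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

print(assign_variant("user-123", "new-checkout-copy"))  # e.g. "treatment"

# Illustrative outcome data: conversions out of visitors for each variant
results = {
    "control":   {"visitors": 10_000, "conversions": 480},
    "treatment": {"visitors": 10_000, "conversions": 540},
}
for name, r in results.items():
    print(f"{name}: {r['conversions'] / r['visitors']:.2%} conversion rate")
# A statistical test then tells you whether a gap this size is real or just noise.
```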
E-commerce businesses — big and small — can find A/B testing extremely useful.
There’s no better example of an e-commerce company whose growth was entirely fueled by experimentation than Amazon. Take it from Jeff Bezos himself, who said “Our success at Amazon is a function of the number of experiments we run per year, per month, per day.”
Amazon runs thousands of A/B tests a year, ensuring that every single change to their website is tested — and they’ve done this for well over two decades!
When you experiment with product images, descriptions, and pricing, even a minor uplift in conversion rates can result in significant revenue boosts.
For example, imagine figuring out which product photo leads to more carts being filled, or which checkout button wording pulls more clicks. The more granular your experiments become, the easier it is to pinpoint exactly which changes yield better results.
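To put rough numbers on that, here is a quick back-of-the-envelope calculation with hypothetical figures: a store with a million monthly visitors, a 3% baseline conversion rate, and a $60 average order value.

```python
visitors = 1_000_000      # monthly visitors (hypothetical)
baseline_cr = 0.030       # 3.0% baseline conversion rate
uplift = 0.001            # +0.1 percentage point from a winning variant
avg_order_value = 60      # dollars

extra_orders = visitors * uplift
extra_revenue = extra_orders * avg_order_value
print(f"~{extra_orders:.0f} extra orders, ~${extra_revenue:,.0f} extra revenue per month")
# -> ~1000 extra orders, ~$60,000 per month, or roughly $720,000 per year
```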
Sometimes even tiny A/B tests can lead to massive results for e-commerce businesses:
One of the most successful (and revenue-driving) experiments Airbnb ever ran took just a single line of code to implement: opening each listing a user clicked in a new browser tab, instead of their current tab.
This quick change drove millions of dollars of revenue growth.
For B2C software companies in industries like streaming, gaming, and sports betting, A/B testing is just as valuable for acquiring new customers as it is for the long game — upgrades, engagement, and retention.
For instance, picture a fitness app conducting A/B testing to optimize user sign-ups. In one test, new users immediately see a 30-day free trial offer. In another, they first get a tour of the app's features before seeing the trial offer.
This A/B testing approach allows the app to discern not just immediate signups but also which method leads to greater long-term engagement.
Netflix is the perfect example of a company highly dedicated to experimentation — co-CEO Reed Hastings even teaches an “A/B Testing 101” course to new hires.
They famously A/B test everything in their app — from visible changes like which posters to use for movies or TV shows, to the recommendation algorithms that populate the app home screen.
They even test technical changes to their back-end like improvements to their streaming technology in the hopes of improving user experience and retention.
In the media industry, engagement is the currency, and A/B testing helps mint it.
By testing different headlines, article visuals, page arrangements, and content types, media companies can pinpoint what truly resonates with their viewers.
Major publications like the Wall Street Journal or New York Times constantly experiment with how they incentivize readers to subscribe, new features on their websites, or when to paywall content.
The ability to run hundreds or thousands of A/B tests a year has been critical to their survival as the digital landscape rendered their old business models almost obsolete.
Given media companies’ reliance on high traffic volumes, using A/B testing to optimize user experience in such a way is a no-brainer.
Here’s a quick highlight of the 3 core benefits of A/B testing:
Revenue growth: More A/B testing contributes to higher revenue by letting companies fine-tune their strategies around what actually works.
Risk management: A/B testing eliminates guesswork, safeguarding sales and metrics against adverse changes. You can act on a hunch without having to bet the farm.
Innovation: A/B testing encourages bold experimentation, revealing unique insights that competitors — who might be hesitant to try A/B testing — will likely miss.
Other benefits of A/B testing include:
Improved user experience: A/B testing allows you to tailor content and design to user preferences, ensuring a more satisfying experience.
Better content engagement: It enables you to fine-tune content, increasing user interest and activity levels.
Enhanced marketing ROI: A/B testing helps you identify the most effective (and least effective) marketing strategies, optimizing spending to maximize return on investment.
Deeper audience insights: It provides valuable insights into user behavior and preferences, allowing for more targeted strategies.
Let’s dive deeper into each of these and look at some example scenarios where A/B testing shows its true value.
Through A/B testing, businesses can identify which variations of their web pages, new product features, or even ad campaigns lead to higher revenue numbers.
This empirical approach to optimization means decisions are data-driven, directly contributing to better financial outcomes.
Example: Imagine an e-commerce site testing two product pages — one with big pictures and little text, and the other with detailed descriptions and smaller images. If one leads to more sales and longer visits, that's the winner, and it will guide the site's future designs.
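If you want to check that a difference like that isn't just noise, a two-proportion z-test is one common way to do it. Here is a sketch with hypothetical counts; dedicated experimentation platforms run far more rigorous versions of this analysis for you.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    p_pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(p_pooled * (1 - p_pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_a, p_b, z, p_value

# Hypothetical results for the two product-page designs
p_a, p_b, z, p_value = two_proportion_z_test(480, 10_000, 540, 10_000)
print(f"A: {p_a:.2%}  B: {p_b:.2%}  z = {z:.2f}  p-value = {p_value:.3f}")
# A small p-value (conventionally < 0.05) suggests the difference is unlikely to be noise.
```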
No need to start making changes in the hopes that something sticks. A/B testing takes the guesswork out of the equation and helps prevent the negative impact on sales and other key metrics associated with making the wrong call.
Example: Before a major website overhaul, a company could test key new features on a segment of its audience. Insights gained from how these features are received help ensure that the full rollout will improve user satisfaction without alienating existing customers.
On the other hand, failing to A/B test a sweeping website redesign could have disastrous results:
In 2011, Target tried to launch a new version of Target.com without A/B testing it, and the outcome was so detrimental that then-CEO Gregg Steinhafel had to offer a mea culpa on two consecutive earnings calls.
First on the Q3 2011 earnings call, where he acknowledged that the redesign and relaunch would hurt their Black Friday potential. Then in Q4 2011:
"Well, it [the website relaunch] hurt. It hurt the comp in the quarter, and the primary timeframe where it hurt the most was really in the November, first couple of weeks of December timeframe. And as we have added fixes to the website and maintained the stability and worked on things like site navigation, speed, page loading, waiting, and the overall experience, we have seen our business on the .com site continue to get better.
Our traffic on the site continues to be very, very good. So we're very encouraged about the fact that the guests still love coming to the website. What we were disappointed in was the experience once they got there, and so our conversion rates were not to where they had been in the past, and that aspect is what we've seen an improvement on over the last 6 or 8 weeks.”
A/B testing empowers businesses to foster a culture of innovation by letting them experiment with ideas in safe, controlled environments. This approach leads companies to discover unique ways to engage their customers that set them apart from competitors — especially those not embracing experimentation.
Example: A gaming app tests two user interfaces: A minimalist design versus a rich, interactive one. A/B testing reveals which UI delights players more, possibly pioneering a design that sets the app apart and attracts a dedicated user base.
Intuit founder Scott Cook makes the case excellently:
“The bigger and more novel the idea, the less likely it is to survive the gauntlet of bosses who must all agree — bosses who are most comfortable with what they know and only know the past. I don’t think it’s an accident that when software companies grow large, they have until recently become less and less innovative — think Microsoft or IBM or others.
What I’ve seen at some firms I admire is something quite different. I call it decision by experiment.
I wondered why Google beat Yahoo! at search. A Yahoo! executive told me that Google succeeded by installing the system and culture to decentralize decision-making to decision by experiment.
Google’s chief economist said that Google runs 3,000 to 5,000 experiments a year in search — when you use Google you’re in those experiments.”
A/B testing is key for companies that need to shed some light on user preferences in real time. By comparing how different layouts, content, and features perform, businesses can mold their online presence to better suit their target audience through an enhanced user experience.
Example: Duolingo leverages A/B testing to experiment with their user experience, drawing from billions of monthly data points. Their mission to offer free language education worldwide is supported by strategic A/B tests designed to boost user motivation and engagement.
By employing game design principles — such as setting small goals, showcasing progress, and incentivizing daily use — Duolingo boosts user motivation.
The "streak" feature is central to this approach, rewarding users for consecutive days of app use and significantly impacting user engagement.
Here are five A/B tests conducted by Duolingo that led to an increase in user engagement:
Making streaks visible: Placing streak counts prominently in the app led to a 3% increase in daily active users and a 1% increase in two-week retention.
Emphasizing streaks post-lesson: Showing streak achievements after lessons improved daily and two-week retention by 1% and 3%, respectively.
Utilizing external triggers: Timely reminders about streaks encouraged re-engagement, with emails sent 23.5 hours after the last lesson being particularly effective.
Encouraging user investment: Introducing virtual currency wagers on streaks boosted two-week retention by 5% and in-app purchase revenue by 600%.
Designing for downtimes: The weekend amulet, which allows users to maintain streaks without daily play, increased retention and reduced streak loss.
Through ongoing A/B tests with varied content, businesses can pinpoint exactly what their audience loves.
The result is a huge boost to engagement, as content tailored to user interests keeps them hooked and coming back for more, crafting a cycle of increased interaction and loyalty.
Example: A news outlet might test two different headline styles for their articles to see which generates more clicks and shares. The style leading to better engagement metrics informs the outlet's editorial strategy, ensuring future content is both appealing and effective.
You know that classic saying “Half the money I spend on advertising is wasted; the trouble is I don’t know which half”?
Experiments offer a way to prove which half of your campaigns is working and which half is wasted. You can run A/B tests that turn different campaigns on and off in strategic ways to understand what truly drives incremental impact and what is just throwing money away.
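As a simple illustration of how such a readout works, here is a sketch that compares an exposed group against a holdout group that never saw the campaign; the group sizes and buyer counts below are made up.

```python
# Hypothetical holdout readout: one group keeps seeing the campaign,
# a comparable holdout group does not. All counts are illustrative.
exposed = {"users": 200_000, "buyers": 9_200}   # campaign on
holdout = {"users": 200_000, "buyers": 8_600}   # campaign off

rate_exposed = exposed["buyers"] / exposed["users"]
rate_holdout = holdout["buyers"] / holdout["users"]
lift = rate_exposed - rate_holdout
incremental_buyers = lift * exposed["users"]

print(f"Exposed: {rate_exposed:.2%}  Holdout: {rate_holdout:.2%}")
print(f"Incremental conversion: {lift:.2%} -> ~{incremental_buyers:.0f} buyers the campaign actually drove")
```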
By channeling efforts into what truly resonates, companies can boost their marketing ROI, turning smart targeting into bigger gains.
Example: An online retailer might test two different ad creatives on social media to see which drives more traffic to their site. The more successful ad not only reduces cost per click but also increases overall campaign effectiveness, optimizing marketing spend.
A/B testing is more than a choice between A and B. It's a deep dive into what drives your customers. These data-driven insights let companies craft marketing, products, and experiences that hit home, making every interaction feel tailor-made to their exact target audience.
Example: A streaming service could A/B test two different recommendation algorithms to see which leads to longer viewing times. The successful algorithm provides insights into user preferences, helping to refine future recommendations and enhance viewer satisfaction.
Now that you know that an A/B testing strategy is definitely worth it, it’s time to get into the hard part: Implementation.
Does it need to be hard, though?
With Eppo, implementing A/B tests becomes a breeze, enabling businesses to easily deploy, monitor, and manage experiments without a Ph.D. in data science.
Unlike traditional A/B testing tools, which can only measure surface-level metrics and often return unreliable results, Eppo simplifies the process, delivering trustworthy data and intuitive results that cut right to the heart of what matters: key business outcomes like revenue, retention, or margins.
Here’s how Eppo helps you hit the ground running:
Starting your journey with Eppo begins with a quick demo. Once registered, our warehouse-native platform seamlessly integrates with your existing data warehouse, such as Snowflake, Databricks, Redshift, BigQuery, and more.
After that, use Eppo’s SDKs to actually serve A/B tests to your users and allow your team to start experimenting right away.
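To give a feel for what serving an assignment looks like in application code, here is a rough Python sketch. The package, initialization, and method names below (eppo_client, get_string_assignment, the flag key, and the attributes) are assumptions based on the general shape of Eppo's SDKs and may differ in your SDK version; treat it as an illustration rather than copy-paste code, and consult the official SDK docs.

```python
# Illustrative only: exact imports, Config fields, and method signatures may differ
# from your Eppo SDK version; check the official SDK documentation.
import eppo_client
from eppo_client.config import Config
from eppo_client.assignment_logger import AssignmentLogger

eppo_client.init(Config(api_key="<YOUR-SDK-KEY>", assignment_logger=AssignmentLogger()))
client = eppo_client.get_instance()

# Ask which variant this user should see for a hypothetical "checkout-button-copy" flag
variant = client.get_string_assignment(
    "checkout-button-copy",              # flag/experiment key (hypothetical)
    "user-123",                          # stable subject key, for consistent bucketing
    {"country": "US", "plan": "free"},   # optional targeting attributes
    "control",                           # default served if the flag is off or unavailable
)

button_label = "Buy now" if variant == "buy-now" else "Add to cart"
```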
Once you’ve gotten Eppo set up, you’ll be able to make use of its entire suite of out-of-the-box features:
Planning and goal setting: Craft your A/B testing strategy with clarity, from defining your hypotheses, to sample size planning, to organizing key business metrics, to make sure every test is aligned and impactful (a rough version of the sample-size math is sketched after this list).
Creating and managing variations: Eppo’s end-to-end experimentation platform combines world-class feature flagging with all the bells and whistles that enable advanced test designs, from different environments and waterfall allocations to mutually exclusive layers and holdouts.
Targeting and segmentation: Leverage sophisticated segmentation based on any user data available in your warehouse to ensure experiments are targeted at specific user groups.
Launching tests: Diagnostics and Real-time Metrics features confirm your tests launch correctly, spotting potential issues early, before time and effort are wasted.
Analyzing results: Eppo’s best-in-class statistical engine offers you trustworthy, rigorous, and easily understood results, turning complex data into actionable insights. You can also create slice-and-dice explorations to dive deeper into results on the fly for even more insights.
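To show what sample size planning boils down to, here is a rough back-of-the-envelope calculation using the standard two-proportion approximation; the 3% baseline rate and 0.5-point minimum detectable effect are made-up inputs, and Eppo's planning features handle this kind of math for you.

```python
from statistics import NormalDist

def sample_size_per_variant(baseline_rate, mde, alpha=0.05, power=0.80):
    """Approximate visitors needed per variant to detect an absolute lift of `mde`
    in a conversion rate (standard two-proportion approximation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    p_avg = baseline_rate + mde / 2
    variance = 2 * p_avg * (1 - p_avg)
    return int((z_alpha + z_beta) ** 2 * variance / mde ** 2) + 1

# Hypothetical inputs: 3% baseline conversion, hoping to detect a +0.5pp lift
print(sample_size_per_variant(baseline_rate=0.03, mde=0.005))
# -> roughly 20,000 visitors per variant before the test can be called reliably
```

The takeaway is that smaller effects need much larger samples, which is exactly why planning before launch matters.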