Experimentation in marketing is almost a century old, and there have been some world-changing examples along the way. Pepsodent toothpaste got the ball rolling in the 1920s, testing the effectiveness of magazine and newspaper ads using nothing more than coupons in the copy. Predictably, it took a science-based company to pioneer those first A/B split tests, and Google uses exactly the same principles today to optimise AdWords. And how can anyone overlook the impact of the first Pepsi Challenge taste test in 1975?
A/B testing is the gold standard of experimentation. It is meant to help companies make faster, better, data-driven decisions. But too often, it does the opposite. The meeting starts with optimism: a new pricing idea, ad layout, or signup screen goes into an A/B test. After waiting for weeks, analysts come back with p-values, 95% confidence thresholds, and a familiar conclusion: "We should wait for more data. We don't have enough evidence yet, and it's not statistically significant."
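To make that verdict concrete, here is a minimal sketch of the calculation analysts typically bring back: a two-proportion z-test comparing a control and a variant. The conversion numbers below are invented for illustration and aren't drawn from any report cited here.

```python
import math

def ab_test_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for a two-proportion z-test.

    conv_a / conv_b: conversions in each variant
    n_a / n_b: visitors in each variant
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    return math.erfc(abs(z) / math.sqrt(2))

# Hypothetical traffic: control converts 500/10,000, variant 540/10,000
p = ab_test_p_value(500, 10_000, 540, 10_000)
print(f"p-value: {p:.3f}")  # roughly 0.2, above 0.05, so "not significant" yet
```

An 8% relative lift on a 5% baseline still fails the conventional 95% threshold at this sample size, which is exactly the situation that produces the "wait for more data" answer.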
Last year, a study found that cars are steadily getting less colourful. In the US, around 80% of cars are now black, white, grey, or silver, up from 60% in 2004. This trend has been attributed to cost savings and consumer preferences. Whatever the reasons, the result is hard to deny: a big part of daily life isn't as colourful as it used to be.
SMB marketers in the B2B sector say conversion rate optimisation (CRO) is their top A/B testing priority in 2025, but a new report found most aren't running A/B tests to improve it. The "State of A/B Testing Report" from Unbounce found that 56% of B2B SMB marketers cite CRO as their top testing goal, yet only 32% are running A/B tests on their landing pages.