Start With What We Know
“One accurate measurement is worth more than a thousand opinions” – Admiral Grace Hopper
“Almost any question can be answered cheaply, quickly and finally, by a test campaign. And that’s the way to answer them – not by arguments around a table. Go to the court of last resort – buyers of your products” – Claude Hopkins, Scientific Advertising, 1922
A/B testing gives us precise insight into what works and what does not. In their guide to controlled web experiments, the Microsoft researchers coined the term HiPPO (Highest Paid Person's Opinion) and directed us to "listen to customers, not HiPPOs." It is a memorable phrase to remind us that customers know best and are the most direct path to learning and improving ROI. Accurate measurement and listening to and learning from customers are the two defining traits of A/B testing. Still, knowing the do's and don'ts of A/B testing can be easier said than done; resources like Smashing Magazine's eBook "The Ultimate Guide to A/B Testing" provide insight into that world.

We are quick to conclude that A/B testing is amazing and easy to use. Rightfully so, as the Microsoft researchers point out: "[f]ailing fast and knowing that an idea is not as great as was previously thought helps provide necessary course adjustments so that other more successful ideas can be proposed and implemented." I think this is where the money is. When a company builds an experimentation system that is proven to work and is actually followed by employees, the company as a whole is encouraged and empowered to seek improvement through change. A/B testing, used correctly, lets companies "fail fast," yet quickly rethink, innovate, and move on to the next chance to improve. As the Microsoft guide points out, a proven success story of building this kind of fast-moving experimentation system is Amazon, whose company culture allows "data to trump intuition."
How video on the landing page impacts conversion rates:
In the graph on the right, challengers D and H were landing pages with video, versus challenger J with no video at all. Without question, the video variants had higher conversion rates. The difference between D and H was that H used an embedded video, whereas D opened the video in a lightbox modal pop-up player.
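As a rough sketch of how a comparison like D versus J would be checked for significance, here is a two-proportion z-test in Python. The visitor and conversion counts are hypothetical, since the article's graph is not reproduced here:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test (normal approximation) comparing the
    conversion rates of two landing-page variants."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Invented numbers: challenger D (video) vs challenger J (no video),
# 1,000 visitors each.
z = two_proportion_z(180, 1000, 120, 1000)
print(round(z, 2))  # 3.76 -- well past the 1.96 threshold for p < 0.05
```

A z-score above roughly 1.96 means the difference is unlikely to be noise at the usual 5% significance level, which is the kind of evidence that lets "data trump intuition."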
If that is what we know and strive for (a fast-flowing experimentation system that allows for more innovation), then we have to talk about potential hiccups. Where Smashing Magazine's eBook lists the do's and don'ts of A/B testing, I think this HubSpot article by Joe Lazauskas does a good job of hitting the five things to avoid or remember in A/B testing. The five things to avoid are:
1. Blindly optimize for conversion
This is the whole notion of making sure you have well-thought-out control and treatment variables, an overall evaluation criterion, test subjects, and everything else that goes into a well-run controlled experiment. Lazauskas warns against focusing on "conversion without context." It's important to step back and remember this so that you stay on the path to a fast-flowing experimentation system that produces solid results. You can't just A/B test everything; you have to do the planning beforehand.
2. Create the test before you set your goals
This is what I just touched upon. Basically, “when two versions of an idea pop into your head, it can be very tempting to just jump in to a test willy nilly, which leads to unclear results.”
3. Forgetting there are other letters in the alphabet
Another obvious thing to avoid, but it needs to be said. I read it as a reminder not to limit your ideas to just two options. As long as the planning and goal-setting are in place, A/B/C/D… testing is fine; it's okay to have more than one treatment or new version.
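A multi-variant test is mechanically no harder than a two-variant one; each arm just needs its own traffic and a shared goal metric. A minimal sketch with invented counts:

```python
# Hypothetical A/B/C/D test: four landing-page variants, each shown to
# 1,000 visitors, compared on a single pre-agreed goal (conversions).
arms = {"A": (120, 1000), "B": (150, 1000), "C": (90, 1000), "D": (170, 1000)}

def rate(conversions, visitors):
    return conversions / visitors

winner = max(arms, key=lambda name: rate(*arms[name]))
print(winner)  # D, at a 17% conversion rate
```

Note that with more arms each one receives less traffic, so reaching significance takes longer; that is part of the planning the post keeps stressing.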
4. Ignoring more advanced metrics
I thought this was the most important point, because it shows that A/B testing requires deeper thought. I liked that Lazauskas used the movie Moneyball and GM Billy Beane as an example of how to use and think about more advanced metrics: the A's found a way to win not with traditional stats but with sabermetrics. So in A/B testing, don't rely only on benchmarks like page views; also look at metrics like social sharing. "Think about it. What would you rather have — 3,000 visitors who leave your content without doing anything, or 1,000 visitors who engage with your content?" – Lazauskas
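Lazauskas's 3,000-versus-1,000 comparison can be made concrete: rank variants by an engagement rate instead of raw traffic. The numbers below are invented to mirror his scenario:

```python
# Hypothetical metrics illustrating "conversion with context":
# raw visitor counts vs. an engagement-based view of the same variants.
variants = {
    "A": {"visitors": 3000, "engaged": 90},   # lots of traffic, little engagement
    "B": {"visitors": 1000, "engaged": 400},  # less traffic, far more engagement
}

def engagement_rate(v):
    return v["engaged"] / v["visitors"]

by_views = max(variants, key=lambda k: variants[k]["visitors"])
by_engagement = max(variants, key=lambda k: engagement_rate(variants[k]))
print(by_views, by_engagement)  # A wins on views, B wins on engagement
```

The "winner" flips depending on which metric you declared up front, which is exactly why the evaluation criterion has to be chosen before the test starts.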
5. Failing to separate mobile and desktop
Finally, remember to run and analyze A/B tests separately for mobile and desktop rather than lumping all platforms together. It's a simple point, but another example of how A/B testing isn't necessarily plug-and-play. There is a lot of planning to do in order to execute properly.
What I like most about A/B testing is that it can lead to success stories like the previously mentioned Amazon. With a good experimentation system in place, a company can use A/B testing to quickly learn what fails and what does not in the customer's eyes. Being able to run this process quickly, without breaking anything, saves time, which in turn leaves more time for innovation and improvement. HubSpot's list of five is a reminder that planning before testing is extremely important, even when it may seem obvious.