This is not a blog post telling you how to run a perfect A/B testing campaign. This is a blog post for the stretched marketer: the one without enough time, money, or resources at their fingertips. Inspired by Melanie Kyrklund, who has run multiple A/B testing campaigns at Staples, we offer some useful tips for getting your A/B testing off the ground.
1. I don’t have enough resources to run all the A/B tests that I want to. What should I do?
The inspiration you might feel after a blog post or webinar about the power of A/B testing can quickly evaporate when the reality of your budget and team size hits home. It's a disillusioning moment. The best answer to your question lies in efficiency. Instead of juggling multiple technologies, each with its own budget, why not consider an integrated platform that can test, deploy, and scale across your website? This minimizes the number of different technologies in play and makes the process far smoother. The sooner A/B testing can prove its results, the faster the cost of implementing it can be justified. And the more integrated it is, the more effective the process will become.
2. Other members of management don’t understand what A/B testing is and how it works. It’s hard to get them to dedicate budget to this.
This is quite a common problem for ecommerce and marketing managers everywhere. A/B testing can seem like quite an abstract concept, divorced from the way that the rest of the company works. In reality, for many organizations, testing represents not just a practical problem of budgeting but also a cultural and political shift which comes with its own barriers. It’s key to have an open conversation with your company at large about its benefits as soon as possible.
In fact, A/B testing is not just a tool for ensuring that you can create a better experience online, it can also be a great way to share insights with other parts of the business. Sharing insights has the potential to affect your offline presence. Imagine, for example, that you know that people are buying more of one T-shirt in one specific color than another. This information can then be used to influence in-store stock levels, and even decide what your mannequins are wearing. Go to the rest of the company with concrete examples of wins, impress them and show them the research. This should convert them from A/B skeptics to A/B lovers.
3. I understand that I need a rapid pace of testing to maintain results. But keeping on top of this can be hard, especially when the tests need to be approved.
Yes, you are right: maintaining a rapid pace is important. For larger sites, a few tests a month simply won't generate enough results, even with the correct methodology. Manage this process by building a test pipeline log, which includes your idea, hypothesis, and projected incremental revenue. Prioritizing weekly is key here, as this will help you keep up the momentum of testing and allow you to focus on big wins. However, this can be tricky when timings are involved: sometimes you have less time for analysis. A great way around this problem is to run a simple UX test. For example, test whether people respond better to a list or a grid view on your search results page. This takes less time and keeps your testing program moving, even if it is not the most data-driven option.
Also, ensure that you have your priorities in place. If you're operating across multiple domains or countries, always prioritize your highest-traffic and highest-revenue markets first, but try to gain insights at a local level. Only then branch out to similar markets that could use the same hypotheses. This will ensure that you focus on the biggest potential wins as a priority.
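If you prefer something more concrete than a spreadsheet, the pipeline log and weekly prioritization described above can be sketched in a few lines of code. This is only a minimal illustration: the field names, markets, test ideas, and revenue figures below are all hypothetical, not real data or a real tool.

```python
# A minimal sketch of a test pipeline log, re-prioritized weekly by
# market priority and projected incremental revenue.
# All test ideas and figures are hypothetical examples.
from dataclasses import dataclass


@dataclass
class TestIdea:
    idea: str                  # short description of the change to test
    hypothesis: str            # what we expect to happen, and why
    projected_revenue: float   # projected incremental revenue per month
    market: str                # which market/domain the test runs in


def prioritize(pipeline, top_markets=("US",)):
    """Sort the log so top-market, high-revenue ideas run first."""
    return sorted(
        pipeline,
        key=lambda t: (t.market not in top_markets, -t.projected_revenue),
    )


pipeline = [
    TestIdea("Grid vs. list search results", "Grid view lifts clicks", 12_000, "UK"),
    TestIdea("One-field checkout email", "Less friction lifts conversion", 30_000, "US"),
    TestIdea("Sticky add-to-cart button", "Visibility lifts add-to-cart rate", 8_000, "US"),
]

for t in prioritize(pipeline):
    print(f"{t.market}: {t.idea} (~${t.projected_revenue:,.0f}/mo)")
```

Even a rough log like this makes the weekly prioritization conversation faster: the highest-potential tests in your biggest markets rise to the top, and everything else waits its turn.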
4. Coming up with good hypotheses that are specific enough can be hard. How do I do this on a consistent basis?
To be honest, this is partly a question of practice. The more you test, the easier it will become to come up with good hypotheses. But to help you progress, I would really recommend watching this video of our Data Scientist, Will Browne, in which he gives five top tips for A/B testing, along with insight into which sorts of hypotheses work and which don't. Definitely a must-see.
Instilling a data-driven culture can be a feat, but the results will pay off quite dramatically.