Brief

Go Big and Get Smart: Boosting Your Digital Marketing Effectiveness

If you’re trying to maximize the performance of your digital marketing campaigns, experimental design takes you far beyond the traditional champion/challenger approach known as A/B testing. A/B testing is limited by the fact that you can test only a few variables at a time, so you have to run many tests to determine the best combinations.

Experimental design massively increases the variables that digital marketers can test simultaneously—product offers, website design, merchandising, messages, incentives, interstitials displayed before or after an expected content page, calls to action, and so on. The effect of all these variables can be determined by testing just a few combinations and then using mathematical formulas to model the impact of all the possible combinations.
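To make the mechanics concrete, here is a minimal sketch in Python of the underlying idea: test only a structured fraction of the possible combinations, fit a simple model of each variable’s effect, then predict response for every combination. The four variables, the half-fraction design and the response figures are all hypothetical, chosen purely for illustration, not drawn from any real campaign.

```python
# A minimal, hypothetical sketch of the experimental-design idea:
# test a structured fraction of the combinations, fit a simple model,
# then predict response for every possible combination.
# All variable names and response figures below are invented.
import itertools
import numpy as np

# Four binary campaign variables (hypothetical): 0 = variant A, 1 = variant B.
variables = ["offer", "message", "call_to_action", "interstitial_timing"]

# A half-fraction of the 16 possible combinations (8 tested cells) and the
# response rates observed in each tested cell (invented numbers).
tested = np.array([
    [0, 0, 0, 0],
    [0, 0, 1, 1],
    [0, 1, 0, 1],
    [0, 1, 1, 0],
    [1, 0, 0, 1],
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [1, 1, 1, 1],
])
observed_response = np.array([0.012, 0.019, 0.016, 0.022, 0.025, 0.031, 0.028, 0.041])

# Fit a main-effects model: response = intercept + sum of per-variable effects.
X = np.column_stack([np.ones(len(tested)), tested])
coef, *_ = np.linalg.lstsq(X, observed_response, rcond=None)

# Score all 16 combinations and print the top three predicted performers.
all_combos = np.array(list(itertools.product([0, 1], repeat=len(variables))))
predicted = np.column_stack([np.ones(len(all_combos)), all_combos]) @ coef

ranked = sorted(zip(all_combos.tolist(), predicted), key=lambda pair: -pair[1])
for combo, rate in ranked[:3]:
    settings = {name: "AB"[level] for name, level in zip(variables, combo)}
    print(f"predicted response {rate:.2%} for {settings}")
```

In this toy setup, eight tested cells are enough to estimate each variable’s main effect, so the model can score and rank all 16 possible combinations; a real campaign would involve many more variables, a statistician-built design and interaction terms where they matter.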

Experimentally designed marketing campaigns have been yielding major improvements to direct mail and other physical marketing campaigns for many years. We have seen experimentally designed campaigns increase response and buy rates by three to five times. (See our previous Bain Brief “Act now! Triple your direct marketing effectiveness,” which describes the application of experimental design to direct mail campaigns.)

This approach is even more powerful for digital marketing campaigns. Marketers can launch digital campaigns more quickly and can adjust elements such as formats, messages and offers to tailor campaigns midstream based on what they learn from customer responses. This enables them to significantly improve response rates and, more importantly, sales among the target customer segments (see Figure 1).


Figure 1

Complex customer bases, tough nuts to crack

Many organizations already do A/B and simple multivariate tests with automated software, usually to test simple features such as colors, images and messages on their websites. More complex experimental design is best suited to the tough, marquee marketing issues that companies are trying to address—for example, how to acquire high-value but difficult-to-attract customer groups or how to cross-sell to customers who historically have not bought multiple products. It also works best with companies that have large customer bases, such as telecommunications and cable TV providers, banks, mortgage lenders, online retailers, and credit card providers.

We recently worked with a major US communications service provider to create experimentally designed digital marketing campaigns that would improve cross-selling to existing customers who historically had not bought multiple products. We tested 17 variables—including offers, messages, merchandising, calls to action, and the timing and location of interstitial pages—in 12 different combinations. At the end of the campaign, we modeled the response rates for every possible combination of variables—486 in all (see Figure 2). The best combinations achieved 5 to 10 times the response rate and 2 to 3 times the sales conversion rate of the existing champion offers. The digital team learned exactly which combinations of variables prompted which customer segments to click through and buy.


Figure 2

From concept to results, this experimentally designed marketing test took three months, which is much faster than comparable testing through traditional channels. We were able to start tracking results within a few days of launch, and we were able to monitor response fatigue—that is, the tendency for responses to particular variables to diminish over time. Using real-time feedback, we could refresh the messages and incentives to continually lift response rates.
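As a rough illustration of that kind of monitoring (not the client’s actual tooling), the Python sketch below works from an invented daily log of impressions and responses per variant and flags any variant whose latest response rate has fallen well below its early-campaign baseline; the variant names, figures and thresholds are all assumptions.

```python
# Sketch: flag "response fatigue", i.e. variants whose recent response rate has
# dropped well below their early-campaign baseline. All data and thresholds
# here are hypothetical.
from collections import defaultdict

# daily_log rows: (day, variant, impressions, responses); invented figures.
daily_log = [
    (1, "gift_card_offer", 10_000, 210), (1, "soft_sell_cta", 10_000, 180),
    (2, "gift_card_offer", 10_000, 195), (2, "soft_sell_cta", 10_000, 185),
    (3, "gift_card_offer", 10_000, 150), (3, "soft_sell_cta", 10_000, 190),
    (4, "gift_card_offer", 10_000, 120), (4, "soft_sell_cta", 10_000, 188),
]

history = defaultdict(list)
for day, variant, impressions, responses in sorted(daily_log):
    history[variant].append(responses / impressions)

BASELINE_DAYS = 2    # early-campaign window used as the reference rate
FATIGUE_DROP = 0.25  # flag a variant if its latest rate is 25%+ below baseline

for variant, rates in history.items():
    baseline = sum(rates[:BASELINE_DAYS]) / BASELINE_DAYS
    latest = rates[-1]
    if latest < baseline * (1 - FATIGUE_DROP):
        print(f"{variant}: fatigue flagged (baseline {baseline:.2%}, "
              f"latest {latest:.2%}); consider refreshing the creative")
    else:
        print(f"{variant}: holding up (baseline {baseline:.2%}, latest {latest:.2%})")
```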

We also found that we could draw insights about a customer from analyzing his or her route through the website—actions such as paying the bill, streaming content, tinkering with account settings or other behavior—before a purchase.
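One simple way to begin mining those routes, sketched below with an invented event log, is to tie each session’s pre-purchase actions to whether the session ended in an order; the session IDs, action names and counts are hypothetical.

```python
# Sketch: relate the actions a visitor takes on the site to whether the same
# session ends in an order. Session IDs, action names and figures are hypothetical.
from collections import defaultdict

# (session_id, action) pairs in time order; "order" marks a purchase.
session_events = [
    ("s1", "pay_bill"), ("s1", "view_offer"), ("s1", "order"),
    ("s2", "stream_content"), ("s2", "view_offer"),
    ("s3", "change_settings"), ("s3", "view_offer"), ("s3", "order"),
    ("s4", "pay_bill"), ("s4", "view_offer"),
]

sessions = defaultdict(list)
for session_id, action in session_events:
    sessions[session_id].append(action)

# For each pre-purchase action, count how often sessions containing it ordered.
counts = defaultdict(lambda: [0, 0])  # action -> [sessions seen, sessions with an order]
for actions in sessions.values():
    ordered = "order" in actions
    for action in set(actions) - {"order"}:
        counts[action][0] += 1
        counts[action][1] += ordered

for action, (seen, bought) in sorted(counts.items()):
    print(f"{action}: {bought} of {seen} sessions ended in an order")
```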


Welcome surprises

It’s hard to predict how customers will actually behave from what they say they’ll do or from their past behavior. Part of the power of experimental design lies in the insights gleaned from actual behavior, the predictive power of those insights and the often unexpected results that behavior reveals. In our work with the communications service provider, several findings surprised us:

  • Incentives don’t always spur response. We expected that certain types of gift cards would spur higher response rates. But in fact, these particular gift cards had very little influence. Instead, we found that high-performing variables included two relatively soft-sell calls to action.
  • Response may be delayed. Simply showing a user a targeted pop-up ad page made that person almost three times more likely to place an order later in their online session, relative to a user who did not see the pop-up (see Figure 3).
  • Timing depends on context. The timing of an interstitial offer had much more effect than anticipated, and the right moment depends on the customer’s reason for using the website: checking the account, paying a bill, browsing for new products or another purpose.
  • Offer merchandising can have a big impact. Simply changing the merchandising format and presentation of an offer had a huge impact on response rates for different segments. For example, a good-, better-, best-offer format proved very effective for some segments, while a single-offer format performed better for others.

Figure 3

Is your organization ready?

Before using experimental design for digital marketing campaigns, it’s important to understand how your digital and physical operations work together. Online testing has many advantages over physical campaigns—including a shorter time from design to results, more flexibility in changing test parameters midstream, the ability to analyze insights in real time and a more detailed understanding of a customer’s journey—but digital testing does have particular challenges as well:

  • Collaboration among functions. Finance and Legal often need to be involved in the test design, and business units must commit to and prepare for using the outcomes that a digital marketing test generates. Business unit heads may be reluctant to change current marketing campaigns because they’re worried about downside risk. Any team undertaking experimentally designed marketing tests must prove that the offers being tested will add value and that the findings can be implemented at scale.
  • Managing technical dependencies. Depending on how your organization captures, stores and accesses customer data, you may have to tap into multiple databases and use multiple software vendors. Data extraction and matching, for instance, can be a substantial task. These dependencies need to be closely managed by the IT and digital teams.
  • Developing new or better capabilities. Besides the obvious need for expertise in statistical modeling, successful experimentally designed marketing tests hinge on developing other capabilities. One is the ability to draw up high-definition portraits of the customer segments based on actual customer characteristics held in company databases, not just on research, so that you can target more effectively.
  • Offline training and support. Launching experimentally designed digital marketing tests effectively requires preparing the physical channels to support the campaigns. Salespeople and chat agents may need new scripts to help them manage the customer queries in response to different product offers or to effectively upsell customers to the highest-value products. Other frontline staff, such as bank branch tellers, will need training on how to discuss different product tiers with different types of customers.
  • Faster decision making. When testing variables such as promotions and incentives and their effects on customer response rates, companies should use financial modeling to set thresholds, such as profitability targets, that serve as guardrails for subsequent campaigns (a simple sketch of such a guardrail follows this list). These thresholds help speed up decision making and create a repeatable, efficient test-and-learn model.
  • Building a culture of testing. Many organizations find that employees will initially resist testing because it requires them to change the cadence of their work. But when testing pays off, it increases confidence and can even be fun, so managers should use successful initiatives to help build a culture of testing.
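Picking up the “Faster decision making” point above, the Python sketch below shows one way such a guardrail might look: each tested variant’s expected profit per contact is compared with a minimum threshold before the variant is cleared for the next campaign. The variant names, response and conversion rates, margins and threshold are all invented for illustration.

```python
# Sketch: screen tested offer variants against a profitability guardrail before
# rolling them into the next campaign. All economics here are hypothetical.
THRESHOLD_PROFIT_PER_CONTACT = 0.50  # minimum acceptable expected profit, in dollars

# (variant, response_rate, conversion_rate, margin_per_sale, incentive_cost_per_responder)
variants = [
    ("gift_card_plus_bundle", 0.030, 0.40, 90.0, 25.0),
    ("soft_sell_cta",         0.022, 0.55, 90.0,  0.0),
    ("single_offer_popup",    0.015, 0.35, 90.0,  5.0),
]

for name, response, conversion, margin, incentive in variants:
    # Buyers contribute margin; every responder costs the incentive.
    expected_profit = response * conversion * margin - response * incentive
    verdict = "clear to scale" if expected_profit >= THRESHOLD_PROFIT_PER_CONTACT else "hold back"
    print(f"{name}: ${expected_profit:.2f} expected profit per contact -> {verdict}")
```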

To get started with experimental design, a company can first identify the highest-value marketing problems it is trying to solve and select one of these as a pilot. This allows the company to try out experimentally designed campaigns, demonstrate that they can deliver real results quickly and learn where the operational challenges lie. The company can then start to incorporate such campaigns into its annual marketing plan to systematically address the major opportunities in customer acquisition, development and retention. Software firms such as Optimizely and Maxymiser can help automate online tests, but marketers still need to decide what to test and must keep the organization focused on getting significant financial results. (See below, “Key questions for achieving the full potential of digital marketing.”)

With the increasing use of mobile devices and the shift to digital advertising, maximizing the return on digital marketing campaigns has become more important. It pays to move beyond testing at the margin and to focus on those things that will really move demand. By revealing how different customer segments want to search, learn, purchase and engage, experimental design can help an organization launch much higher-performing digital campaigns.

John Senior, Keri Larkin and Eric Almquist are partners with Bain & Company’s Customer Strategy & Marketing practice. They are based, respectively, in Sydney, New York and Boston.

Key questions for achieving the full potential of digital marketing

Are you: 

  • Using experimentally designed multivariate campaigns to address the tough, marquee marketing issues?
  • Pushing the boundaries on what you test rather than playing at the margins?
  • Harnessing the power of your digital channels for rapid testing and learning?
  • Seeing meaningful changes to your marketing strategies and improved ROI based on the results of your tests?
  • Deploying what you learn on a larger scale? Do you have robust test-and-learn capabilities and processes in place to allow your organization to achieve full potential with marketing?