
Brief

Act now! Triple your direct marketing effectiveness

Leading marketers now have a far more powerful and sophisticated technique to test direct marketing campaigns.

We all have email inboxes littered with subject lines like the headline above. If they inspire any action, it’s usually to hit the Delete key.

Hyperbole and exclamation marks aren't going to break through the clutter. But thanks to recent advances in statistical methods, modeling and analytics, leading marketers now have a far more powerful and sophisticated technique called experimental design.

The method allows marketers to increase exponentially the variables tested in a single campaign (product offers, messages, incentives, mail formats and so on) and to test multiple offers in the market simultaneously. Marketers learn exactly which variables entice consumers to act. As a result, response rates rise dramatically, the effectiveness of future campaigns improves and overall return on spending increases.

Companies spend huge sums on direct marketing to acquire or retain customers through email, direct mail, catalogs and other tactics. Yet the return on investment (ROI) is often poor because the response rates tend to be very low—usually less than 5% and often less than 0.5%—and rates have been declining.


Many organizations still use the traditional champion-challenger approach, also called A/B testing, to test new offers, because it’s relatively easy to execute and can be scaled up quickly. However, it has severe limitations: Only a few offers can be tested at a time, the variance is low so regression results are not meaningful, and one cannot identify which individual variables cause consumers to respond.
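
To make the limitation concrete, here is a minimal sketch of how a champion-challenger result is typically read: a two-proportion comparison between the incumbent offer and a single challenger. The numbers are illustrative, not drawn from any client campaign.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare the response rates of a champion (A) and a challenger (B)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled response rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))          # two-sided p-value
    return p_a, p_b, z, p_value

# Illustrative volumes: 0.50% champion response vs. 0.65% for the challenger
print(two_proportion_z_test(conv_a=500, n_a=100_000, conv_b=650, n_b=100_000))
```

Even when the challenger wins such a test, the result says nothing about whether the format, the incentive or the message produced the lift, which is precisely the question experimental design is built to answer.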

Experimental design offers a better approach. In Bain & Company’s work with clients, we’ve seen experimental design-based multivariate marketing campaigns increase consumer response rates by three to eight times, adding hundreds of millions of dollars to the top and bottom lines.

Why test one thing when you can test many things?

Experimental design massively and deliberately increases the amount of variance in direct marketing campaigns. It lets marketers project the impact of many variables by testing just a few of them. Mathematical formulas use a subset of combinations of variables to represent the complexity of all the original variables. That allows the marketing organization to more quickly adjust messages and offers and, based on the responses, to improve marketing effectiveness and the company’s overall economics (see Figure 1).


Figure 1
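
To illustrate the underlying idea, the sketch below builds a half-fraction of a small factorial design and fits a main-effects model that projects response for every combination, including cells that were never mailed. The four variables, their codes and the response rates are hypothetical placeholders, not the design behind Figure 1.

```python
import itertools
import numpy as np

# Four hypothetical binary campaign variables, coded -1 / +1:
# incentive framing, mail format, message theme and reply channel.
factors = ["framing", "format", "message", "channel"]

# Half-fraction of the 2^4 full factorial: keep the 8 of 16 runs whose
# codes multiply to +1 (defining relation I = ABCD).
full = list(itertools.product([-1, 1], repeat=4))
fraction = [run for run in full if np.prod(run) == 1]

# Illustrative response rates (%) observed for the 8 tested cells.
observed = np.array([0.6, 1.1, 0.9, 1.8, 0.7, 1.3, 1.0, 2.1])

# Fit a main-effects model: response = b0 + sum_i(b_i * x_i).
X = np.column_stack([np.ones(len(fraction)), np.array(fraction)])
coef, *_ = np.linalg.lstsq(X, observed, rcond=None)

# Project response for every cell of the full design, tested or not.
X_full = np.column_stack([np.ones(len(full)), np.array(full)])
projected = X_full @ coef

best = dict(zip(factors, full[int(np.argmax(projected))]))
print(best, round(float(projected.max()), 2))
```

Because the fraction is balanced, each coefficient can be read as that variable's isolated effect on response, which is exactly the attribution a champion-challenger test cannot provide.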

Isolating the behavioral effect of each variable serves to confirm or refute the marketer’s hypotheses or the rules of thumb often used to support decisions. “$200 cash back” might seem like a winning promotion, but response rates will vary depending on the customer segment targeted, how the incentive is physically presented and other factors.

This approach differs from data mining because it is based on forward-looking experimentation. To really know what combination of messages, offers and incentives will work, you must define and control the variables before putting them into the marketplace, rather than mining data after the fact. And because you control the introduction of variables, you can attribute differences in response to the specific variables in question with great precision. In other words, experimental design reveals whether variables caused a certain behavior as opposed to simply being correlated with that behavior.

A cable company’s experience

Experimental design can be used most effectively by organizations that market directly to a large number of customers, including telecoms, cable and media firms, banks, insurance companies, online retailers and credit card providers. Let’s look at how it improved the performance of direct marketing campaigns at a major cable company.

Faced with maturing and increasingly competitive markets, the company wanted to grow its share of high-value customers through several means: accelerating acquisition of those customers, including getting them to switch from competitors; reducing the defection of existing customers; and upgrading current customers to higher-value products. Direct marketing would be a central means of accomplishing each of these goals.

Past direct marketing campaigns were based on traditional champion-challenger tests. These had become less effective and more expensive: the cable company was mailing to several million households every quarter, yet response and conversion rates kept declining. It decided it needed a new approach and worked with Bain to develop and launch an experimental design-based multivariate campaign.

Together, we created a direct marketing campaign to test 18 variables, including different formats, promotions and messages, and then launched 32 marketing offers simultaneously to the target customer segment. Mathematical formulas allowed us to model response rates for every possible combination of variables—576 in all—even combinations that were not launched in the market. The best offers achieved three to four times the response rate of the existing champion offer, and the marketing organization learned which variables caused consumers to respond.
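
The same projection logic can be sketched with an individual-level response model. The snippet below is purely illustrative: it simulates a mail file for 32 hypothetical offers, fits a logistic regression on the households that received them, and then scores candidate combinations that were never launched. None of the variables, volumes or effect sizes are taken from the cable company's campaign.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical encoding: each launched offer is a vector of 18 binary
# attributes (promotion, equipment tier, message, format and so on).
n_offers, n_features = 32, 18
offer_designs = rng.integers(0, 2, size=(n_offers, n_features))

# Simulated mail file: one row per household, with a 0/1 response flag.
mail_per_offer = 5_000
X = np.repeat(offer_designs, mail_per_offer, axis=0)
true_effect = rng.normal(0, 0.3, n_features)                 # unknown in practice
p = 1 / (1 + np.exp(-(X @ true_effect - 5)))                 # sub-1% baseline response
y = rng.binomial(1, p)

# Fit a response model on the offers that were actually mailed...
model = LogisticRegression(max_iter=1000).fit(X, y)

# ...then score candidate combinations, launched or not (a stand-in for
# enumerating the 576 feasible combinations described above).
candidates = rng.integers(0, 2, size=(576, n_features))
scores = model.predict_proba(candidates)[:, 1]
print("best projected response rate:", round(float(scores.max()) * 100, 2), "%")
```

In a real campaign the candidate set would be the enumerated grid of feasible offers rather than random draws, and the model would typically include interaction terms for variables expected to reinforce or cancel each other.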

The test uncovered several unexpected results (see Figure 2). For example, the richest offers, including expensive equipment, were hypothesized to spur the highest response rates but actually performed worse than other offers that were less expensive. Instead, the factors that sparked the highest response rates included the packaging of the incentive, the content of the message and the format of the mailing. The campaign also converted a much higher proportion of customers to high-value packages than had previously been achieved, which increased average revenue per user, a key metric in the industry, by 20%.


Figure 2

The company has incorporated the best-performing tests into national campaigns with significant results: millions of dollars to the top and bottom lines. The organization has also built in-house capabilities and is now using experimental design to create new campaigns targeting different customer segments.

Organizing for results

Companies don’t simply run one test and instantly reap the rewards—technology alone cannot make marketing more effective. The experiences of companies we’ve worked with highlight which areas in the organization typically merit upgrading.

Capabilities. Most obvious are capabilities in experimental design and statistical modeling, to structure effective multivariate marketing campaigns and interpret the results.

Besides statistical capabilities, success with experimental design entails a deep understanding of customers’ priorities and the ability to draw up meaningful customer segments based on needs and behavior rather than on straight demographics, location or income. That allows marketers to develop relevant messages, offers and incentives.

At the cable company, for instance, one segment consisted of families who value technology that allows them to watch TV in any room. Targeting this segment with messages and offers about that technology improves response rates because it resonates with their needs. But for a different segment of young households that values simplicity and price, technology messages would be far less effective.

Companies can also take advantage of a body of useful research in behavioral economics. The teaser message “don’t miss out” derives some of its power from the phenomenon of loss aversion—people’s tendency to prefer avoiding losses over acquiring gains.

Some companies may have to expand their customer data sets to include, say, data captured at the point of sale. And any company will need to perform robust financial modeling of alternative product offers to ensure that all the offers are profitable.
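
A minimal sketch of that kind of offer-level economics might look like the following. Every figure (incentive costs, response rates, margins, mail costs and the profit hurdle) is an illustrative assumption to be replaced with a company's own financials.

```python
# Offer-level profitability check; every number here is an illustrative assumption.
offers = [
    # name, incentive cost per responder, expected response rate, expected margin per acquired customer
    ("$200 cash back",         200.0, 0.012, 900.0),
    ("Free premium equipment", 150.0, 0.009, 900.0),
    ("Waived installation",     60.0, 0.010, 900.0),
]
mail_cost_per_piece = 0.55
mail_volume = 100_000
profit_hurdle = 0.0   # senior management sets this threshold per campaign

for name, incentive, response_rate, margin in offers:
    responders = mail_volume * response_rate
    profit = responders * (margin - incentive) - mail_volume * mail_cost_per_piece
    verdict = "launch" if profit > profit_hurdle else "rework"
    print(f"{name:<24} projected profit: ${profit:>10,.0f}  -> {verdict}")
```

Running every candidate offer through the same model before launch keeps an attractive-looking incentive from eroding margin once realistic response rates are factored in.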

Processes and training. Experimental design often involves process changes to efficiently launch multivariate tests and roll out the resulting insights into subsequent campaigns, as they are more complex to manage than champion-challenger tests. New scripts and training for salespeople and call center agents may be warranted to manage the customer calls in response to different offers, to effectively upsell customers to the highest-value products and to deliver a high-quality customer experience.

Decision making. Based on financial modeling, senior managers should put in place financial thresholds, such as profitability targets, to provide guidelines for subsequent marketing campaigns. These thresholds will help speed up decision making and create a repeatable, efficient test-and-learn model.

Proficiency in each of these areas augments the practical application of the science (see sidebar). As the head of sales and marketing at one media and communications company says: “The value in experimental design is 20% core science, 80% process and change management and the tools Bain is leaving with us.”

All channels open

With the rapid spread of mobile devices and social networks, marketers have more communication alternatives than ever before. That makes for greater opportunities in direct marketing—if companies can uncover which attributes of the campaign actually influence customer behavior. More communication alternatives also make for greater complexity, and the onus is on companies to make their many channels work seamlessly together, so that customers get a consistent brand experience from one channel to another.

Experimental design can be used in any type of digital or physical campaign that reaches large numbers of customers or prospects. Now it’s possible to tease out how each marketing channel interacts meaningfully with another variable, such as a free sample or a coupon offered through an email campaign. By harnessing the power of massive variance, experimental design matches exactly the right offer with the right customer, giving a healthy boost to the top and bottom lines.

Lessons learned from multivariate campaigns

  • Predictive modeling, classification models and behavior-based customer segmentation help target the right customers.
  • The richest offers don’t necessarily cause the highest response rates; signals of good value matter more.
  • The effects of some offer attributes differ depending on competitors’ positions in the marketplace.
  • Meticulous test monitoring ensures that response data is captured accurately.
  • Offer fatigue lowers response rates; performance improves as a result of rotating offers.
  • The financial effects of promising product offers should be estimated to ensure that the offers create value instead of destroying it.
  • Salespeople and call-center agents may need new scripts and training in the wake of the refined campaign.