Brief

Why we behave—and decide—the way we do

Here are four biases that can hobble even the most thoughtful decision makers.

Organizational ailments, such as too much complexity, often interfere with good business decision making and execution. But they aren’t the only source of trouble. Even in the best of circumstances, people must ultimately make and execute decisions, and we human beings are even more complicated than a tangled org chart or a messy decision process. We are prisoners of emotions, habits and biases. We choose A rather than B for reasons that we often don’t understand. These pitfalls can ensnare individuals who are making decisions; they can also cause groups to go astray.

The good news is that psychologists and behavioral economists have been studying why people decide the way they do. In this article we’ll look at individual behaviors, highlighting just four of the many obstacles that these scholars have identified. If you’re aware of the traps, you are far less likely to be snared by them—and your decisions and actions will be that much better.

1. Fairness. It’s a familiar story, known to behavioral economists as a version of the “ultimatum game.” A bored rich lady sits between two strangers—call them Robert and Juliette—on a plane. For entertainment, she offers to give Robert $10,000, with the proviso that he must make a one-time binding offer to give some of it to Juliette. If Juliette accepts Robert’s proposed split, they divide the money accordingly. If she rejects it, the rich lady keeps her money, and Robert and Juliette get nothing.

So how much does Robert offer? In theory, he could offer Juliette as little as $10, and a rational Juliette would accept, because it is, after all, free money. In practice—and the experiment has been conducted many times—people in the Juliette role regularly reject any offer they deem unfair. Fairness, a powerful moral principle, plays a big role in decision making, often proving stronger than self-interest.
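To make the fairness effect concrete, here is a minimal simulation sketch in Python. It is illustrative only: the uniform distribution of fairness thresholds is an assumption for the sake of the example, not an empirical estimate from the ultimatum-game literature.

```python
import random

POT = 10_000  # the rich lady's stake

def responder_accepts(offer, threshold):
    """A behavioral responder rejects any offer below her personal
    fairness threshold, even though rejecting forfeits free money."""
    return offer >= threshold

def expected_take(offer, trials=100_000):
    """Proposer's average payoff against responders whose fairness
    thresholds are drawn uniformly between $0 and half the pot
    (an illustrative assumption)."""
    total = 0
    for _ in range(trials):
        threshold = random.uniform(0, POT / 2)
        if responder_accepts(offer, threshold):
            total += POT - offer
    return total / trials

# The "rational" lowball offer versus progressively fairer splits:
for offer in (10, 1_000, 3_000, 5_000):
    print(f"Offer ${offer:>5,}: proposer expects about ${expected_take(offer):,.0f}")
```

Under these assumptions, the $10 offer nets Robert almost nothing, because it is nearly always rejected, while offers approaching an even split maximize his expected take. That is the lesson real proposers seem to intuit: lowballing a responder who cares about fairness is a losing strategy.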

You can see this phenomenon in business as well: any decision that people regard as unfair, such as paying bonuses to executives while laying off lower-level employees, is likely to trigger a sharp reaction.

2. Confirmation bias. This is a version of what psychologists sometimes call “motivated reasoning”—we seek out and believe information that confirms our opinions, while ignoring or downplaying information that contradicts them.

Many experiments have confirmed this tendency. A few years ago, for instance, Drew Westen and colleagues at Emory University in Atlanta recruited 15 Republicans and 15 Democrats and presented them with examples of contradictory behavior by each of the two major candidates in the 2004 US presidential election, along with statements designed to explain away the contradictions. For example, George W. Bush once said he “loved” Enron CEO Ken Lay, but after Enron’s collapse he criticized the company and avoided any mention of Lay. The accompanying explanation was that Bush had felt betrayed by Lay and was shocked to learn of Enron’s corruption. Each set of partisans tended to accept the explanations offered for their own candidate, while continuing to judge the opposing candidate’s behavior as inconsistent.

In big decisions, individuals can easily fall into confirmation bias, jeopardizing the possibility of reaching the best outcome.

3. Framing and anchoring. Every decision depends on information. The structure and reference points of that information shape how the decision maker receives and uses it. Chief executives contemplating an acquisition, for instance, often frame the question as “Why should we do this deal?” and then answer it by focusing on potential but often illusory synergies. If they frame it instead as “How much should we be willing to pay?” the decision can turn out quite differently.

Anchoring—using a predetermined reference point as the launch pad for a decision—is equally powerful. A few years ago, for instance, Wharton School professor Paul J. H. Schoemaker was studying bad loans at a bank in the southern US. He found that bank officers assessing a loan naturally began by determining its current rating and asking themselves whether they should upgrade or downgrade it. Because of the anchoring effect of the current rating, as a report on Schoemaker’s work noted, downgrades tended to be incremental adjustments. So “by the time a loan was classified as troubled, it could be too late to take remedial action.”

4. Overconfidence. People everywhere tend to see their own abilities in an unrealistically positive light. Some 93% of US drivers famously say they are better than average. Countless sales managers regularly predict double-digit annual gains, especially in the out years, hence the prevalence of hockey-stick forecasts.

Overconfidence often leads to terrible decisions, and not just in business. Consider the invasion of Gallipoli in 1915, which British officers thought would be an easy victory. “Let me bring my lads face to face with Turks in the open field,” wrote General Sir Ian Hamilton, the expedition’s commander, in his diary. “We must beat them every time because British volunteer soldiers are superior individuals to Anatolians, Syrians or Arabs....” The British were decisively defeated at Gallipoli, notes Malcolm Gladwell in the New Yorker, partly because of such overconfidence.

Analyze any bad decision and you are likely to find more than one of these biases at work, each reinforcing the others. Consider the tragic 1986 decision to launch the space shuttle Challenger in spite of unusually cold weather. Confirmation bias? NASA determined that previous flights had been successful, even though the seals on the solid rocket booster showed unexplained signs of erosion. Overconfidence? Management estimated that the chances of shuttle failure were as little as 1 in 100,000—low enough, as the late physicist Richard Feynman pointed out, to “imply that one could put up a shuttle every day for 300 years expecting to lose only one.” As for framing, Jim Collins, in How the Mighty Fall, notes that the crucial go/no-go decision in the Challenger situation was framed as, “Can you prove it’s unsafe to launch?” Reversing the framing—“Can you prove it’s safe to launch?”—might have led to a different decision.
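Feynman’s arithmetic is a quick back-of-the-envelope check:

$$
300 \text{ years} \times 365 \text{ launches/year} \approx 110{,}000 \text{ launches}, \qquad 110{,}000 \times \frac{1}{100{,}000} \approx 1.1 \text{ expected losses}.
$$

At the claimed odds, in other words, launching daily for three centuries produces an expectation of just over one lost orbiter, exactly the absurdity he was pointing to.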

What to do about decision bias? It helps, of course, to be on the lookout for its sources, and to try to compensate accordingly. Organizations can also create robust decision processes that acknowledge and address the biases. They can frame questions in such a way as to pressure-test assumptions. They can explicitly assign the role of devil’s advocate, or even create a “red team, blue team” debate so that both sides of a major issue are fully represented. Of course, the human brain is more complex than any organization, and people will doubtless continue to cling to their biases. But robust countermeasures can at least minimize the likelihood that biases will lead to poor decisions.

Paul Rogers is a partner with Bain & Company in London and leads Bain’s Global Organization practice. Robert Carse and Todd Senturia are Bain partners based in London and Los Angeles, respectively.
