ROTMAN Magazine
Our ‘WRAP’ model can help you
counteract four common thinking biases.
by Chip Heath and Dan Heath
IF YOU SPEND SOME TIME studying the kinds of decisions people make and the outcomes
of those decisions, you’ll find that humanity does not have a particularly
impressive track record. Career choices
are often abandoned or regretted; business decisions are frequently flawed; and
on the personal front, we’re not doing much better: we don’t save enough for
retirement; we let work interfere with our family life; and 88 per cent of New
Year’s resolutions are broken. Why do we
have such a hard time making good choices?
The
most widely-used decision-making process is the pros-and-cons list, whereby you
mentally or physically list the positive and negative attributes of a
particular choice and tally them up. The
advantage of this approach is that it’s deliberative: rather than jump to
conclusions, you hunt for both positive and negative factors—pushing your
mental spotlight around until you feel ready to make a decision.
But
it’s not enough. Research in psychology over the last 40 years has identified a
broad set of biases in our thinking that doom our decision making. If we aspire
to make better choices, we must learn how these biases work and how to fight them.
This article describes the four most pernicious villains of decision making and
suggests some first steps you can take to overcome them.
Villain #1: Narrow Framing
“Any
time in life you’re tempted to think, ‘Should I do this OR that?’, instead, ask
yourself, ‘Is there a way to do this AND that?’
Surprisingly frequently, it’s feasible to do both things.” These are the
words of Steve Cole, the VP of Research and Development at HopeLab,
a non-profit that fights to improve kids’ health using technology. For one
major project, Cole and his team wanted to find a design firm that could help
them design a portable device capable of measuring the amount of exercise kids
were getting.
There
were at least seven or eight local design firms capable of doing the work. In a
typical contracting situation, HopeLab would have solicited a proposal from
each and then given the winner a giant contract. But instead of choosing a
winner, Cole
decided to run a ‘horse race’: he shrank the scope of the work so that it
covered only the first step of the project, and then he hired five different
firms to work on the first step independently.
To be clear, he wasn’t quintupling his budget — as a non-profit, HopeLab
didn’t have unlimited resources. Cole knew that what he’d learn from the first
round would make the later rounds more efficient.
With
this approach, Cole ensured that he would have multiple design alternatives for
the device. He could either pick his favourite or combine the best features of
several. Then, in Round 2 of the design, he could weed out any vendors who were
unresponsive or ineffective. By taking this non-traditional approach, he was
fighting the first villain of decision-making.
Narrow
framing is the common tendency we have to define our choices too
narrowly, to see them in binary terms. We ask, “Should I break up with my
partner or not?” instead of, “What are the ways I could make this relationship
better?” Too often, we get stuck in a narrow frame that spotlights one
alternative at the expense of all the others. Cole, with his approach, broke
out of that trap.
It
wasn’t an obvious move; he had to fight for the concept internally. “At first,
my colleagues thought I was insane. At the beginning, it costs some money and
takes some time. But now everybody here does it. You get to know lots of
different things about the industry. You get convergence on some issues, so you
know they are right, and you also learn to appreciate what makes the firms
different and special. None of this can occur if you’re just talking to one
person. And when five firms know that there are four other shops involved, they
always bring their best game.”
Notice
the contrast with the Pros and Cons approach. Cole could have tallied up the
advantages and disadvantages of working with each vendor, and then used that
analysis to make a decision. But that
would have reflected narrow framing. Implicitly, he would be assuming that
there was one vendor who was uniquely capable of crafting the perfect solution,
and that he could identify that vendor on the basis of a proposal.
There’s
a more subtle factor involved, too: in meeting with the teams, Cole would have
inevitably developed a favourite, a team he ‘clicked’ with. And though
intellectually he might realize that the people he likes personally aren’t
necessarily the ones who will build the best products, he would be tempted to
jigger the Pros and Cons list in their favour. Cole might not even be aware he
was doing it, but because Pros and Cons are generated in our heads, it is very
easy to bias the factors. We think we are conducting a sober comparison but, in
reality, our brains are following orders from our guts.
Villain #2: The Confirmation Bias
Our
normal habit in life is to develop a quick belief about a situation and then
seek out information that bolsters that belief. This problematic habit, called
the confirmation bias, is
the second villain of decision-making.
Here’s
a typical result from a large literature on the topic: Smokers in the 1960s —
back when the medical research on the harms of smoking was less clear — were
more likely to express interest in reading an article headlined “Smoking Does
Not Lead to Lung Cancer” than one with the headline “Smoking Leads to Lung
Cancer.” To see how this could lead to bad decisions, imagine your boss staring
at two research studies headlined “Data that Supports What You Think” and “Data
that Contradicts What You Think.” Guess which one will get cited at the staff
meeting?
Researchers
have found this result again and again: when people have the opportunity to
collect information from the world, they are more likely to select information
that supports their pre-existing attitudes, beliefs and actions. Political
partisans seek out media outlets that support their side but will rarely challenge
their beliefs by seeking out the other side’s perspective. Consumers who covet new cars or computers
will look for reasons to justify the purchase but won’t be as diligent about
finding reasons to postpone it.
The
tricky thing about the confirmation bias is that it can look very scientific.
After all, we’re collecting data. Dan
Lovallo, a professor and decision-making researcher, has said,
“Confirmation bias is probably the single biggest problem in business, because
even the most sophisticated people get it wrong. People go out and collect
data, and they don’t even realize they’re cooking the books.”
At work
and in life, we often pretend that we want the truth when we’re actually
seeking reassurance: “Do these jeans make me look fat?” “What did you think of
my poem?” These questions do not crave honest answers. Pity the poor
contestants who try out to sing on reality TV shows, despite having no
discernible ability to carry a tune. When they get harsh feedback from the judges,
they look shocked. And you realize: this is the first time in their lives
they’ve received honest feedback. Eager for reassurance, they’d locked their
spotlights on the praise and support they received from friends and family.
Given that affirmation, it’s not hard to see why they’d think they had a chance
to become the next American Idol. It was a reasonable
conclusion — drawn from a wildly-distorted pool of data.
This is
what is slightly terrifying about the confirmation bias: when we want something
to be true, we will spotlight the things that support it, and then, when we
draw conclusions from those spotlighted scenes, we’ll congratulate ourselves on
a reasoned decision. Oops.
In his
memoir, Only the Paranoid Survive,
Andy
Grove recalled a tough dilemma he faced in 1985, as the president of
Intel:
whether to kill the company’s line of memory chips. Intel’s business had been
built on memories. For a time, in fact, the company was the world’s only source
of memory, but by the end of the 1970s, a dozen or so competitors had emerged.
Meanwhile, a small team at Intel had developed another product, the microprocessor,
and in 1981, the team got a big break when IBM chose
Intel’s microprocessor to be the ‘brain’ of its new personal computer. Intel’s
team scrambled to build the manufacturing capacity they’d need to produce the
chips.
At that
point, Intel became a company with two products: memory and microprocessors.
Memory was still the dominant source of revenue, but in the early 1980s, the
company’s competitive position in the memory business came under threat from Japanese
companies. “People who came back from visits to Japan told scary stories,” said
Grove. It was reported that one Japanese company was designing multiple
generations of memory all at once: the 16K people were on one floor, the 64K
people were a floor above, and the 256K team was above them.
Intel’s
customers began to rave about the quality of the Japanese memories. “The
quality levels attributed to Japanese memories were beyond what we thought possible,”
said Grove. “Our first reaction was denial: this had to be wrong. As people
often do in this kind of situation, we vigorously attacked the data. Only when
we confirmed for ourselves that the claims were roughly right did we start to
go to work on the quality of our product. We were clearly behind.”
Between
1978 and 1988, the market share held by Japanese companies doubled, from 30 to
60 per cent. A debate raged inside Intel about how to respond to the Japanese
competition. One camp of leaders wanted
to leapfrog the Japanese in manufacturing.
They proposed building a giant new factory to make memory chips. Another
camp wanted to bet on an avant-garde new technology that they thought the
Japanese couldn’t match. A third camp wanted to double down on the company’s
strategy of serving specialty markets.
As the
debate continued, the company began losing more and more money. The
microprocessor business was growing rapidly, but Intel’s failures in memory
were becoming a drag on profits. Grove
summarized the year 1984 by saying, “It was a grim and frustrating year. During
that time, we worked hard without a clear notion of how things were ever going
to get better. We had lost our bearings.”
In the
middle of 1985, after more months of fruitless debate, Grove was discussing the
memory quandary in his office with Intel’s chairman and CEO, Gordon
Moore. They were both fatigued by the internal deliberations. Then
Grove had an inspiration:
I
looked out the window at the Ferris Wheel of the Great America amusement park
revolving in the distance, then I turned back to Gordon and I asked, ‘If we got
kicked out and the board brought in a new CEO, what do you think he would do?’
Gordon answered without hesitation, ‘He would get us out of memories.’ I stared
at him, numb, then said, ‘Why shouldn’t you and I walk out the door, come back
in, and do it ourselves?’
This
was the moment of clarity. From the perspective of an outsider, someone not
encumbered by the historical legacy and the political infighting, shutting down
the memory business was the obvious thing to do. The switch in perspectives —
“what would our successors do?” — helped Moore and Grove see the big picture
clearly.
Of
course, abandoning memory wasn’t easy. Many of Grove’s colleagues were furiously
opposed to the idea. Some held that memories were the seedbed of Intel’s
technology expertise, and that without them, other areas of research were likely
to wither. Others insisted that Intel’s sales force could not get customers’
attention without selling a full range of products — memories as well as
microprocessors.
After
much gnashing of teeth, Grove insisted that the sales force tell their
customers that Intel would no longer be carrying memory products. The
customers’ reaction was, essentially, a big yawn. One said, “It sure took you a
long time.” Since that decision, in 1985, Intel has dominated the
microprocessor market. If, on the day of Grove’s insight, you had invested $1,000
in Intel, by 2012 your investment would have been worth $47,000 (compared with
$7,600 for the S&P 500, a composite of other big companies). It seems safe
to say that he made the right decision.
Grove’s
story reveals a flaw in the way many experts think about decisions. If you
review the research literature, you’ll find that many decision-making models
are basically glorified spreadsheets. If you are shopping for an apartment, for
instance, you might be advised to list the eight apartments you found, rank them
on a number of key factors (cost, location, size, etc.), assign a weighting
that reflects the importance of each factor (cost is more important than size),
and then do the math to find the answer. (Um, move back in with Mom and Dad.)
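To make the ‘glorified spreadsheet’ concrete, here is a minimal sketch, in Python, of the weighted-scoring approach just described. It is an illustration only, not the authors’ method; the apartments, factor scores and weights below are invented.

# A minimal sketch of the weighted-scoring ("glorified spreadsheet") model.
# The apartments, factor scores (1-10) and weights are invented for illustration.
apartments = {
    "Elm St. studio":   {"cost": 8, "location": 5, "size": 3},
    "Main St. one-bed": {"cost": 4, "location": 9, "size": 6},
    "Oak Ave. loft":    {"cost": 2, "location": 7, "size": 9},
}
weights = {"cost": 0.5, "location": 0.3, "size": 0.2}  # cost weighted more than size

def weighted_score(scores):
    # Multiply each factor's score by its weight and sum the results.
    return sum(weights[factor] * value for factor, value in scores.items())

# "Do the math": rank the options by their weighted totals.
for name, scores in sorted(apartments.items(),
                           key=lambda item: weighted_score(item[1]),
                           reverse=True):
    print(f"{name}: {weighted_score(scores):.1f}")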
There’s
one critical ingredient missing from this kind of analysis: Emotion. Grove’s
decision wasn’t difficult because he lacked options or information; it was
difficult because he felt conflicted. The short-term pressures and political
wrangling clouded his mind and obscured the long-term need to exit the memory
business.
Villain #3: Short-Term Emotion
This
brings us to the third villain of decision-making: short-term
emotion. When we’ve got a difficult decision to make, our feelings
churn. We replay the same arguments in our head. We agonize about our circumstances. We change
our minds from day to day. If our decision were represented on a spreadsheet, none
of the numbers would be changing — there’s no new information being added — but
it doesn’t feel that way in our heads. We have kicked up so much dust that we
can’t see the way
forward. In these moments, we need an infusion of perspective.
Ben
Franklin was aware of the effects of temporary emotion. His ‘moral algebra’ wisely suggests that
people add to their Pros and Cons list over several days, giving them a
chance to add factors as they grow more or less excited about a particular idea.
Still, comparing options rigorously is not the same as seeing the
bigger picture. No doubt Andy Grove had been compiling his Pros and Cons list
about whether to exit the memory business for many years. But the analysis left
him paralyzed, and it took a quick dose of detachment — seeing things from the perspective
of his successor — to break the paralysis.
Villain #4: Overconfidence
Our
search for the final villain of decision-making takes us back to January 1,
1962, when a young four-man rock ‘n’ roll group named the Beatles was invited
to audition in London for one of the two major British record labels, Decca
Records. “We were all excited,” said John
Lennon. “It was Decca.” During an hour-long audition, they played
different songs, mostly covers. The Beatles and their manager, Brian
Epstein, were hopeful they’d get a contract, and they waited
anxiously for a response. Eventually they received the verdict: Decca had
decided to pass. In a
letter
to Epstein, Dick Rowe, a prominent talent scout at Decca Records,
wrote, “We don’t like your boys’ sound. Groups are out; four-piece groups with
guitars, particularly, are finished.”
As Dick
Rowe would soon learn, the fourth villain of decision making is overconfidence.
People think they know more than they do about how the future will unfold.
Recall that Andy Grove’s colleagues had dire predictions of what would happen if
Intel stopped making memory chips. We will lose the
seedbed of our R&D. Our sales force can’t
succeed without a full line of products. History
proved them wrong: Intel’s R&D and sales stayed strong. But
what’s interesting is that, at the time they made these proclamations, they
didn’t feel uncertain. They weren’t hedging their remarks by saying, “It’s
possible that…” or “I just worry that this could happen someday…” They knew
they were right. They just knew it.
Elsewhere,
one study showed that when doctors reckoned themselves “completely certain”
about a diagnosis, they were wrong 40 per cent of the time. And in another,
when a group of students made estimates that they believed had only a one per cent
chance of being wrong, they were actually wrong 27 per cent of the time. We have
too much confidence in our own predictions because when we make guesses about
the future, we shine our spotlights on information that is close at hand, and
then we draw conclusions from that information. Overconfidence is an insidious villain.
Counteracting Our Biases
If you
think about a normal decision process, it usually proceeds in four steps:
1. You encounter a choice;
2. You analyze your options;
3. You make a choice;
4. Then you live with it.
What
we’ve seen is that there is a villain that afflicts each of these stages:
• You encounter a choice: but narrow framing makes you miss options;
• You analyze your options: but the confirmation bias leads you to gather self-serving info;
• You make a choice: but short-term emotion will often tempt you to make the wrong one;
• Then you live with it: but you’ll often be overconfident about how the future will unfold.
We
can’t deactivate our biases, but with the
right discipline, we can counteract them. The nature of each villain actually
suggests a strategy for defeating it.
1. You encounter a choice; but narrow framing makes you miss options. So… WIDEN Your Options.
2. You analyze your options; but the confirmation bias leads you to gather self-serving info. So… REALITY-TEST Your Assumptions.
3. You make a choice; but short-term emotion will often tempt you to make the wrong one. So… ATTAIN Distance Before Deciding.
4. Then you live with it; but you’ll often be overconfident about how the future will unfold. So… PREPARE to be Wrong.
The
four steps in our ‘WRAP model’ are sequential — in general, you can follow them
in order — but not rigidly so. For instance, a long-awaited promotion probably
won’t require much ‘Widening’ or ‘Distance’ before you accept and pop the
champagne.
In Closing
At its
core, the WRAP model urges you to switch from ‘autospotlight’ to ‘manual
spotlight’. Rather than make choices based on what naturally comes to your
attention — visceral emotions, self-serving information, overconfident predictions,
and so on — you deliberately illuminate more strategic spots. You sweep your
light over a broader landscape and point it
into hidden corners.
Our
book chronicles several heroes who have successfully foiled the villains of
decision making. For instance, most companies face frequent disappointments in their
hiring process, but we found a few that have tweaked their process in ways that
strongly reduce the failure rate. We interviewed a graduate student who
survived a life-threatening illness, in part by adopting a better process to
collect information from his physicians. And we discovered a simple change in
process that made the strategic decisions of a technology firm six times more likely
to be rated as ‘very successful’ ten years later. All of these cases suggest
that savvy organizations and individuals can overcome the villains by adopting
the right process.
Our
decisions will never be perfect, but they can be better, bolder and wiser. The
right process can steer us toward the right choice. And the right choice, at
the right moment, can make all the difference.
Chip Heath is the Thrive Foundation for Youth Professor of
Organizational Behaviour at the Stanford Graduate School of Business. He is the
co-author, with his brother Dan, of three books, most recently, Decisive:
How to Make Better Choices in Life and Work (Crown
Business, 2013).
Dan Heath is a Senior Fellow at Duke University’s CASE center, which supports
social entrepreneurs. Excerpted from Decisive.
Copyright © 2013 Chip and Dan Heath.
Published by Random House Canada, an imprint of the Knopf Random Canada
Publishing Group, which is a division of Random House of Canada Limited.
Reproduced by arrangement with the Publisher.
All rights reserved.