Michael Mauboussin, Think Twice: Harnessing the Power of Counterintuition (Page 5)
But self-justification is a big problem if the stakes are high. A scan of history shows the deplorable behavior of malevolent dictators, religious extremists, and cheating executives who justified their own behavior as they harmed others. Here are some examples that show the lengths the mind will go to in order to resolve internal conflict.
While other eighth graders hoped to be astronauts or firefighters, Kurt Wise dreamed of getting a PhD from Harvard and teaching at a big university. After receiving an undergraduate degree from the University of Chicago, Wise realized part of his dream by earning a doctorate at Harvard in geology as a student of Stephen Jay Gould, the well-known paleobiologist. Wise’s thesis complemented the fossil record by offering a statistical method to infer the period in which a particular species lived, often millions of years ago. His contribution was fully consistent with well-established evolutionary theory.
Why mention Wise’s credentials? In direct contradiction to his scientific training, Wise is a “young earth creationist,” someone who believes the literal Bible account of God creating the earth only a few thousand years ago. The conflict in Wise’s mind boiled to the point where he decided to painstakingly go through the Bible and cut out every verse that was inconsistent with evolutionary theory. The project took Wise months, and when he finished, the truth he dreaded was clear: there was not much left of the Bible. So he had to decide between evolution and scripture. He chose scripture. As he recounts, “It was there that night that I accepted the Word of God and rejected all that would ever counter it, including evolution.” He went on, “If all of the evidence in the universe turned against creationism, I would be the first to admit it, but I would still be a creationist. Here I must stand.”16
In the mid-1950s, a trio of scientists including Festinger took note of a small Illinois-based group that claimed to receive lessons from spiritual beings on other planets. The scientists infiltrated the group and gathered firsthand accounts of the meetings and events. Over time, the members came to believe that beings from outer space had delivered an ominous prophecy: a world-destroying flood would occur on December 21. The good news was that spaceships would descend at midnight and rescue the cult’s believers.
Prior to the doomsday date, the cult members displayed two seemingly contradictory behaviors. On one hand, they maintained or escalated their commitment to the group by quitting their jobs, stopping their studies, and giving away their possessions in anticipation of their new lives. On the other hand, they did little to help outsiders, beyond sharing vague news of impending disaster.
On the evening of December 20, the believers gathered at the house of Marian Keech, one of the group’s spiritual leaders, waiting for the spacemen. When midnight came and passed, the members grew unsettled. During a break after 4 a.m., Thomas Armstrong, also a cult leader, confided in one of the infiltrators, “I don’t care what happens tonight. I can’t afford to doubt. I won’t doubt even if we have to make an announcement to the press tomorrow and admit we were wrong.”17 Note the striking similarity between Wise’s and Armstrong’s comments.
Around 4:45 a.m., a message came to Mrs. Keech: the cult had been such a “force of Good” that the world was spared its awful fate. This provided a spark for the group’s spirit, but a second missive shortly thereafter made an even bigger difference. The cult was to release this “Christmas message” to the newspapers immediately and in complete detail. So the exhausted members, led by Mrs. Keech, started calling newspapers, radio stations, and wire services. From then on, the cult opened up. “The house was crowded with the now-welcome horde of newspaper, radio, and television representatives,” the scientists wrote, “and visitors streamed in and out the door.”18
While cognitive dissonance is about internal consistency, the confirmation bias is about external consistency. The confirmation bias occurs when an individual seeks information that confirms a prior belief or view and disregards, or discounts, evidence that counters it.19 Robert Cialdini, a social psychologist at Arizona State University, notes that consistency offers two benefits. First, it permits us to stop thinking about an issue, giving us a mental break. Second, consistency frees us from the consequence of reason—namely, changing our behavior. The first allows us to avoid thinking; the second to avoid acting.20
When radio became popular in the 1920s and 1930s in the United States, some psychologists worried that the media would infect a vulnerable public with ideas. The fear was that everyone hearing the same message simultaneously might spark some massive, unintended, coordinated behavior. Elihu Katz and Paul Lazarsfeld, prominent sociologists, refuted this view. Their work showed that people pretty much kept on doing what they were doing before, irrespective of the media.21
When Katz and Lazarsfeld studied why the media had such a muted influence on individuals, they discovered that people are selective in their exposure and retention. In effect, most people see and hear what they want and tune out everything else. For example, a government memo explained the demands of former Vice President Dick Cheney when he went to a hotel. These included four cans of Diet Sprite, a pot of decaffeinated coffee, a room temperature of sixty-eight degrees, and all televisions tuned to Fox News, the channel that most closely reflected his views.22 This facet of the confirmation bias, selective exposure and retention, minimizes our exposure to diverse ideas.
Drew Westen, a psychologist at Emory University, and his colleagues did a study of selective exposure and retention among political partisans. The researchers gave staunch Democrats and Republicans surveys and later put them in an fMRI machine and scanned their brains as they read slides. The statements included clearly inconsistent comments from Democratic and Republican presidential candidates and some politically neutral people.
The partisans had no problem seeing the contradictions in the opposition candidate, giving scores near four on a four-point discrepancy rating. But when their own candidate was inconsistent, the average discrepancy rating was closer to two, suggesting they saw minimal contradiction. Finally, neither Democrats nor Republicans reacted strongly to the neutral contradictions (see figure 2–3).23
FIGURE 2-3
Partisans notice discrepancies in the other party but not in their own
Source: Drew Westen, Pavel S. Blagov, Keith Harenski, Clint Kilts, and Stephan Hamann, “Neural Bases of Motivated Reasoning: An fMRI Study of Emotional Constraints on Partisan Political Judgment in the 2004 U.S. Presidential Election,” Journal of Cognitive Neuroscience 18, no. 11 (2006): 1951.
The brain images were equally revealing and followed a similar pattern. When the partisans saw information they didn’t agree with, none of the circuits involved in conscious reasoning were very active. But when they saw what they liked, their brains eliminated negative emotional states and activated positive ones. The brains of the partisans massively reinforced what they already believed.24
The study of political partisans shows the large role that attention plays in tunnel vision. Paying a lot of attention to one thing means you are not paying a lot of attention to others, often creating a form of blindness. Every year, I show a video to my class that demonstrates this phenomenon. Daniel Simons and Christopher Chabris, psychologists who study perception, created the now famous thirty-second video that shows two teams, one wearing white shirts and the other black, in a nondescript lobby. Each team passes a basketball back and forth. I ask the students to count the number of passes the white team makes, which is somewhat challenging because the players move around. Of course, the students know there’s some trick, so they concentrate their attention on the task.
There is a trick. Roughly halfway through the video, a woman wearing a gorilla suit walks into the middle of the scene, thumps her chest, and walks off. Fewer than 60 percent of students concentrating on the challenging visual task notice the gorilla (see figure 2–4). I then rerun the video and ask the students to watch it unencumbered by the task. There are always nervous chuckles when the gorilla makes her appearance. My results are very consistent with what other experimenters report.
Let’s face it: we all have finite attention bandwidth. If you dedicate all of that bandwidth to one task, none is left over for anything else. So be alert to striking a balance between nitty-gritty problem solving and awareness of the broader context.25
There’s something else that contributes to tunnel vision, and it’s something we can all relate to in varying degrees—stress. Like a lot of things in life, a little bit of stress (or a lot for a very short time) is a good thing. But too much stress can muddle our thinking by clipping our ability to think long term.
Stress is often very helpful. The classic stress response mobilizes energy to your muscles by increasing your heart rate, blood pressure, and breathing. High stress also helps your sensory system. For example, policemen report that during shootouts their visual acuity and focus improve, they sense a slowdown in time, and they fail to hear sounds. For a short burst, the mind can focus intently on the task at hand. This reaction is valuable in extraordinary circumstances.26
Stress is bad, however, if it is constant. Animals have a stress response when they are faced with physical threats—imagine a lion is chasing a zebra—but calm down once the threat passes. While humans are periodically threatened physically, most of our stress comes from the emotional strains of job deadlines, financial worries, and relationship issues. Crucially, the stress response is the same whether it comes from physical or psychological provocation. And, unlike most of the animal kingdom, we can experience chronic psychological stress. Events turn on our stress response system and we can’t turn it off. While mobilizing your body to respond to a short-term threat is an amazing feat, the same response is deeply detrimental to your health if it is always on.
FIGURE 2-4
A majority of viewers fail to see the gorilla
Source: D. J. Simons and C. F. Chabris, “Gorillas in our midst: Sustained inattentional blindness for dynamic events,” Perception 28 (1999): 1059–1074. Figure provided by Daniel Simons. The video depicted in this figure is available as part of a DVD from Viscog Productions (http://www.viscog.com).
Robert Sapolsky, a neurobiologist at Stanford University and an expert on stress, notes that an important feature of the stress response is that it turns off long-term systems. You need not worry about your digestion, growth, disease prevention, or reproduction if you are about to be a lion’s lunch. The stress response is, in Sapolsky’s words, “penny-wise and dollar-foolish.” And this plays into tunnel vision.
Stressed people struggle to think about the long term. The manager about to lose her job tomorrow has little interest in making a decision that will make her better off in three years. Psychological stress creates a sense of immediacy that inhibits consideration of options with distant payoffs. The stress response, so effective for dealing with here-and-now risks, co-opts the decision-making apparatus and compels poor decisions.27
Incentives, or What’s in It for Me?
Incentives matter, as economists have argued quite compellingly. An incentive is any factor, financial or otherwise, that encourages a particular decision or action. In many situations, incentives create a conflict of interest that compromises a person’s ability to properly consider alternatives. So when you evaluate your own decisions or the decisions of others, consider the choices that the incentives encourage.
Dr. Katrina Firlik, a neurosurgeon, shared an example: at a conference dealing with spine surgery, a surgeon presented the case of a female patient with a herniated disc in her neck and pain that was caused by a pinched nerve. She had already failed typical conservative treatments such as physical therapy, medication, and waiting it out.
The surgeon asked the audience to vote on a couple of choices for surgery. The first was the newer anterior approach, where the surgeon removes the entire disc, replaces it with a bone plug, and fuses the discs. The vast majority of the hands shot up. The second choice was the older posterior approach, where the surgeon removes only the portion of the disc that is compressing the nerve. No fusion is required because the procedure leaves most of the disc intact. Only a few audience members raised their hands.
The speaker then asked the audience, which was almost entirely male, “What if this patient is your wife?” The show of hands was reversed for the same two choices. The main reason is that the amount surgeons are paid for the newer and more complicated procedure is typically several times what they’d receive for the older procedure.28
Incentives also played a central role in the financial crisis of 2007–2009. Take the subprime mortgage market, which was “undeniably the original source of crisis,” according to Alan Greenspan, former chairman of the Federal Reserve. People unable to meet prime credit standards because of a poor or limited credit history were able to borrow unprecedented amounts of money, often at low initial interest rates. Subprime mortgages went from about 10 percent of new mortgages in the late 1990s to 20 percent by 2006, and unregulated lenders represented the bulk of that volume. These subprime borrowers were the first to run into trouble when home prices dropped, triggering a cascade of losses throughout the financial system.
While letting the subprime mortgage market grow as it did was clearly bad, incentives for the participants strongly encouraged it. For example:
• People with poor credit could own the nice homes they coveted.
• Lenders earned fees on the loans that they made, encouraging them to relax underwriting standards. They also did not hold on to the mortgages for the most part, so their incentives were primarily to grow rather than to lend prudently.
• Investment banks bought the individual mortgages and bundled them for resale to other investors, earning a fee.
• Rating agencies were paid a fee to rate the mortgage-backed securities. They issued a good number of AAA ratings, suggesting a high level of creditworthiness.
• Investors in AAA-rated mortgage-backed securities earned higher returns than they did on other AAA issues. Since many of those investors were paid based on portfolio performance, the additional yield led to higher fees.29
The subprime mess revealed that what may appear to be optimal for the individual agents in a complex system may be suboptimal for the system as a whole. Even in the wake of the debacle, we can easily see the motivations of each constituent in the chain: more homes, more fees, more yield. On one level, those motivations make sense. But when all the participants pursued their goals without giving thought to the broader impact on the housing markets and the financial system, the system collapsed. For fervent believers in markets, this collective failure was especially stunning. Greenspan wrote, “Those of us who have looked to the self-interest of lending institutions to protect shareholders’ equity (myself included) are in a state of shocked disbelief.”30
Many poor decisions result from inappropriate incentives rather than mistakes. The biases that come with incentives are often subconscious. Max Bazerman, a professor at the Harvard Business School who studies decision making, and some fellow researchers asked over one hundred accountants to review five ambiguous accounting vignettes and to judge the accounting for each. Half the accountants were told that they had been hired by the company, and the rest were told they were hired by a different company. Those who played the role of auditor for the company were 30 percent more likely to find the choices compliant with accounting principles, suggesting that even a hypothetical relationship with the company shaped judgment. The researchers wrote, “Perhaps the most notable feature of the psychological processes at work in conflicts of interest is that they can occur without any conscious intention to indulge in corruption.” Incentives are a strong contributor to tunnel vision.31
How do you avoid the tunnel vision trap? Here’s a five-point checklist:
1. Explicitly consider alternatives. As Johnson-Laird’s model of reasoning suggests, decision makers often fail to consider a sufficient number of alternatives. You should examine a full range of alternatives, using base rates or market-derived guidelines when appropriate to mitigate the influence of the representativeness or availability biases.
To this end, negotiation teachers suggest entering talks knowing your best alternative to a negotiated agreement, your walkaway price, and the same two figures for the party across the table. These figures allow you to improve the odds of an advantageous deal and to avoid being surprised. In other settings, too, enumerating your alternatives clearly and completely is very helpful.32
2. Seek dissent. Much easier said than done, the idea is to prove your views wrong. There are a couple of techniques. The first is to ask questions that could elicit answers that might contradict your own views. Then listen carefully to the answers. Do the same when canvassing data: look for reliable sources that offer conclusions different from yours. This helps avoid a foolish inconsistency.33
When possible, surround yourself with people who have dissenting views. This is emotionally and intellectually very difficult, but it is highly effective in exposing alternatives. It also reduces the risk of groupthink, in which group members try to reach consensus with minimal conflict by avoiding the testing of alternative ideas. Abraham Lincoln embodied this approach. After his unlikely ascent to the White House, Lincoln appointed a number of his eminent foes to cabinet positions. He ended up winning the respect of his former adversaries, as his team of rivals navigated the United States through the Civil War.34