Think Twice: Harnessing the Power of Counterintuition
Copyright
Copyright 2009 Michael J. Mauboussin
All rights reserved
No part of this publication may be reproduced, stored in or introduced into a retrieval system, or transmitted, in any form, or by any means (electronic, mechanical, photocopying, recording, or otherwise), without the prior permission of the publisher. Requests for permission should be directed to [email protected], or mailed to Permissions, Harvard Business School Publishing, 60 Harvard Way, Boston, Massachusetts 02163.
First eBook Edition: November 2009
ISBN: 978-1-4221-7675-7
To Al Rappaport
Mentor, collaborator, friend
CONTENTS
Copyright
Acknowledgments
Introduction
Smart Is as Smart Does
Chapter 1
The Outside View
Why Big Brown Was a Bad Bet
Chapter 2
Open to Options
How Your Telephone Number Can Influence Your Decisions
Chapter 3
The Expert Squeeze
Why Netflix Knows More Than Clerks Do About Your Favorite Films
Chapter 4
Situational Awareness
How Accordion Music Boosts Sales of Burgundy
Chapter 5
More Is Different
How Bees Find the Best Hive Without a Real Estate Agent
Chapter 6
Evidence of Circumstance
How Outsourcing the Dreamliner Became Boeing’s Nightmare
Chapter 7
Grand Ah-Whooms
How Ten Brits Made the Millennium Bridge Wobble
Chapter 8
Sorting Luck from Skill
Why Investors Excel at Buying High and Selling Low
Conclusion
Time to Think Twice
How You Can Change Your Decision Making Immediately
Notes
Bibliography
About the Author
ACKNOWLEDGMENTS
I am deeply grateful to be in the position to learn from thoughtful and wonderful people. The individuals who encouraged, guided, and taught me in the process of writing Think Twice made the long journey rich and fulfilling.
My colleagues at Legg Mason Capital Management have been terrific, providing valuable support and cooperation. That I was able to write this book during a difficult time in the markets is a testament to the organization’s commitment to learning. In particular, Bill Miller and Kyle Legg provided me with the needed flexibility to take on this challenge, and I hope that I can repay their confidence in full.
A number of people graciously shared their time and knowledge with me. They told me stories, clarified points, or aimed me in the right direction when I was off track. These include Orley Ashenfelter, Greg Berns, Angela Freymuth Caveney, Clayton Christensen, Katrina Firlik, Brian Roberson, Phil Rosenzweig, Jeff Severts, Thomas Thurston, and Duncan Watts.
Getting ideas right is not always easy. I was privileged to have a small group, each a leader in his field, read and critique sections of the book. My thanks to Steven Crist, Scott Page, Tom Seeley, Stephen Stigler, Steve Strogatz, and David Weinberger.
The Santa Fe Institute has been a tremendous source of learning and inspiration for me. SFI takes a multidisciplinary approach to understanding the common themes that arise in complex systems. The institute attracts people who are intellectually curious and collaborative, and I am appreciative of the willingness of the scientists, staff, and network members to share so much with me. Particular thanks go to Doug Erwin, Shannon Larsen, John Miller, Scott Page, and Geoffrey West.
Reading a draft of a manuscript and providing feedback to the author is difficult and time-consuming. I was fortunate to have a stellar collection of folks, from many walks of life, help me out. These include Paul DePodesta, Doug Erwin, Dick Foster, Michelle Mauboussin, Bill Miller, Michael Persky, Al Rappaport, David Shaywitz, and three anonymous reviewers. Thanks to all of you for your valuable time and very helpful recommendations.
I have long been an admirer of Daniel Kahneman’s work. But in doing the research for this book, my respect for his contributions to psychology in general and to decision making in particular grew immensely. He is deservedly a towering figure in psychology, and his work touched almost every idea in this book.
I want to offer special recognition to my friend Laurence Gonzales. Over the years, Laurence and I have asked many of the same questions about decision making. But because his background and experiences are so different from mine, he has opened my eyes to many new and useful points of view. Just his willingness to share his thoughts with me has put me in his debt.
But Laurence, a very talented writer, went well beyond swapping ideas with me. Upon receiving a draft of the book, he made the extraordinary offer to edit it in full. Working through his comments was some of the hardest and most rewarding work I have ever done. He taught me about the craft of writing, compelled me to sharpen my thinking, and insisted on clarity. With Laurence, the writing serves the ideas and not the other way around. I cannot thank you enough, Laurence.
Dan Callahan, a colleague at Legg Mason Capital Management, was integral to the project. Dan provided essential research support and coordinated the exhibits. Most importantly, he read numerous drafts of the chapters and provided useful feedback. He also did much of this while juggling other responsibilities, going above and beyond the call of duty. Dan, a big thanks to you.
I would also like to acknowledge A. J. Alper, who thought of the title (including the “thin ice” angle) and graciously allowed me to use it. A. J. is a pleasure to work with and has a great balance between creativity and business sense.
I’d like to thank Kirsten Sandberg, my editor at Harvard Business Press, for shepherding this project from a brainstorming session to a final product. Volleying ideas with Kirsten is always beneficial, and her feedback sharpened the manuscript in important ways. I appreciate Kirsten for never letting me lose sight of our audience, our message, and how to connect the two. Ania Wieckowski was also great throughout the editorial process in matters big and small, and Jen Waring made the production smooth and efficient.
My wife, Michelle, is a constant source of love, support, and counsel. She also encourages me, and allows me, to pursue my passions. Michelle’s comments about the initial manuscript were direct and on point in a way that only a wife could pull off. My mother, Clotilde Mauboussin, has been a steadfast force in my life and provided me with all of the opportunities I could have asked for. My mother-in-law, Andrea Maloney Schara, is part of our family’s day-to-day life and maintains an admirable thirst for learning. Finally, I thank my children, Andrew, Alex, Madeline, Isabelle, and Patrick. Each of them helped me to write this book in some way, and I hope that they will find the ideas useful in their lives someday.
INTRODUCTION
Smart Is as Smart Does
IN DECEMBER 2008, two seemingly unrelated events occurred. The first was the release of Stephen Greenspan’s book, Annals of Gullibility: Why We Get Duped and How to Avoid It. Greenspan, a professor of psychology, explained why we allow other people to take advantage of us and discussed gullibility in fields including finance, academia, and the law. He ended the book with helpful advice on becoming less gullible.
The second was the exposure of the greatest Ponzi scheme in history, run by Bernard Madoff, which cost its unsuspecting investors in excess of $60 billion. A Ponzi scheme is a fraudulent operation in which a manager uses funds from new investors to pay off old investors. Since there is no legitimate investment activity, it collapses when the operator can’t find enough additional investors. Madoff’s scheme unraveled when he couldn’t meet requests for redemptions from the investors stung by the financial meltdown.
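Why such a scheme must eventually fail follows from simple cash-flow arithmetic: investors’ stated balances compound at the promised rate, while the only real money is new deposits. A minimal sketch makes the point (this is an illustration only, not a model of Madoff’s fund; the deposits, promised return, and redemption rates below are invented):

```python
# Minimal Ponzi cash-flow sketch (illustration only; every number here
# is invented). Stated balances grow at a promised rate, but the only
# real cash is new deposits, so the scheme fails once inflows slow
# and redemptions spike.

def collapse_period(deposits, redemption_rates, promised_return=0.10):
    """Return the first period in which redemptions exceed cash on hand."""
    cash = owed = 0.0
    for t, (dep, redeem) in enumerate(zip(deposits, redemption_rates)):
        cash += dep                                  # real money in
        owed = (owed + dep) * (1 + promised_return)  # fictitious growth
        payout = owed * redeem                       # investors cash out
        if payout > cash:
            return t                                 # cannot pay: collapse
        cash -= payout
        owed -= payout
    return None

deposits = [100, 120, 150, 180, 50, 0, 0]   # inflows grow, then dry up
redemptions = [0.05] * 4 + [0.50] * 3       # panic: investors exit en masse
print(collapse_period(deposits, redemptions))  # -> 5, right after inflows stop
```

However the numbers are chosen, the pattern is the same: liabilities outrun cash, so the operator survives only as long as new money keeps arriving.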
The irony is that Greenspan, who is bright and well regarded, lost 30 percent of his retirement savings in Madoff’s Ponzi scheme.1 The guy who wrote the book on gullibility got taken by one of the greatest scammers of all time. In fairness, Greenspan didn’t know Madoff. He invested in a fund that turned the money over to the scheme. And Greenspan has been gracious in sharing his story and explaining why he was drawn to investment returns that looked, in retrospect, too good to be true.
If you ask people to offer adjectives they associate with good decision makers, words like “intelligent” and “smart” are generally at the top of the list. But history contains plenty of examples of intelligent people who made poor decisions, with horrific consequences, as the result of cognitive mistakes. Consider the following:
• In the summer of 1998, Long-Term Capital Management (LTCM), a U.S. hedge fund, lost over $4 billion and had to be bailed out by a consortium of banks. The senior professionals at LTCM, who included two recipients of the Nobel Prize in economics, were highly successful up to that point. As a group, the professionals were among the most intellectually impressive of any organization in the world and were large investors in their own fund. They failed because their financial models didn’t sufficiently consider large asset price fluctuations.2
• On February 1, 2003, the U.S. space shuttle Columbia disintegrated when reentering the Earth’s atmosphere, killing all seven crew members. The engineers at the National Aeronautics and Space Administration (NASA) were considered the best and the brightest in the world. Columbia broke up because a piece of foam insulation broke off during the launch and damaged the shuttle’s ability to protect itself from heat on reentry. The problem of foam debris was not new, but since nothing bad had happened in the past, the engineers overlooked the issue. Rather than considering the risks from the debris, NASA took the lack of problems as evidence that everything was okay.3
• Within a few weeks in the fall of 2008, Iceland’s three largest banks collapsed, its currency dropped by more than 70 percent, and its stock market lost more than 80 percent of its value. After the banking sector was privatized in 2003, the large banks increased their assets from roughly one times Iceland’s GDP to close to ten times GDP, “the most rapid expansion of a banking system in the history of mankind.” Though Iceland was known as a country of well-educated and measured people, its citizens went on a debt-fueled spending spree, driving up asset prices. While each person might be able to rationalize his own decisions, collectively the country ran off an economic cliff.4
No one wakes up thinking, “I am going to make bad decisions today.” Yet we all make them. What is particularly surprising is that some of the biggest mistakes are made by people who are, by objective standards, very intelligent. Smart people make big, dumb, and consequential mistakes.
Keith Stanovich, a psychologist at the University of Toronto, argues that the intelligence quotient (IQ) tests we rely on to judge who is smart do not measure the essential raw materials for making quality decisions. “Although most people would say that the ability to think rationally is a clear sign of superior intellect,” he writes, “standard IQ tests devote no section to rational thinking.”5 Mental flexibility, introspection, and the ability to properly calibrate evidence are at the core of rational thinking and are largely absent from IQ tests.
Smart people make poor decisions because they have the same factory settings on their mental software as the rest of us, and that software isn’t designed to cope with many of today’s problems. So our minds frequently want to see the world one way—the default—while a better way to see the world takes some mental effort. A simple example is an optical illusion: you perceive one image while the reality is something different.
Beyond the problem of mental software, smart people also make bad decisions because they harbor false beliefs. For instance, Sir Arthur Conan Doyle, known best as the creator of the detective Sherlock Holmes, believed in many forms of spiritualism, such as the existence of fairies. These beliefs prevent clear thinking. To make good decisions, you frequently must think twice—and that’s something our minds would rather not do.
This focus on mistakes may sound depressing, but this book is really a story of opportunity. The opportunity comes in two flavors. You can reduce the number of mistakes you make by thinking about problems more clearly. According to research by Stanovich and others, if you explain to intelligent people how they might go wrong with a problem before they decide, they do much better than if they solve the problem with no guidance. “Intelligent people perform better only when you tell them what to do!” exclaims Stanovich. Also, you can see and capitalize on the mistakes that other people make. As shrewd businesspeople know, one man’s mistake is another’s opportunity. Over time, the most rational thinker will win. Think Twice is about identifying these opportunities.
I will take you through three steps:
• Prepare. The first step is mental preparation, which requires you to learn about the mistakes. In each chapter, I discuss a mistake, with examples from a range of professions, and use academic research to explain why we make it. I also examine how these mistakes have big consequences. Even though they have the best of intentions, investors, businesspeople, doctors, lawyers, government officials, and other professionals routinely botch their decisions, often with extremely high costs.
• Recognize. Once you are aware of the categories of mistakes, the second step is to recognize the problems in context, or situational awareness. Here, your goal is to recognize the kind of problem you face, how you risk making a mistake, and which tools you need to choose wisely. Mistakes generally arise from the mismatch between the complex reality you face and the simplifying mental routines you use to cope with that complexity. The challenge is to make intellectual links between realms that may not appear similar on the surface. As you will see, a multidisciplinary approach can yield great insight into decision making.
• Apply. The third and most important step is to mitigate your potential mistakes. The goal is to build or refine a set of mental tools to cope with the realities of life, much as an athlete develops a repertoire of skills to prepare for a game. Many of these tips involve keeping your intuition in check while using an approach that feels counterintuitive.
By the way, I claim no immunity from these cognitive mistakes and still fall for every one that I describe in the book. My personal goal is to recognize when I enter a danger zone while trying to make a decision and to slow down when I do. Finding the proper point of view at the appropriate time is critical.
Prepare, Recognize, Apply—and Win a T-Shirt
Like many instructors of finance, I do experiments with my students to show how bright people fall into traps while making decisions. In one experiment, I present the class with a jar of coins and ask everyone to bid independently on the value of its contents. Most students bid below the actual value, but some bid well above the coins’ worth. The highest bidder wins the auction but overpays for the coins. This is known as “the winner’s curse.” It’s important in corporate mergers and acquisitions, because when companies bid against one another to buy a target corporation, the highest bidder frequently pays too much. The experiment gives the students (doubly so the winner) firsthand experience.6
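The mechanics of the winner’s curse are easy to simulate. In the sketch below (my illustration, not from the book; the jar value, noise level, and bidder count are invented), every bidder’s estimate is unbiased, yet the highest estimate is systematically too high, so the winner overpays:

```python
import random

# Winner's-curse sketch (illustration only; all numbers are invented).
# Each bidder's estimate of the jar is unbiased, but the *maximum* of
# many noisy estimates is biased upward, so the winner overpays.

random.seed(42)
TRUE_VALUE = 20.0    # actual worth of the coins in the jar
NOISE = 5.0          # spread of individual estimation errors
BIDDERS = 30
TRIALS = 10_000

winning_bids = []
for _ in range(TRIALS):
    estimates = [random.gauss(TRUE_VALUE, NOISE) for _ in range(BIDDERS)]
    winning_bids.append(max(estimates))  # assume everyone bids their estimate

avg_win = sum(winning_bids) / TRIALS
print(f"true value: {TRUE_VALUE:.2f}, average winning bid: {avg_win:.2f}")
# The average estimate across all bidders is close to the true value,
# but the average *winning* bid lands well above it.
```

The same logic drives overpayment in corporate takeovers: the acquirer that wins a bidding contest is, almost by construction, the one with the most optimistic estimate.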
To spice up such experiments, instructors often turn them into contests with prizes for the individuals who do best. I attended a two-day conference on “Investment Decisions and Behavioral Finance” at Harvard University that included a couple of those contests. My reading and teaching had familiarized me with these experiments. When I first tried them, I did poorly—below average. But I then studied the principles, practiced identifying the problems, and learned the techniques to approach them properly.
The first was a test of overconfidence. Richard Zeckhauser, a political scientist at Harvard University and a champion bridge player, gave each participant a list of ten unusual questions of fact (e.g., gestation period of an Asian elephant) and asked for both a best guess and a high and low estimate, bounding the correct answer with 90 percent confidence. For example, I might reason that an elephant’s gestation is longer than a human’s and guess fifteen months. I might also feel 90 percent assured that the answer is somewhere between twelve and eighteen months. If my ability matches my confidence, then I would expect the correct answers to fall within that range nine times out of ten. But, in fact, most people are correct only 40 to 60 percent of the time, reflecting their overconfidence.7 Even though I didn’t know the answers to those ten questions, I had a sense of where I might go wrong, so I adjusted my initial estimates. I won that contest and got a book.
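Scoring such a calibration test takes only a few lines of code. The sketch below is my illustration, not Zeckhauser’s actual quiz: the questions and the respondent’s intervals are invented stand-ins, and the “true” answers are approximate. It simply counts how often the truth lands inside each stated 90 percent interval:

```python
# Calibration-scoring sketch (illustration; the questions and intervals
# are invented stand-ins, and the answers are approximate).
# A well-calibrated 90% interval should contain the truth ~9 times in 10.

# (approximate true answer, respondent's low bound, respondent's high bound)
responses = [
    (22,    12,   18),    # Asian elephant gestation, months: too narrow
    (8849,  8000, 9500),  # height of Mount Everest, meters
    (1969,  1950, 1970),  # year of the first moon landing
    (206,   180,  210),   # bones in the adult human body
    (10900, 5000, 9000),  # depth of the Mariana Trench, meters: too narrow
]

hits = sum(low <= truth <= high for truth, low, high in responses)
print(f"hit rate: {hits}/{len(responses)}")  # 3/5 = 60%, not the claimed 90%
```

A 60 percent hit rate on intervals meant to be right 90 percent of the time is exactly the overconfidence pattern the research describes: the cure is to widen your ranges more than feels natural.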
The second experiment showed the failure of pure rationality. Here, Richard Thaler, one of the world’s foremost behavioral economists, asked us to write down a whole number from zero to one hundred, with the prize going to the person whose guess was closest to two-thirds of the group’s average guess. In a purely rational world, all participants would coolly carry out as many levels of deduction as necessary to get to the experiment’s logical solution—zero. But the game’s real challenge involves considering the behavior of the other participants. You may score intellectual points by going with naught, but if anyone selects a number greater than zero, you win no prize. The winning answer, incidentally, is generally between eleven and thirteen.8 I won that contest too, and got a T-shirt.
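The pull toward zero comes from iterating the deduction, and a few lines of code make the levels explicit. This sketch is my illustration under a standard simplifying assumption (so-called level-k reasoning): naive players average fifty, and each deeper level best-responds by taking two-thirds of the level below:

```python
# Two-thirds-of-the-average game: iterated-reasoning sketch (illustration,
# assuming level-k reasoning). Level 0 guesses average 50; each deeper
# level takes two-thirds of the level below it.

guess = 50.0
for level in range(1, 7):
    guess *= 2 / 3
    print(f"level {level}: guess {guess:.1f}")
# level 1: 33.3, level 2: 22.2, level 3: 14.8, level 4: 9.9, ...
# Infinite iteration drives the guess to zero, the purely rational answer.
```

Real groups mix reasoning depths, so the group average usually reflects only two or three levels of iteration; that is why the winning guess tends to fall between levels three and four, near the eleven-to-thirteen range cited above.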