
Think Again: The Power of Knowing What You Don't Know (by Adam Grant, 2021), audiobook in English




Intelligence is usually seen as the ability to think and learn, but in a rapidly changing world there is another set of cognitive skills that may matter more: the ability to rethink and unlearn. In our everyday lives, too many of us favor the comfort of conviction over the discomfort of doubt. We listen to opinions that make us feel good instead of ideas that make us think hard. We treat disagreement as a threat to our egos rather than an opportunity to learn. We surround ourselves with people who agree with our conclusions when we should gravitate toward those who challenge our thought process. As a result, our beliefs get brittle long before our bones. We are too much like preachers defending our sacred beliefs, prosecutors proving the other side wrong, and politicians campaigning for approval, and too little like scientists searching for the truth.

Rating:
Views: 3,125
Title:
Think Again: The Power of Knowing What You Don't Know (by Adam Grant, 2021), audiobook in English
Audiobook release year:
2021
Author:
Adam Grant
Narrator:
Adam Grant
Language:
English
Genre:
Audiobooks in English / Motivation / Psychology / Self-development / Upper-intermediate level
Difficulty level:
upper-intermediate
Audio length:
06:40:56
Audio bitrate:
64 kbps
Format:
mp3, pdf, doc

Listen to the audiobook Think Again: The Power of Knowing What You Don't Know online in English:

Download the book text in .doc (Word) format via direct link: adam_grant_-_think_again.doc [25.85 Mb] (downloads: 85).
Download the book text in .pdf format via direct link: adam_grant_-_think_again.pdf [10.26 Mb] (downloads: 174).
Download the audiobook (MP3) for free from the file host.


Listen to the audiobook on your smartphone via Telegram: Think Again: The Power of Knowing What You Don't Know

Read the book online in English:

(To translate a word into Russian and add it to your study vocabulary, click on the word.)


Prologue

After a bumpy flight, fifteen men dropped from the Montana sky. They weren’t skydivers. They were smokejumpers: elite wildland firefighters parachuting in to extinguish a forest fire started by lightning the day before. In a matter of minutes, they would be racing for their lives. The smokejumpers landed near the top of Mann Gulch late on a scorching August afternoon in 1949. With the fire visible across the gulch, they made their way down the slope toward the Missouri River. Their plan was to dig a line in the soil around the fire to contain it and direct it toward an area where there wasn’t much to burn. After hiking about a quarter mile, the foreman, Wagner Dodge, saw that the fire had leapt across the gulch and was heading straight at them. The flames stretched as high as 30 feet in the air. Soon the fire would be blazing fast enough to cross the length of two football fields in less than a minute. By 5:45 p.m. it was clear that even containing the fire was off the table. Realizing it was time to shift gears from fight to flight, Dodge immediately turned the crew around to run back up the slope. The smokejumpers had to bolt up an extremely steep incline, through knee-high grass on rocky terrain. Over the next eight minutes they traveled nearly 500 yards, leaving the top of the ridge less than 200 yards away. With safety in sight but the fire swiftly advancing, Dodge did something that baffled his crew. Instead of trying to outrun the fire, he stopped and bent over. He took out a matchbook, started lighting matches, and threw them into the grass. “We thought he must have gone nuts,” one later recalled. “With the fire almost on our back, what the hell is the boss doing lighting another fire in front of us?” He thought to himself: That bastard Dodge is trying to burn me to death. It’s no surprise that the crew didn’t follow Dodge when he waved his arms toward his fire and yelled, “Up!
Up this way!” What the smokejumpers didn’t realize was that Dodge had devised a survival strategy: he was building an escape fire. By burning the grass ahead of him, he cleared the area of fuel for the wildfire to feed on. He then poured water from his canteen onto his handkerchief, covered his mouth with it, and lay facedown in the charred area for the next fifteen minutes. As the wildfire raged directly above him, he survived in the oxygen close to the ground. Tragically, twelve of the smokejumpers perished. A pocket watch belonging to one of the victims was later found with the hands melted at 5:56 p.m. Why did only three of the smokejumpers survive? Physical fitness might have been a factor; the other two survivors managed to outrun the fire and reach the crest of the ridge. But Dodge prevailed because of his mental fitness. WHEN PEOPLE REFLECT on what it takes to be mentally fit, the first idea that comes to mind is usually intelligence. The smarter you are, the more complex the problems you can solve—and the faster you can solve them. Intelligence is traditionally viewed as the ability to think and learn. Yet in a turbulent world, there’s another set of cognitive skills that might matter more: the ability to rethink and unlearn. Imagine that you’ve just finished taking a multiple-choice test, and you start to second-guess one of your answers. You have some extra time—should you stick with your first instinct or change it? About three quarters of students are convinced that revising their answer will hurt their score. Kaplan, the big test-prep company, once warned students to “exercise great caution if you decide to change an answer. Experience indicates that many students who change answers change to the wrong answer.” With all due respect to the lessons of experience, I prefer the rigor of evidence. 
When a trio of psychologists conducted a comprehensive review of thirty-three studies, they found that in every one, the majority of answer revisions were from wrong to right. This phenomenon is known as the first-instinct fallacy. In one demonstration, psychologists counted eraser marks on the exams of more than 1,500 students in Illinois. Only a quarter of the changes were from right to wrong, while half were from wrong to right. I’ve seen it in my own classroom year after year: my students’ final exams have surprisingly few eraser marks, but those who do rethink their first answers rather than staying anchored to them end up improving their scores. Of course, it’s possible that second answers aren’t inherently better; they’re only better because students are generally so reluctant to switch that they only make changes when they’re fairly confident. But recent studies point to a different explanation: it’s not so much changing your answer that improves your score as considering whether you should change it. We don’t just hesitate to rethink our answers. We hesitate at the very idea of rethinking. Take an experiment where hundreds of college students were randomly assigned to learn about the first-instinct fallacy. The speaker taught them about the value of changing their minds and gave them advice about when it made sense to do so. On their next two tests, they still weren’t any more likely to revise their answers. Part of the problem is cognitive laziness. Some psychologists point out that we’re mental misers: we often prefer the ease of hanging on to old views over the difficulty of grappling with new ones. Yet there are also deeper forces behind our resistance to rethinking. Questioning ourselves makes the world more unpredictable. It requires us to admit that the facts may have changed, that what was once right may now be wrong. Reconsidering something we believe deeply can threaten our identities, making it feel as if we’re losing a part of ourselves. 
Rethinking isn’t a struggle in every part of our lives. When it comes to our possessions, we update with fervor. We refresh our wardrobes when they go out of style and renovate our kitchens when they’re no longer in vogue. When it comes to our knowledge and opinions, though, we tend to stick to our guns. Psychologists call this seizing and freezing. We favor the comfort of conviction over the discomfort of doubt, and we let our beliefs get brittle long before our bones. We laugh at people who still use Windows 95, yet we still cling to opinions that we formed in 1995. We listen to views that make us feel good, instead of ideas that make us think hard. At some point, you’ve probably heard that if you drop a frog in a pot of scalding hot water, it will immediately leap out. But if you drop the frog in lukewarm water and gradually raise the temperature, the frog will die. It lacks the ability to rethink the situation, and doesn’t realize the threat until it’s too late. I did some research on this popular story recently and discovered a wrinkle: it isn’t true. Tossed into the scalding pot, the frog will get burned badly and may or may not escape. The frog is actually better off in the slow-boiling pot: it will leap out as soon as the water starts to get uncomfortably warm. It’s not the frogs who fail to reevaluate. It’s us. Once we hear the story and accept it as true, we rarely bother to question it. AS THE MANN GULCH WILDFIRE raced toward them, the smokejumpers had a decision to make. In an ideal world, they would have had enough time to pause, analyze the situation, and evaluate their options. With the fire raging less than 100 yards behind, there was no chance to stop and think. “On a big fire there is no time and no tree under whose shade the boss and the crew can sit and have a Platonic dialogue about a blowup,” scholar and former firefighter Norman Maclean wrote in Young Men and Fire, his award-winning chronicle of the disaster. 
“If Socrates had been foreman on the Mann Gulch fire, he and his crew would have been cremated while they were sitting there considering it.” Dodge didn’t survive as a result of thinking slower. He made it out alive thanks to his ability to rethink the situation faster. Twelve smokejumpers paid the ultimate price because Dodge’s behavior didn’t make sense to them. They couldn’t rethink their assumptions in time. Under acute stress, people typically revert to their automatic, well-learned responses. That’s evolutionarily adaptive—as long as you find yourself in the same kind of environment in which those reactions were necessary. If you’re a smokejumper, your well-learned response is to put out a fire, not start another one. If you’re fleeing for your life, your well-learned response is to run away from the fire, not toward it. In normal circumstances, those instincts might save your life. Dodge survived Mann Gulch because he swiftly overrode both of those responses. No one had taught Dodge to build an escape fire. He hadn’t even heard of the concept; it was pure improvisation. Later, the other two survivors testified under oath that nothing resembling an escape fire was covered in their training. Many experts had spent their entire careers studying wildfires without realizing it was possible to stay alive by burning a hole through the blaze. When I tell people about Dodge’s escape, they usually marvel at his resourcefulness under pressure. That was genius! Their astonishment quickly melts into dejection as they conclude that this kind of eureka moment is out of reach for mere mortals. I got stumped by my fourth grader’s math homework. Yet most acts of rethinking don’t require any special skill or ingenuity. Moments earlier at Mann Gulch, the smokejumpers missed another opportunity to think again—and that one was right at their fingertips. Just before Dodge started tossing matches into the grass, he ordered his crew to drop their heavy equipment. 
They had spent the past eight minutes racing uphill while still carrying axes, saws, shovels, and 20-pound packs. If you’re running for your life, it might seem obvious that your first move would be to drop anything that might slow you down. For firefighters, though, tools are essential to doing their jobs. Carrying and taking care of equipment is deeply ingrained in their training and experience. It wasn’t until Dodge gave his order that most of the smokejumpers set down their tools—and even then, one firefighter hung on to his shovel until a colleague took it out of his hands. If the crew had abandoned their tools sooner, would it have been enough to save them? We’ll never know for certain, but Mann Gulch wasn’t an isolated incident. Between 1990 and 1995 alone, a total of twenty-three wildland firefighters perished trying to outrace fires uphill even though dropping their heavy equipment could have made the difference between life and death. In 1994, on Storm King Mountain in Colorado, high winds caused a fire to explode across a gulch. Running uphill on rocky ground with safety in view just 200 feet away, fourteen smokejumpers and wildland firefighters—four women, ten men—lost their lives. Later, investigators calculated that without their tools and backpacks, the crew could have moved 15 to 20 percent faster. “Most would have lived had they simply dropped their gear and run for safety,” one expert wrote. Had they “dropped their packs and tools,” the U.S. Forest Service concurred, “the firefighters would have reached the top of the ridge before the fire.” It’s reasonable to assume that at first the crew might have been running on autopilot, not even aware that they were still carrying their packs and tools. 
“About three hundred yards up the hill,” one of the Colorado survivors testified, “I then realized I still had my saw over my shoulder!” Even after making the wise decision to ditch the 25-pound chainsaw, he wasted valuable time: “I irrationally started looking for a place to put it down where it wouldn’t get burned. . . . I remember thinking, ‘I can’t believe I’m putting down my saw.’” One of the victims was found wearing his backpack, still clutching the handle of his chainsaw. Why would so many firefighters cling to a set of tools even though letting go might save their lives? If you’re a firefighter, dropping your tools doesn’t just require you to unlearn habits and disregard instincts. Discarding your equipment means admitting failure and shedding part of your identity. You have to rethink your goal in your job—and your role in life. “Fires are not fought with bodies and bare hands, they are fought with tools that are often distinctive trademarks of firefighters,” organizational psychologist Karl Weick explains: “They are the firefighter’s reason for being deployed in the first place. . . . Dropping one’s tools creates an existential crisis. Without my tools, who am I?” Wildland fires are relatively rare. Most of our lives don’t depend on split-second decisions that force us to reimagine our tools as a source of danger and a fire as a path to safety. Yet the challenge of rethinking assumptions is surprisingly common—maybe even common to all humans. We all make the same kind of mistakes as smokejumpers and firefighters, but the consequences are less dire and therefore often go unnoticed. Our ways of thinking become habits that can weigh us down, and we don’t bother to question them until it’s too late. Expecting your squeaky brakes to keep working until they finally fail on the freeway. Believing the stock market will keep going up after analysts warn of an impending real estate bubble. 
Assuming your marriage is fine despite your partner’s increasing emotional distance. Feeling secure in your job even though some of your colleagues have been laid off. This book is about the value of rethinking. It’s about adopting the kind of mental flexibility that saved Wagner Dodge’s life. It’s also about succeeding where he failed: encouraging that same agility in others. You may not carry an ax or a shovel, but you do have some cognitive tools that you use regularly. They might be things you know, assumptions you make, or opinions you hold. Some of them aren’t just part of your job—they’re part of your sense of self. Consider a group of students who built what has been called Harvard’s first online social network. Before they arrived at college, they had already connected more than an eighth of the entering freshman class in an “e-group.” But once they got to Cambridge, they abandoned the network and shut it down. Five years later Mark Zuckerberg started Facebook on the same campus. From time to time, the students who created the original e-group have felt some pangs of regret. I know, because I was one of the cofounders of that group. Let’s be clear: I never would have had the vision for what Facebook became. In hindsight, though, my friends and I clearly missed a series of chances for rethinking the potential of our platform. Our first instinct was to use the e-group to make new friends for ourselves; we didn’t consider whether it would be of interest to students at other schools or in life beyond school. Our well-learned habit was to use online tools to connect with people far away; once we lived within walking distance on the same campus, we figured we no longer needed the e-group. Although one of the cofounders was studying computer science and another early member had already founded a successful tech startup, we made the flawed assumption that an online social network was a passing hobby, not a huge part of the future of the internet. 
Since I didn’t know how to code, I didn’t have the tools to build something more sophisticated. Launching a company wasn’t part of my identity anyway: I saw myself as a college freshman, not a budding entrepreneur. Since then, rethinking has become central to my sense of self. I’m a psychologist but I’m not a fan of Freud, I don’t have a couch in my office, and I don’t do therapy. As an organizational psychologist at Wharton, I’ve spent the past fifteen years researching and teaching evidence-based management. As an entrepreneur of data and ideas, I’ve been called by organizations like Google, Pixar, the NBA, and the Gates Foundation to help them reexamine how they design meaningful jobs, build creative teams, and shape collaborative cultures. My job is to think again about how we work, lead, and live—and enable others to do the same. I can’t think of a more vital time for rethinking. As the coronavirus pandemic unfolded, many leaders around the world were slow to rethink their assumptions—first that the virus wouldn’t affect their countries, next that it would be no deadlier than the flu, and then that it could only be transmitted by people with visible symptoms. The cost in human life is still being tallied. In the past year we’ve all had to put our mental pliability to the test. We’ve been forced to question assumptions that we had long taken for granted: That it’s safe to go to the hospital, eat in a restaurant, and hug our parents or grandparents. That live sports will always be on TV and most of us will never have to work remotely or homeschool our kids. That we can get toilet paper and hand sanitizer whenever we need them. In the midst of the pandemic, multiple acts of police brutality led many people to rethink their views on racial injustice and their roles in fighting it. 
The senseless deaths of three Black citizens—George Floyd, Breonna Taylor, and Ahmaud Arbery—left millions of white people realizing that just as sexism is not only a women’s issue, racism is not only an issue for people of color. As waves of protest swept the nation, across the political spectrum, support for the Black Lives Matter movement climbed nearly as much in the span of two weeks as it had in the previous two years. Many of those who had long been unwilling or unable to acknowledge it quickly came to grips with the harsh reality of systemic racism that still pervades America. Many of those who had long been silent came to reckon with their responsibility to become antiracists and act against prejudice. Despite these shared experiences, we live in an increasingly divisive time. For some people a single mention of kneeling during the national anthem is enough to end a friendship. For others a single ballot at a voting booth is enough to end a marriage. Calcified ideologies are tearing American culture apart. Even our great governing document, the U.S. Constitution, allows for amendments. What if we were quicker to make amendments to our own mental constitutions? My aim in this book is to explore how rethinking happens. I sought out the most compelling evidence and some of the world’s most skilled rethinkers. The first section focuses on opening our own minds. You’ll find out why a forward-thinking entrepreneur got trapped in the past, why a long-shot candidate for public office came to see impostor syndrome as an advantage, how a Nobel Prize–winning scientist embraces the joy of being wrong, how the world’s best forecasters update their views, and how an Oscar-winning filmmaker has productive fights. The second section examines how we can encourage other people to think again. You’ll learn how an international debate champion wins arguments and a Black musician persuades white supremacists to abandon hate. 
You’ll discover how a special kind of listening helped a doctor open parents’ minds about vaccines, and helped a legislator convince a Ugandan warlord to join her in peace talks. And if you’re a Yankees fan, I’m going to see if I can convince you to root for the Red Sox. The third section is about how we can create communities of lifelong learners. In social life, a lab that specializes in difficult conversations will shed light on how we can communicate better about polarizing issues like abortion and climate change. In schools, you’ll find out how educators teach kids to think again by treating classrooms like museums, approaching projects like carpenters, and rewriting time-honored textbooks. At work, you’ll explore how to build learning cultures with the first Hispanic woman in space, who took the reins at NASA to prevent accidents after space shuttle Columbia disintegrated. I close by reflecting on the importance of reconsidering our best-laid plans. It’s a lesson that firefighters have learned the hard way. In the heat of the moment, Wagner Dodge’s impulse to drop his heavy tools and take shelter in a fire of his own making made the difference between life and death. But his inventiveness wouldn’t have even been necessary if not for a deeper, more systemic failure to think again. The greatest tragedy of Mann Gulch is that a dozen smokejumpers died fighting a fire that never needed to be fought. As early as the 1880s, scientists had begun highlighting the important role that wildfires play in the life cycles of forests. Fires remove dead matter, send nutrients into the soil, and clear a path for sunlight. When fires are suppressed, forests are left too dense. The accumulation of brush, dry leaves, and twigs becomes fuel for more explosive wildfires. Yet it wasn’t until 1978 that the U.S. Forest Service put an end to its policy that every fire spotted should be extinguished by 10:00 a.m. the following day. 
The Mann Gulch wildfire took place in a remote area where human lives were not at risk. The smokejumpers were called in anyway because no one in their community, their organization, or their profession had done enough to question the assumption that wildfires should not be allowed to run their course. This book is an invitation to let go of knowledge and opinions that are no longer serving you well, and to anchor your sense of self in flexibility rather than consistency. If you can master the art of rethinking, I believe you’ll be better positioned for success at work and happiness in life. Thinking again can help you generate new solutions to old problems and revisit old solutions to new problems. It’s a path to learning more from the people around you and living with fewer regrets. A hallmark of wisdom is knowing when it’s time to abandon some of your most treasured tools—and some of the most cherished parts of your identity.

PART I
Individual Rethinking: Updating Our Own Views

CHAPTER 1
A Preacher, a Prosecutor, a Politician, and a Scientist Walk into Your Mind

Progress is impossible without change; and those who cannot change their minds cannot change anything. —GEORGE BERNARD SHAW

You probably don’t recognize his name, but Mike Lazaridis has had a defining impact on your life. From an early age, it was clear that Mike was something of an electronics wizard. By the time he turned four, he was building his own record player out of Legos and rubber bands. In high school, when his teachers had broken TVs, they called Mike to fix them. In his spare time, he built a computer and designed a better buzzer for high school quiz-bowl teams, which ended up paying for his first year of college. Just months before finishing his electrical engineering degree, Mike did what so many great entrepreneurs of his era would do: he dropped out of college. It was time for this son of immigrants to make his mark on the world.
Mike’s first success came when he patented a device for reading the bar codes on movie film, which was so useful in Hollywood that it won an Emmy and an Oscar for technical achievement. That was small potatoes compared to his next big invention, which made his firm the fastest-growing company on the planet. Mike’s flagship device quickly attracted a cult following, with loyal customers ranging from Bill Gates to Christina Aguilera. “It’s literally changed my life,” Oprah Winfrey gushed. “I cannot live without this.” When he arrived at the White House, President Obama refused to relinquish his to the Secret Service. Mike Lazaridis dreamed up the idea for the BlackBerry as a wireless communication device for sending and receiving emails. As of the summer of 2009, it accounted for nearly half of the U.S. smartphone market. By 2014, its market share had plummeted to less than 1 percent. When a company takes a nosedive like that, we can never pinpoint a single cause of its downfall, so we tend to anthropomorphize it: BlackBerry failed to adapt. Yet adapting to a changing environment isn’t something a company does—it’s something people do in the multitude of decisions they make every day. As the cofounder, president, and co-CEO, Mike was in charge of all the technical and product decisions on the BlackBerry. Although his thinking may have been the spark that ignited the smartphone revolution, his struggles with rethinking ended up sucking the oxygen out of his company and virtually extinguishing his invention. Where did he go wrong? Most of us take pride in our knowledge and expertise, and in staying true to our beliefs and opinions. That makes sense in a stable world, where we get rewarded for having conviction in our ideas. The problem is that we live in a rapidly changing world, where we need to spend as much time rethinking as we do thinking. Rethinking is a skill set, but it’s also a mindset. We already have many of the mental tools we need. 
We just have to remember to get them out of the shed and remove the rust.

SECOND THOUGHTS

With advances in access to information and technology, knowledge isn’t just increasing. It’s increasing at an increasing rate. In 2011, you consumed about five times as much information per day as you would have just a quarter century earlier. As of 1950, it took about fifty years for knowledge in medicine to double. By 1980, medical knowledge was doubling every seven years, and by 2010, it was doubling in half that time. The accelerating pace of change means that we need to question our beliefs more readily than ever before. This is not an easy task. As we sit with our beliefs, they tend to become more extreme and more entrenched. I’m still struggling to accept that Pluto may not be a planet. In education, after revelations in history and revolutions in science, it often takes years for a curriculum to be updated and textbooks to be revised. Researchers have recently discovered that we need to rethink widely accepted assumptions about such subjects as Cleopatra’s roots (her father was Greek, not Egyptian, and her mother’s identity is unknown); the appearance of dinosaurs (paleontologists now think some tyrannosaurs had colorful feathers on their backs); and what’s required for sight (blind people have actually trained themselves to “see”—sound waves can activate the visual cortex and create representations in the mind’s eye, much like how echolocation helps bats navigate in the dark).* Vintage records, classic cars, and antique clocks might be valuable collectibles, but outdated facts are mental fossils that are best abandoned. We’re swift to recognize when other people need to think again. We question the judgment of experts whenever we seek out a second opinion on a medical diagnosis. Unfortunately, when it comes to our own knowledge and opinions, we often favor feeling right over being right.
In everyday life, we make many diagnoses of our own, ranging from whom we hire to whom we marry. We need to develop the habit of forming our own second opinions. Imagine you have a family friend who’s a financial adviser, and he recommends investing in a retirement fund that isn’t in your employer’s plan. You have another friend who’s fairly knowledgeable about investing, and he tells you that this fund is risky. What would you do? When a man named Stephen Greenspan found himself in that situation, he decided to weigh his skeptical friend’s warning against the data available. His sister had been investing in the fund for several years, and she was pleased with the results. A number of her friends had been, too; although the returns weren’t extraordinary, they were consistently in the double digits. The financial adviser was enough of a believer that he had invested his own money in the fund. Armed with that information, Greenspan decided to go forward. He made a bold move, investing nearly a third of his retirement savings in the fund. Before long, he learned that his portfolio had grown by 25 percent. Then he lost it all overnight when the fund collapsed. It was the Ponzi scheme managed by Bernie Madoff. Two decades ago my colleague Phil Tetlock discovered something peculiar. As we think and talk, we often slip into the mindsets of three different professions: preachers, prosecutors, and politicians. In each of these modes, we take on a particular identity and use a distinct set of tools. We go into preacher mode when our sacred beliefs are in jeopardy: we deliver sermons to protect and promote our ideals. We enter prosecutor mode when we recognize flaws in other people’s reasoning: we marshal arguments to prove them wrong and win our case. We shift into politician mode when we’re seeking to win over an audience: we campaign and lobby for the approval of our constituents. 
The risk is that we become so wrapped up in preaching that we’re right, prosecuting others who are wrong, and politicking for support that we don’t bother to rethink our own views. When Stephen Greenspan and his sister made the choice to invest with Bernie Madoff, it wasn’t because they relied on just one of those mental tools. All three modes together contributed to their ill-fated decision. When his sister told him about the money she and her friends had made, she was preaching about the merits of the fund. Her confidence led Greenspan to prosecute the friend who warned him against investing, deeming the friend guilty of “knee-jerk cynicism.” Greenspan was in politician mode when he let his desire for approval sway him toward a yes—the financial adviser was a family friend whom he liked and wanted to please. Any of us could have fallen into those traps. Greenspan says that he should’ve known better, though, because he happens to be an expert on gullibility. When he decided to go ahead with the investment, he had almost finished writing a book on why we get duped. Looking back, he wishes he had approached the decision with a different set of tools. He might have analyzed the fund’s strategy more systematically instead of simply trusting in the results. He could have sought out more perspectives from credible sources. He would have experimented with investing smaller amounts over a longer period of time before gambling so much of his life’s savings. That would have put him in the mode of a scientist.

A DIFFERENT PAIR OF GOGGLES

If you’re a scientist by trade, rethinking is fundamental to your profession. You’re paid to be constantly aware of the limits of your understanding. You’re expected to doubt what you know, be curious about what you don’t know, and update your views based on new data. In the past century alone, the application of scientific principles has led to dramatic progress. Biological scientists discovered penicillin.
Rocket scientists sent us to the moon. Computer scientists built the internet. But being a scientist is not just a profession. It’s a frame of mind—a mode of thinking that differs from preaching, prosecuting, and politicking. We move into scientist mode when we’re searching for the truth: we run experiments to test hypotheses and discover knowledge. Scientific tools aren’t reserved for people with white coats and beakers, and using them doesn’t require toiling away for years with a microscope and a petri dish. Hypotheses have as much of a place in our lives as they do in the lab. Experiments can inform our daily decisions. That makes me wonder: is it possible to train people in other fields to think more like scientists, and if so, do they end up making smarter choices? Recently, a quartet of European researchers decided to find out. They ran a bold experiment with more than a hundred founders of Italian startups in technology, retail, furniture, food, health care, leisure, and machinery. Most of the founders’ businesses had yet to bring in any revenue, making it an ideal setting to investigate how teaching scientific thinking would influence the bottom line. The entrepreneurs arrived in Milan for a training program in entrepreneurship. Over the course of four months, they learned to create a business strategy, interview customers, build a minimum viable product, and then refine a prototype. What they didn’t know was that they’d been randomly assigned to either a “scientific thinking” group or a control group. The training for both groups was identical, except that one was encouraged to view startups through a scientist’s goggles. From that perspective, their strategy is a theory, customer interviews help to develop hypotheses, and their minimum viable product and prototype are experiments to test those hypotheses. Their task is to rigorously measure the results and make decisions based on whether their hypotheses are supported or refuted. 
Over the following year, the startups in the control group averaged under $300 in revenue. The startups in the scientific thinking group averaged over $12,000 in revenue. They brought in revenue more than twice as fast—and attracted customers sooner, too. Why? The entrepreneurs in the control group tended to stay wedded to their original strategies and products. It was too easy to preach the virtues of their past decisions, prosecute the vices of alternative options, and politick by catering to advisers who favored the existing direction. The entrepreneurs who had been taught to think like scientists, in contrast, pivoted more than twice as often. When their hypotheses weren’t supported, they knew it was time to rethink their business models. What’s surprising about these results is that we typically celebrate great entrepreneurs and leaders for being strong-minded and clear-sighted. They’re supposed to be paragons of conviction: decisive and certain. Yet evidence reveals that when business executives compete in tournaments to price products, the best strategists are actually slow and unsure. Like careful scientists, they take their time so they have the flexibility to change their minds. I’m beginning to think decisiveness is overrated . . . but I reserve the right to change my mind. Just as you don’t have to be a professional scientist to reason like one, being a professional scientist doesn’t guarantee that someone will use the tools of their training. Scientists morph into preachers when they present their pet theories as gospel and treat thoughtful critiques as sacrilege. They veer into politician terrain when they allow their views to be swayed by popularity rather than accuracy. They enter prosecutor mode when they’re hell-bent on debunking and discrediting rather than discovering. 
After upending physics with his theories of relativity, Einstein opposed the quantum revolution: “To punish me for my contempt of authority, Fate has made me an authority myself.” Sometimes even great scientists need to think more like scientists. Decades before becoming a smartphone pioneer, Mike Lazaridis was recognized as a science prodigy. In middle school, he made the local news for building a solar panel at the science fair and won an award for reading every science book in the public library. If you open his eighth-grade yearbook, you’ll see a cartoon showing Mike as a mad scientist, with bolts of lightning shooting out of his head. When Mike created the BlackBerry, he was thinking like a scientist. Existing devices for wireless email featured a stylus that was too slow or a keyboard that was too small. People had to clunkily forward their work emails to their mobile device in-boxes, and they took forever to download. He started generating hypotheses and sent his team of engineers off to test them. What if people could hold the device in their hands and type with their thumbs rather than their fingers? What if there was a single mailbox synchronized across devices? What if messages could be relayed through a server and appear on the device only after they were decrypted? As other companies followed BlackBerry’s lead, Mike would take their smartphones apart and study them. Nothing really impressed him until the summer of 2007, when he was stunned by the computing power inside the first iPhone. “They’ve put a Mac in this thing,” he said. What Mike did next might have been the beginning of the end for the BlackBerry. If the BlackBerry’s rise was due in large part to his success in scientific thinking as an engineer, its demise was in many ways the result of his failure in rethinking as a CEO. As the iPhone skyrocketed onto the scene, Mike maintained his belief in the features that had made the BlackBerry a sensation in the past. 
He was confident that people wanted a wireless device for work emails and calls, not an entire computer in their pocket with apps for home entertainment. As early as 1997, one of his top engineers wanted to add an internet browser, but Mike told him to focus only on email. A decade later, Mike was still certain that a powerful internet browser would drain the battery and strain the bandwidth of wireless networks. He didn’t test the alternative hypotheses. By 2008, the company’s valuation exceeded $70 billion, but the BlackBerry remained the company’s sole product, and it still lacked a reliable browser. In 2010, when his colleagues pitched a strategy to feature encrypted text messages, Mike was receptive but expressed concerns that allowing messages to be exchanged on competitors’ devices would render the BlackBerry obsolete. As his reservations gained traction within the firm, the company abandoned instant messaging, missing an opportunity that WhatsApp later seized for upwards of $19 billion. As gifted as Mike was at rethinking the design of electronic devices, he wasn’t willing to rethink the market for his baby. Intelligence was no cure—it might have been more of a curse.

THE SMARTER THEY ARE, THE HARDER THEY FAIL

Mental horsepower doesn’t guarantee mental dexterity. No matter how much brainpower you have, if you lack the motivation to change your mind, you’ll miss many occasions to think again. Research reveals that the higher you score on an IQ test, the more likely you are to fall for stereotypes, because you’re faster at recognizing patterns. And recent experiments suggest that the smarter you are, the more you might struggle to update your beliefs. One study investigated whether being a math whiz makes you better at analyzing data. The answer is yes—if you’re told the data are about something bland, like a treatment for skin rashes.
But what if the exact same data are labeled as focusing on an ideological issue that activates strong emotions—like gun laws in the United States? Being a quant jock makes you more accurate in interpreting the results—as long as they support your beliefs. Yet if the empirical pattern clashes with your ideology, math prowess is no longer an asset; it actually becomes a liability. The better you are at crunching numbers, the more spectacularly you fail at analyzing patterns that contradict your views. If they were liberals, math geniuses did worse than their peers at evaluating evidence that gun bans failed. If they were conservatives, they did worse at assessing evidence that gun bans worked. In psychology there are at least two biases that drive this pattern. One is confirmation bias: seeing what we expect to see. The other is desirability bias: seeing what we want to see. These biases don’t just prevent us from applying our intelligence. They can actually contort our intelligence into a weapon against the truth. We find reasons to preach our faith more deeply, prosecute our case more passionately, and ride the tidal wave of our political party. The tragedy is that we’re usually unaware of the resulting flaws in our thinking. My favorite bias is the “I’m not biased” bias, in which people believe they’re more objective than others. It turns out that smart people are more likely to fall into this trap. The brighter you are, the harder it can be to see your own limitations. Being good at thinking can make you worse at rethinking. When we’re in scientist mode, we refuse to let our ideas become ideologies. We don’t start with answers or solutions; we lead with questions and puzzles. We don’t preach from intuition; we teach from evidence. We don’t just have healthy skepticism about other people’s arguments; we dare to disagree with our own arguments. Thinking like a scientist involves more than just reacting with an open mind. It means being actively open-minded. 
It requires searching for reasons why we might be wrong—not for reasons why we must be right—and revising our views based on what we learn. That rarely happens in the other mental modes. In preacher mode, changing our minds is a mark of moral weakness; in scientist mode, it’s a sign of intellectual integrity. In prosecutor mode, allowing ourselves to be persuaded is admitting defeat; in scientist mode, it’s a step toward the truth. In politician mode, we flip-flop in response to carrots and sticks; in scientist mode, we shift in the face of sharper logic and stronger data. I’ve done my best to write this book in scientist mode.* I’m a teacher, not a preacher. I can’t stand politics, and I hope a decade as a tenured professor has cured me of whatever temptation I once felt to appease my audience. Although I’ve spent more than my share of time in prosecutor mode, I’ve decided that in a courtroom I’d rather be the judge. I don’t expect you to agree with everything I think. My hope is that you’ll be intrigued by how I think—and that the studies, stories, and ideas covered here will lead you to do some rethinking of your own. After all, the purpose of learning isn’t to affirm our beliefs; it’s to evolve our beliefs. One of my beliefs is that we shouldn’t be open-minded in every circumstance. There are situations where it might make sense to preach, prosecute, and politick. That said, I think most of us would benefit from being more open more of the time, because it’s in scientist mode that we gain mental agility. When psychologist Mihaly Csikszentmihalyi studied eminent scientists like Linus Pauling and Jonas Salk, he concluded that what differentiated them from their peers was their cognitive flexibility, their willingness “to move from one extreme to the other as the occasion requires.” The same pattern held for great artists, and in an independent study of highly creative architects. We can even see it in the Oval Office. 
Experts assessed American presidents on a long list of personality traits and compared them to rankings by independent historians and political scientists. Only one trait consistently predicted presidential greatness after controlling for factors like years in office, wars, and scandals. It wasn’t whether presidents were ambitious or forceful, friendly or Machiavellian; it wasn’t whether they were attractive, witty, poised, or polished. What set great presidents apart was their intellectual curiosity and openness. They read widely and were as eager to learn about developments in biology, philosophy, architecture, and music as in domestic and foreign affairs. They were interested in hearing new views and revising their old ones. They saw many of their policies as experiments to run, not points to score. Although they might have been politicians by profession, they often solved problems like scientists.

DON’T STOP UNBELIEVING

As I’ve studied the process of rethinking, I’ve found that it often unfolds in a cycle. It starts with intellectual humility—knowing what we don’t know. We should all be able to make a long list of areas where we’re ignorant. Mine include art, financial markets, fashion, chemistry, food, why British accents turn American in songs, and why it’s impossible to tickle yourself. Recognizing our shortcomings opens the door to doubt. As we question our current understanding, we become curious about what information we’re missing. That search leads us to new discoveries, which in turn maintain our humility by reinforcing how much we still have to learn. If knowledge is power, knowing what we don’t know is wisdom. Scientific thinking favors humility over pride, doubt over certainty, curiosity over closure. When we shift out of scientist mode, the rethinking cycle breaks down, giving way to an overconfidence cycle. If we’re preaching, we can’t see gaps in our knowledge: we believe we’ve already found the truth.
Pride breeds conviction rather than doubt, which makes us prosecutors: we might be laser-focused on changing other people’s minds, but ours is set in stone. That launches us into confirmation bias and desirability bias. We become politicians, ignoring or dismissing whatever doesn’t win the favor of our constituents—our parents, our bosses, or the high school classmates we’re still trying to impress. We become so busy putting on a show that the truth gets relegated to a backstage seat, and the resulting validation can make us arrogant. We fall victim to the fat-cat syndrome, resting on our laurels instead of pressure-testing our beliefs. In the case of the BlackBerry, Mike Lazaridis was trapped in an overconfidence cycle. Taking pride in his successful invention gave him too much conviction. Nowhere was that clearer than in his preference for the keyboard over a touchscreen. It was a BlackBerry virtue he loved to preach—and an Apple vice he was quick to prosecute. As his company’s stock fell, Mike got caught up in confirmation bias and desirability bias, and fell victim to validation from fans. “It’s an iconic product,” he said of the BlackBerry in 2011. “It’s used by business, it’s used by leaders, it’s used by celebrities.” By 2012, the iPhone had captured a quarter of the global smartphone market, but Mike was still resisting the idea of typing on glass. “I don’t get this,” he said at a board meeting, pointing at a phone with a touchscreen. “The keyboard is one of the reasons they buy BlackBerrys.” Like a politician who campaigns only to his base, he focused on the keyboard taste of millions of existing users, neglecting the appeal of a touchscreen to billions of potential users. For the record, I still miss the keyboard, and I’m excited that it’s been licensed for an attempted comeback. When Mike finally started reimagining the screen and software, some of his engineers didn’t want to abandon their past work. The failure to rethink was widespread. 
In 2011, an anonymous high-level employee inside the firm wrote an open letter to Mike and his co-CEO. “We laughed and said they are trying to put a computer on a phone, that it won’t work,” the letter read. “We are now 3–4 years too late.” Our convictions can lock us in prisons of our own making. The solution is not to decelerate our thinking—it’s to accelerate our rethinking. That’s what resurrected Apple from the brink of bankruptcy to become the world’s most valuable company. The legend of Apple’s renaissance revolves around the lone genius of Steve Jobs. It was his conviction and clarity of vision, the story goes, that gave birth to the iPhone. The reality is that he was dead-set against the mobile phone category. His employees had the vision for it, and it was their ability to change his mind that really revived Apple. Although Jobs knew how to “think different,” it was his team that did much of the rethinking. In 2004, a small group of engineers, designers, and marketers pitched Jobs on turning their hit product, the iPod, into a phone. “Why the f@*& would we want to do that?” Jobs snapped. “That is the dumbest idea I’ve ever heard.” The team had recognized that mobile phones were starting to feature the ability to play music, but Jobs was worried about cannibalizing Apple’s thriving iPod business. He hated cell-phone companies and didn’t want to design products within the constraints that carriers imposed. When his calls dropped or the software crashed, he would sometimes smash his phone to pieces in frustration. In private meetings and on public stages, he swore over and over that he would never make a phone. Yet some of Apple’s engineers were already doing research in that area. They worked together to persuade Jobs that he didn’t know what he didn’t know and urged him to doubt his convictions. It might be possible, they argued, to build a smartphone that everyone would love using—and to get the carriers to do it Apple’s way.
Research shows that when people are resistant to change, it helps to reinforce what will stay the same. Visions for change are more compelling when they include visions of continuity. Although our strategy might evolve, our identity will endure. The engineers who worked closely with Jobs understood that this was one of the best ways to convince him. They assured him that they weren’t trying to turn Apple into a phone company. It would remain a computer company—they were just taking their existing products and adding a phone on the side. Apple was already putting twenty thousand songs in your pocket, so why wouldn’t they put everything else in your pocket, too? They needed to rethink their technology, but they would preserve their DNA. After six months of discussion, Jobs finally became curious enough to give the effort his blessing, and two different teams were off to the races in an experiment to test whether they should add calling capabilities to the iPod or turn the Mac into a miniature tablet that doubled as a phone. Just four years after it launched, the iPhone accounted for half of Apple’s revenue. The iPhone represented a dramatic leap in rethinking the smartphone. Since its inception, smartphone innovation has been much more incremental, with different sizes and shapes, better cameras, and longer battery life, but few fundamental changes to the purpose or user experience. Looking back, if Mike Lazaridis had been more open to rethinking his pet product, would BlackBerry and Apple have compelled each other to reimagine the smartphone multiple times by now? The curse of knowledge is that it closes our minds to what we don’t know. Good judgment depends on having the skill—and the will—to open our minds. I’m pretty confident that in life, rethinking is an increasingly important habit. Of course, I might be wrong. If I am, I’ll be quick to think again. 
CHAPTER 2

The Armchair Quarterback and the Impostor

Finding the Sweet Spot of Confidence

Ignorance more frequently begets confidence than does knowledge.
—CHARLES DARWIN

When Ursula Mercz was admitted to the clinic, she complained of headaches, back pain, and dizziness severe enough that she could no longer work. Over the following month her condition deteriorated. She struggled to locate the glass of water she put next to her bed. She couldn’t find the door to her room. She walked directly into her bed frame. Ursula was a seamstress in her midfifties, and she hadn’t lost her dexterity: she was able to cut different shapes out of paper with scissors. She could easily point to her nose, mouth, arms, and legs, and had no difficulty describing her home and her pets. For an Austrian doctor named Gabriel Anton, she presented a curious case. When Anton put a red ribbon and scissors on the table in front of her, she couldn’t name them, even though “she confirmed, calmly and faithfully, that she could see the presented objects.” She was clearly having problems with language production, which she acknowledged, and with spatial orientation. Yet something else was wrong: Ursula could no longer tell the difference between light and dark. When Anton held up an object and asked her to describe it, she didn’t even try to look at it but instead reached out to touch it. Tests showed that her eyesight was severely impaired. Oddly, when Anton asked her about the deficit, she insisted she could see. Eventually, when she lost her vision altogether, she remained completely unaware of it. “It was now extremely astonishing,” Anton wrote, “that the patient did not notice her massive and later complete loss of her ability to see . . . she was mentally blind to her blindness.” It was the late 1800s, and Ursula wasn’t alone.
A decade earlier a neuropathologist in Zurich had reported a case of a man who suffered an accident that left him blind but was unaware of it despite being “intellectually unimpaired.” Although he didn’t blink when a fist was placed in front of his face and couldn’t see the food on his plate, “he thought he was in a dark humid hole or cellar.” Half a century later, a pair of doctors reported six cases of people who had gone blind but claimed otherwise. “One of the most striking features in the behavior of our patients was their inability to learn from their experiences,” the doctors wrote: As they were not aware of their blindness when they walked about, they bumped into the furniture and walls but did not change their behavior. When confronted with their blindness in a rather pointed fashion, they would either deny any visual difficulty or remark: “It is so dark in the room; why don’t they turn the light on?”; “I forgot my glasses,” or “My vision is not too good, but I can see all right.” The patients would not accept any demonstration or assurance which would prove their blindness. This phenomenon was first described by the Roman philosopher Seneca, who wrote of a woman who was blind but complained that she was simply in a dark room. It’s now accepted in the medical literature as Anton’s syndrome—a deficit of self-awareness in which a person is oblivious to a physical disability but otherwise doing fairly well cognitively. It’s known to be caused by damage to the occipital lobe of the brain. Yet I’ve come to believe that even when our brains are functioning normally, we’re all vulnerable to a version of Anton’s syndrome. We all have blind spots in our knowledge and opinions. The bad news is that they can leave us blind to our blindness, which gives us false confidence in our judgment and prevents us from rethinking. The good news is that with the right kind of confidence, we can learn to see ourselves more clearly and update our views. 
In driver’s training we were taught to identify our visual blind spots and eliminate them with the help of mirrors and sensors. In life, since our minds don’t come equipped with those tools, we need to learn to recognize our cognitive blind spots and revise our thinking accordingly.

A TALE OF TWO SYNDROMES

On the first day of December 2015, Halla Tómasdóttir got a call she never expected. The roof of Halla’s house had just given way to a thick layer of snow and ice. As she watched water pouring down one of the walls, the friend on the other end of the line asked if Halla had seen the Facebook posts about her. Someone had started a petition for Halla to run for the presidency of Iceland. Halla’s first thought was, Who am I to be president? She had helped start a university and then cofounded an investment firm in 2007. When the 2008 financial crisis rocked the world, Iceland was hit particularly hard; all three of its major private commercial banks defaulted and its currency collapsed. Relative to the size of its economy, the country faced the worst financial meltdown in human history, but Halla demonstrated her leadership skills by guiding her firm successfully through the crisis. Even with that accomplishment, she didn’t feel prepared for the presidency. She had no political background; she had never served in government or in any kind of public-sector role. It wasn’t the first time Halla had felt like an impostor. At the age of eight, her piano teacher had placed her on a fast track and frequently asked her to play in concerts, but she never felt she was worthy of the honor—and so, before every concert, she felt sick. Although the stakes were much higher now, the self-doubt felt familiar. “I had a massive pit in my stomach, like the piano recital but much bigger,” Halla told me. “It’s the worst case of adult impostor syndrome I’ve ever had.” For months, she struggled with the idea of becoming a candidate.
As her friends and family encouraged her to recognize that she had some relevant skills, Halla was still convinced that she lacked the necessary experience and confidence. She tried to persuade other women to run—one of whom ended up ascending to a different office, as the prime minister of Iceland. Yet the petition didn’t go away, and Halla’s friends, family, and colleagues didn’t stop urging her on. Eventually, she found herself asking, Who am I not to serve? She ultimately decided to go for it, but the odds were heavily stacked against her. She was running as an unknown independent candidate in a field of more than twenty contenders. One of her competitors was particularly powerful—and particularly dangerous. When an economist was asked to name the three people most responsible for Iceland’s bankruptcy, she nominated Davíð Oddsson for all three spots. As Iceland’s prime minister from 1991 to 2004, Oddsson put the country’s banks in jeopardy by privatizing them. Then, as governor of Iceland’s central bank from 2005 to 2009, he allowed the banks’ balance sheets to balloon to more than ten times the national GDP. When the people protested his mismanagement, Oddsson refused to resign and had to be forced out by Parliament. Time magazine later identified him as one of the twenty-five people to blame for the financial crisis worldwide. Nevertheless, in 2016 Oddsson announced his candidacy for the presidency of Iceland: “My experience and knowledge, which is considerable, could go well with this office.” In theory, confidence and competence go hand in hand. In practice, they often diverge. You can see it when people rate their own leadership skills and are also evaluated by their colleagues, supervisors, or subordinates. In a meta-analysis of ninety-five studies involving over a hundred thousand people, women typically underestimated their leadership skills, while men overestimated their skills.
You’ve probably met some football fans who are convinced they know more than the coaches on the sidelines. That’s the armchair quarterback syndrome, where confidence exceeds competence. Even after calling financial plays that destroyed an economy, Davíð Oddsson still refused to acknowledge that he wasn’t qualified to coach—let alone quarterback. He was blind to his weaknesses.

[Cartoon: Jason Adam Katzenstein/The New Yorker Collection/The Cartoon Bank; © Condé Nast]

The opposite of armchair quarterback syndrome is impostor syndrome, where competence exceeds confidence. Think of the people you know who believe that they don’t deserve their success. They’re genuinely unaware of just how intelligent, creative, or charming they are, and no matter how hard you try, you can’t get them to rethink their views. Even after an online petition proved that many others had confidence in her, Halla Tómasdóttir still wasn’t convinced she was qualified to lead her country. She was blind to her strengths. Although they had opposite blind spots, being on the extremes of confidence left both candidates reluctant to rethink their plans. The ideal level of confidence probably lies somewhere between being an armchair quarterback and an impostor. How do we find that sweet spot?

THE IGNORANCE OF ARROGANCE

One of my favorite accolades is a satirical award for research that’s as entertaining as it is enlightening. It’s called the Ig Nobel Prize, and it’s handed out by actual Nobel laureates. One autumn in college, I raced to the campus theater to watch the ceremony along with over a thousand fellow nerds. The winners included a pair of physicists who created a magnetic field to levitate a live frog, a trio of chemists who discovered that the biochemistry of romantic love has something in common with obsessive-compulsive disorder, and a computer scientist who invented PawSense—software that detects cat paws on a keyboard and makes an annoying noise to deter them.
(Unclear whether it also worked with dogs.) Several of the awards made me laugh, but the honorees who made me think the most were two psychologists, David Dunning and Justin Kruger. They had just published a “modest report” on skill and confidence that would soon become famous. They found that in many situations, those who can’t . . . don’t know they can’t. According to what’s now known as the Dunning-Kruger effect, it’s when we lack competence that we’re most likely to be brimming with overconfidence. In the original Dunning-Kruger studies, people who scored the lowest on tests of logical reasoning, grammar, and sense of humor had the most inflated opinions of their skills. On average, they believed they did better than 62 percent of their peers, but in reality outperformed only 12 percent of them. The less intelligent we are in a particular domain, the more we seem to overestimate our actual intelligence in that domain. In a group of football fans, the one who knows the least is the most likely to be the armchair quarterback, prosecuting the coach for calling the wrong play and preaching about a better playbook. This tendency matters because it compromises self-awareness, and it trips us up across all kinds of settings. Look what happened when economists evaluated the operations and management practices of thousands of companies across a wide range of industries and countries, and compared their assessments with managers’ self-ratings:

[Graph. Sources: World Management Survey; Bloom and Van Reenen 2007; Maloney 2017b.]

In this graph, if self-assessments of performance matched actual performance, every country would be on the dotted line. Overconfidence existed in every culture, and it was most rampant where management was the poorest.* Of course, management skills can be hard to judge objectively. Knowledge should be easier—you were tested on yours throughout school.
Compared to most people, how much do you think you know about each of the following topics—more, less, or the same?

• Why English became the official language of the United States
• Why women were burned at the stake in Salem
• What job Walt Disney had before he drew Mickey Mouse
• On which spaceflight humans first laid eyes on the Great Wall of China
• Why eating candy affects how kids behave

One of my biggest pet peeves is feigned knowledge, where people pretend to know things they don’t. It bothers me so much that at this very moment I’m writing an entire book about it. In a series of studies, people rated whether they knew more or less than most people about a range of topics like these, and then took a quiz to test their actual knowledge. The more superior participants thought their knowledge was, the more they overestimated themselves—and the less interested they were in learning and updating. If you think you know more about history or science than most people, chances are you know less than you think. As Dunning quips, “The first rule of the Dunning-Kruger club is you don’t know you’re a member of the Dunning-Kruger club.”* On the questions above, if you felt you knew anything at all, think again. America has no official language, suspected witches were hanged in Salem but not burned, Walt Disney didn’t draw Mickey Mouse (it was the work of an animator named Ub Iwerks), you can’t actually see the Great Wall of China from space, and the average effect of sugar on children’s behavior is zero. Although the Dunning-Kruger effect is often amusing in everyday life, it was no laughing matter in Iceland. Despite serving as governor of the central bank, Davíð Oddsson had no training in finance or economics. Before entering politics, he had created a radio comedy show, written plays and short stories, gone to law school, and worked as a journalist.
During his reign as Iceland’s prime minister, Oddsson was so dismissive of experts that he disbanded the National Economic Institute. To force him out of his post at the central bank, Parliament had to pass an unconventional law: any governor would have to have at least a master’s degree in economics. That didn’t stop Oddsson from running for president a few years later. He seemed utterly blind to his blindness: he didn’t know what he didn’t know.

STRANDED AT THE SUMMIT OF MOUNT STUPID

The problem with armchair quarterback syndrome is that it stands in the way of rethinking. If we’re certain that we know something, we have no reason to look for gaps and flaws in our knowledge—let alone fill or correct them. In one study, the people who scored the lowest on an emotional intelligence test weren’t just the most likely to overestimate their skills. They were also the most likely to dismiss their scores as inaccurate or irrelevant—and the least likely to invest in coaching or self-improvement. Yes, some of this comes down to our fragile egos. We’re driven to deny our weaknesses when we want to see ourselves in a positive light or paint a glowing picture of ourselves to others. A classic case is the crooked politician who claims to crusade against corruption, but is actually motivated by willful blindness or social deception. Yet motivation is only part of the story.* There’s a less obvious force that clouds our vision of our abilities: a deficit in metacognitive skill, the ability to think about our thinking. Lacking competence can leave us blind to our own incompetence. If you’re a tech entrepreneur and you’re uninformed about education systems, you can feel certain that your master plan will fix them. If you’re socially awkward and you’re missing some insight on social graces, you can strut around believing you’re James Bond. In high school, a friend told me I didn’t have a sense of humor. What made her think that? “You don’t laugh at all my jokes.” I’m hilarious . .
. said no funny person ever. I’ll leave it to you to decide who lacked the sense of humor. When we lack the knowledge and skills to achieve excellence, we sometimes lack the knowledge and skills to judge excellence. This insight should immediately put your favorite confident ignoramuses in their place. Before we poke fun at them, though, it’s worth remembering that we all have moments when we are them. We’re all novices at many things, but we’re not always blind to that fact. We tend to overestimate ourselves on desirable skills, like the ability to carry on a riveting conversation. We’re also prone to overconfidence in situations where it’s easy to confuse experience for expertise, like driving, typing, trivia, and managing emotions. Yet we underestimate ourselves when we can easily recognize that we lack experience—like painting, driving a race car, and rapidly reciting the alphabet backward. Absolute beginners rarely fall into the Dunning-Kruger trap. If you don’t know a thing about football, you probably don’t walk around believing you know more than the coach. It’s when we progress from novice to amateur that we become overconfident. A bit of knowledge can be a dangerous thing. In too many domains of our lives, we never gain enough expertise to question our opinions or discover what we don’t know. We have just enough information to feel self-assured about making pronouncements and passing judgment, failing to realize that we’ve climbed to the top of Mount Stupid without making it over to the other side. You can see this phenomenon in one of Dunning’s experiments that involved people playing the role of doctors in a simulated zombie apocalypse. When they’ve seen only a handful of injured victims, their perceived and actual skills match. Unfortunately, as they gain experience, their confidence climbs faster than their competence, and confidence remains higher than competence from that point on. 
This might be one of the reasons that patient mortality rates in hospitals seem to spike in July, when new residents take over. It’s not their lack of skill alone that proves hazardous; it’s their overestimation of that skill. Advancing from novice to amateur can break the rethinking cycle. As we gain experience, we lose some of our humility. We take pride in making rapid progress, which promotes a false sense of mastery. That jump-starts an overconfidence cycle, preventing us from doubting what we know and being curious about what we don’t. We get trapped in a beginner’s bubble of flawed assumptions, where we’re ignorant of our own ignorance. That’s what happened in Iceland to Davíð Oddsson, whose arrogance was reinforced by cronies and unchecked by critics. He was known to surround himself with “fiercely loyal henchmen” from school and bridge matches, and to keep a checklist of friends and enemies. Months before the meltdown, Oddsson refused help from England’s central bank. Then, at the height of the crisis, he brashly declared in public that he had no intention of covering the debts of Iceland’s banks. Two years later an independent truth commission appointed by Parliament charged him with gross negligence. Oddsson’s downfall, according to one journalist who chronicled Iceland’s financial collapse, was “arrogance, his absolute conviction that he knew what was best for the island.” What he lacked is a crucial nutrient for the mind: humility. The antidote to getting stuck on Mount Stupid is taking a regular dose of it. “Arrogance is ignorance plus conviction,” blogger Tim Urban explains. “While humility is a permeable filter that absorbs life experience and converts it into knowledge and wisdom, arrogance is a rubber shield that life experience simply bounces off of.”

WHAT GOLDILOCKS GOT WRONG

Many people picture confidence as a seesaw. Gain too much confidence, and we tip toward arrogance. Lose too much confidence, and we become meek.
This is our fear with humility: that we’ll end up having a low opinion of ourselves. We want to keep the seesaw balanced, so we go into Goldilocks mode and look for the amount of confidence that’s just right. Recently, though, I learned that this is the wrong approach. Humility is often misunderstood. It’s not a matter of having low self-confidence. One of the Latin roots of humility means “from the earth.” It’s about being grounded—recognizing that we’re flawed and fallible. Confidence is a measure of how much you believe in yourself. Evidence shows that’s distinct from how much you believe in your methods. You can be confident in your ability to achieve a goal in the future while maintaining the humility to question whether you have the right tools in the present. That’s the sweet spot of confidence. We become blinded by arrogance when we’re utterly convinced of our strengths and our strategies. We get paralyzed by doubt when we lack conviction in both. We can be consumed by an inferiority complex when we know the right method but feel uncertain about our ability to execute it. What we want to attain is confident humility: having faith in our capability while appreciating that we may not have the right solution or even be addressing the right problem. That gives us enough doubt to reexamine our old knowledge and enough confidence to pursue new insights. When Spanx founder Sara Blakely had the idea for footless pantyhose, she believed in her ability to make the idea a reality, but she was full of doubt about her current tools. Her day job was selling fax machines door-to-door, and she was aware that she didn’t know anything about fashion, retail, or manufacturing. When she was designing the prototype, she spent a week driving around to hosiery mills to ask them for help. When she couldn’t afford a law firm to apply for a patent, she read a book on the topic and filled out the application herself. 
Her doubt wasn’t debilitating—she was confident she could overcome the challenges in front of her. Her confidence wasn’t in her existing knowledge—it was in her capacity to learn. Confident humility can be taught. In one experiment, when students read a short article about the benefits of admitting what we don’t know rather than being certain about it, their odds of seeking extra help in an area of weakness spiked from 65 to 85 percent. They were also more likely to explore opposing political views to try to learn from the other side. Confident humility doesn’t just open our minds to rethinking—it improves the quality of our rethinking. In college and graduate school, students who are willing to revise their beliefs get higher grades than their peers. In high school, students who admit when they don’t know something are rated by teachers as learning more effectively and by peers as contributing more to their teams. At the end of the academic year, they have significantly higher math grades than their more self-assured peers. Instead of just assuming they’ve mastered the material, they quiz themselves to test their understanding. When adults have the confidence to acknowledge what they don’t know, they pay more attention to how strong evidence is and spend more time reading material that contradicts their opinions. In rigorous studies of leadership effectiveness across the United States and China, the most productive and innovative teams aren’t run by leaders who are confident or humble. The most effective leaders score high in both confidence and humility. Although they have faith in their strengths, they’re also keenly aware of their weaknesses. They know they need to recognize and transcend their limits if they want to push the limits of greatness. If we care about accuracy, we can’t afford to have blind spots. To get an accurate picture of our knowledge and skills, it can help to assess ourselves like scientists looking through a microscope. 
But one of my newly formed beliefs is that we’re sometimes better off underestimating ourselves.

THE BENEFITS OF DOUBT

Just a month and a half before Iceland’s presidential election, Halla Tómasdóttir was polling at only 1 percent support. To focus on the most promising candidates, the network airing the first televised debate announced that they wouldn’t feature anyone with less than 2.5 percent of the vote. On the day of the debate, Halla ended up barely squeaking through. Over the following month her popularity skyrocketed. She wasn’t just a viable candidate; she was in the final four. A few years later, when I invited her to speak to my class, Halla mentioned that the psychological fuel that propelled her meteoric rise was none other than impostor syndrome. Feeling like an impostor is typically viewed as a bad thing, and for good reason—a chronic sense of being unworthy can breed misery, crush motivation, and hold us back from pursuing our ambitions. From time to time, though, a less crippling sense of doubt waltzes into many of our minds. Some surveys suggest that more than half the people you know have felt like impostors at some point in their careers. It’s thought to be especially common among women and marginalized groups. Strangely, it also seems to be particularly pronounced among high achievers. I’ve taught students who earned patents before they could drink and became chess masters before they could drive, but these same individuals still wrestle with insecurity and constantly question their abilities. The standard explanation for their accomplishments is that they succeed in spite of their doubts, but what if their success is actually driven in part by those doubts? To find out, Basima Tewfik—then a doctoral student at Wharton, now an MIT professor—recruited a group of medical students who were preparing to begin their clinical rotations.
She had them interact for more than half an hour with actors who had been trained to play the role of patients presenting symptoms of various diseases. Basima observed how the medical students treated the patients—and also tracked whether they made the right diagnoses. A week earlier the students had answered a survey about how often they entertained impostor thoughts like I am not as qualified as others think I am and People important to me think I am more capable than I think I am. Those who self-identified as impostors didn’t do any worse in their diagnoses, and they did significantly better when it came to bedside manner—they were rated as more empathetic, respectful, and professional, as well as more effective in asking questions and sharing information. In another study, Basima found a similar pattern with investment professionals: the more often they felt like impostors, the higher their performance reviews from their supervisors four months later. This evidence is new, and we still have a lot to learn about when impostor syndrome is beneficial versus when it’s detrimental. Still, it leaves me wondering if we’ve been misjudging impostor syndrome by seeing it solely as a disorder. When our impostor fears crop up, the usual advice is to ignore them—give ourselves the benefit of the doubt. Instead, we might be better off embracing those fears, because they can give us three benefits of doubt. The first upside of feeling like an impostor is that it can motivate us to work harder. It’s probably not helpful when we’re deciding whether to start a race, but once we’ve stepped up to the starting line, it gives us the drive to keep running to the end so that we can earn our place among the finalists.* In some of my own research across call centers, military and government teams, and nonprofits, I’ve found that confidence can make us complacent. If we never worry about letting other people down, we’re more likely to actually do so. 
When we feel like impostors, we think we have something to prove. Impostors may be the last to jump in, but they may also be the last to bail out. Second, impostor thoughts can motivate us to work smarter. When we don’t believe we’re going to win, we have nothing to lose by rethinking our strategy. Remember that total beginners don’t fall victim to the Dunning-Kruger effect. Feeling like an impostor puts us in a beginner’s mindset, leading us to question assumptions that others have taken for granted. Third, feeling like an impostor can make us better learners. Having some doubts about our knowledge and skills takes us off a pedestal, encouraging us to seek out insights from others. As psychologist Elizabeth Krumrei Mancuso and her colleagues write, “Learning requires the humility to realize one has something to learn.” Some evidence on this dynamic comes from a study by another of our former doctoral students at Wharton, Danielle Tussing—now a professor at SUNY Buffalo. Danielle gathered her data in a hospital where the leadership role of charge nurse is rotated between shifts, which means that nurses end up at the helm even if they have doubts about their capabilities. Nurses who felt some hesitations about assuming the mantle were actually more effective leaders, in part because they were more likely to seek out second opinions from colleagues. They saw themselves on a level playing field, and they knew that much of what they lacked in experience and expertise they could make up by listening. There’s no clearer case of that than Halla Tómasdóttir.

THE LEAGUE OF EXTRAORDINARY HUMILITY

When I sat down with Halla, she told me that in the past her doubts had been debilitating. She took them as a sign that she lacked the ability to succeed. Now she had reached a point of confident humility, and she interpreted doubts differently: they were a cue that she needed to improve her tools.
Plenty of evidence suggests that confidence is just as often the result of progress as the cause of it. We don’t have to wait for our confidence to rise to achieve challenging goals. We can build it through achieving challenging goals. “I have come to welcome impostor syndrome as a good thing: it’s fuel to do more, try more,” Halla says. “I’ve learned to use it to my advantage. I actually thrive on the growth that comes from the self-doubt.” While other candidates were content to rely on the usual media coverage, Halla’s uncertainty about her tools made her eager to rethink the way campaigns were run. She worked harder and smarter, staying up late to personally answer social media messages. She held Facebook Live sessions where voters could ask her anything, and learned to use Snapchat to reach young people. Deciding she had nothing to lose, she went where few presidential candidates had gone before: instead of prosecuting her opponents, she ran a positive campaign. How much worse can it get? she thought. It was part of why she resonated so strongly with voters: they were tired of watching candidates smear one another and delighted to see a candidate treat her competitors with respect. Uncertainty primes us to ask questions and absorb new ideas. It protects us against the Dunning-Kruger effect. “Impostor syndrome always keeps me on my toes and growing because I never think I know it all,” Halla reflects, sounding more like a scientist than a politician. “Maybe impostor syndrome is needed for change. Impostors rarely say, ‘This is how we do things around here.’ They don’t say, ‘This is the right way.’ I was so eager to learn and grow that I asked everyone for advice on how I could do things differently.” Although she doubted her tools, she had confidence in herself as a learner. She understood that knowledge is best sought from experts, but creativity and wisdom can come from anywhere. Iceland’s presidential election came down to Halla, Davíð
Oddsson, and two other men. The three men all enjoyed more media coverage than Halla throughout the campaign, including front-page interviews, which she never received. They also had bigger campaign budgets. Yet on election day, Halla stunned her country—and herself—by winning more than a quarter of the vote. She didn’t land the presidency; she came in second. Her 28 percent fell shy of the victor’s 39 percent. But Halla trounced Davíð Oddsson, who finished fourth, with less than 14 percent. Based on her trajectory and momentum, it’s not crazy to imagine that with a few more weeks, she could have won. Great thinkers don’t harbor doubts because they’re impostors. They maintain doubts because they know we’re all partially blind and they’re committed to improving their sight. They don’t boast about how much they know; they marvel at how little they understand. They’re aware that each answer raises new questions, and the quest for knowledge is never finished. A mark of lifelong learners is recognizing that they can learn something from everyone they meet. Arrogance leaves us blind to our weaknesses. Humility is a reflective lens: it helps us see them clearly. Confident humility is a corrective lens: it enables us to overcome those weaknesses.

CHAPTER 3

The Joy of Being Wrong

The Thrill of Not Believing Everything You Think

I have a degree from Harvard. Whenever I’m wrong, the world makes a little less sense. —DR. FRASIER CRANE, PLAYED BY KELSEY GRAMMER

In the fall of 1959, a prominent psychologist welcomed new participants into a wildly unethical study. He had handpicked a group of Harvard sophomores to join a series of experiments that would run through the rest of their time in college. The students volunteered to spend a couple of hours a week contributing to knowledge about how personality develops and how psychological problems can be solved. They had no idea that they were actually signing up to have their beliefs attacked.
The researcher, Henry Murray, had originally trained as a physician and biochemist. After becoming a distinguished psychologist, he was disillusioned that his field paid little attention to how people navigate difficult interactions, so he decided to create them in his own lab. He gave students a month to write out their personal philosophy of life, including their core values and guiding principles. When they showed up to submit their work, they were paired with another student who had done the same exercise. They would have a day or two to read each other’s philosophies, and then they would be filmed debating them. The experience would be much more intense than they anticipated. Murray modeled the study on psychological assessments he had developed for spies in World War II. As a lieutenant colonel, Murray had been recruited to vet potential agents for the Office of Strategic Services, the precursor to the CIA. To gauge how candidates would handle pressure, he sent them down to a basement to be interrogated with a bright light shining in their faces. The examiner would wait for an inconsistency in their accounts to pop up and then scream, “You’re a liar!” Some candidates quit on the spot; others were reduced to tears. Those who withstood the onslaught got the gig. Now Murray was ready for a more systematic study of reactions to stress. He had carefully screened students to create a sample that included a wide range of personalities and mental health profiles. He gave them code names based on their character traits, including Drill, Quartz, Locust, Hinge, and Lawful—more on him later. When students arrived for the debate, they discovered that their sparring partner was not a peer but a law student. What they didn’t know was that the law student was in cahoots with the research team: his task was to spend eighteen minutes launching an aggressive assault on their worldviews. 
Murray called it a “stressful interpersonal disputation,” having directed the law student to make the participants angry and anxious with a “mode of attack” that was “vehement, sweeping, and personally abusive.” The poor students sweated and shouted as they struggled to defend their ideals. The pain didn’t stop there. In the weeks that followed, the students were invited back to the lab to discuss the films of their own interactions. They watched themselves grimacing and stringing together incoherent sentences. All in all, they spent about eight hours reliving those humiliating eighteen minutes. A quarter century later, when the participants reflected on the experience, it was clear that many had found it agonizing. Drill described feeling “unabating rage.” Locust recalled his bewilderment, anger, chagrin, and discomfort. “They have deceived me, telling me there was going to be a discussion, when in fact there was an attack,” he wrote. “How could they have done this to me; what is the point of this?” Other participants had a strikingly different response: they actually seemed to get a kick out of being forced to rethink their beliefs. “Some may have found the experience mildly discomforting, in that their cherished (and in my case, at least, sophomoric) philosophies were challenged in an aggressive manner,” one participant remembers. “But it was hardly an experience that would blight one for a week, let alone a life.” Another described the whole series of events as “highly agreeable.” A third went so far as to call it “fun.” Ever since I first read about the participants who reacted enthusiastically, I’ve been fascinated by what made them tick. How did they manage to enjoy the experience of having their beliefs eviscerated—and how can the rest of us learn to do the same? Since the records of the study are still sealed and the vast majority of the participants haven’t revealed their identities, I did the next best thing: I went searching for people like them. 
I found a Nobel Prize–winning scientist and two of the world’s top election forecasters. They aren’t just comfortable being wrong; they actually seem to be thrilled by it. I think they can teach us something about how to be more graceful and accepting in moments when we discover that our beliefs might not be true. The goal is not to be wrong more often. It’s to recognize that we’re all wrong more often than we’d like to admit, and the more we deny it, the deeper the hole we dig for ourselves.

THE DICTATOR POLICING YOUR THOUGHTS

When our son was five, he was excited to learn that his uncle was expecting a child. My wife and I both predicted a boy, and so did our son. A few weeks later, we found out the baby would be a girl. When we broke the news to our son, he burst into tears. “Why are you crying?” I asked. “Is it because you were hoping your new cousin would be a boy?” “No!” he shouted, pounding his fists on the floor. “Because we were wrong!” I explained that being wrong isn’t always a bad thing. It can be a sign that we’ve learned something new—and that discovery itself can be a delight. This realization didn’t come naturally to me. Growing up, I was determined to be right. In second grade I corrected my teacher for misspelling the word lightning as lightening. When trading baseball cards I would rattle off statistics from recent games as proof that the price guide was valuing players inaccurately. My friends found this annoying and started calling me Mr. Facts. It got so bad that one day my best friend announced that he wouldn’t talk to me until I admitted I was wrong. It was the beginning of my journey to become more accepting of my own fallibility. In a classic paper, sociologist Murray Davis argued that when ideas survive, it’s not because they’re true—it’s because they’re interesting. What makes an idea interesting is that it challenges our weakly held opinions.
Did you know that the moon might originally have formed inside a vaporous Earth out of magma rain? That a narwhal’s tusk is actually a tooth? When an idea or assumption doesn’t matter deeply to us, we’re often excited to question it. The natural sequence of emotions is surprise (“Really?”) followed by curiosity (“Tell me more!”) and thrill (“Whoa!”). To paraphrase a line attributed to Isaac Asimov, great discoveries often begin not with “Eureka!” but with “That’s funny . . .” When a core belief is questioned, though, we tend to shut down rather than open up. It’s as if there’s a miniature dictator living inside our heads, controlling the flow of facts to our minds, much like Kim Jong-un controls the press in North Korea. The technical term for this in psychology is the totalitarian ego, and its job is to keep out threatening information. It’s easy to see how an inner dictator comes in handy when someone attacks our character or intelligence. Those kinds of personal affronts threaten to shatter aspects of our identities that are important to us and might be difficult to change. The totalitarian ego steps in like a bodyguard for our minds, protecting our self-image by feeding us comforting lies. They’re all just jealous. You’re really, really, ridiculously good-looking. You’re on the verge of inventing the next Pet Rock. As physicist Richard Feynman quipped, “You must not fool yourself—and you are the easiest person to fool.” Our inner dictator also likes to take charge when our deeply held opinions are threatened. In the Harvard study of attacking students’ worldviews, the participant who had the strongest negative reaction was code-named Lawful. He came from a blue-collar background and was unusually precocious, having started college at sixteen and joined the study at seventeen. One of his beliefs was that technology was harming civilization, and he became hostile when his views were questioned. 
Lawful went on to become an academic, and when he penned his magnum opus, it was clear that he hadn’t changed his mind. His concerns about technology had only intensified: The Industrial Revolution and its consequences have been a disaster for the human race. They have greatly increased the life-expectancy of those of us who live in “advanced” countries, but they have destabilized society, have made life unfulfilling, have subjected human beings to indignities . . . to physical suffering as well . . . and have inflicted severe damage on the natural world. That kind of conviction is a common response to threats. Neuroscientists find that when our core beliefs are challenged, it can trigger the amygdala, the primitive “lizard brain” that breezes right past cool rationality and activates a hot fight-or-flight response. The anger and fear are visceral: it feels as if we’ve been punched in the mind. The totalitarian ego comes to the rescue with mental armor. We become preachers or prosecutors striving to convert or condemn the unenlightened. “Presented with someone else’s argument, we’re quite adept at spotting the weaknesses,” journalist Elizabeth Kolbert writes, but “the positions we’re blind about are our own.” I find this odd, because we weren’t born with our opinions. Unlike our height or raw intelligence, we have full control over what we believe is true. We choose our views, and we can choose to rethink them any time we want. This should be a familiar task, because we have a lifetime of evidence that we’re wrong on a regular basis. I was sure I’d finish a draft of this chapter by Friday. I was certain the cereal with the toucan on the box was Fruit Loops, but I just noticed the box says Froot Loops. I was sure I put the milk back in the fridge last night, but strangely it’s sitting on the counter this morning. The inner dictator manages to prevail by activating an overconfidence cycle. 
First, our wrong opinions are shielded in filter bubbles, where we feel pride when we see only information that supports our convictions. Then our beliefs are sealed in echo chambers, where we hear only from people who intensify and validate them. Although the resulting fortress can appear impenetrable, there’s a growing community of experts who are determined to break through.

ATTACHMENT ISSUES

Not long ago I gave a speech at a conference about my research on givers, takers, and matchers. I was studying whether generous, selfish, or fair people were more productive in jobs like sales and engineering. One of the attendees was Daniel Kahneman, the Nobel Prize–winning psychologist who has spent much of his career demonstrating how flawed our intuitions are. He told me afterward that he was surprised by my finding that givers had higher rates of failure than takers and matchers—but higher rates of success, too. When you read a study that surprises you, how do you react? Many people would get defensive, searching for flaws in the study’s design or the statistical analysis. Danny did the opposite. His eyes lit up, and a huge grin appeared on his face. “That was wonderful,” he said. “I was wrong.” Later, I sat down with Danny for lunch and asked him about his reaction. It looked a lot to me like the joy of being wrong—his eyes twinkled as if he was having fun. He said that in his eighty-five years, no one had pointed that out before, but yes, he genuinely enjoys discovering that he was wrong, because it means he is now less wrong than before. I knew the feeling. In college, what first attracted me to social science was reading studies that clashed with my expectations; I couldn’t wait to tell my roommates about all the assumptions I’d been rethinking. In my first independent research project, I tested some predictions of my own, and more than a dozen of my hypotheses turned out to be false.* It was a major lesson in intellectual humility, but I wasn’t devastated.
I felt an immediate rush of excitement. Discovering I was wrong felt joyful because it meant I’d learned something. As Danny told me, “Being wrong is the only way I feel sure I’ve learned anything.” Danny isn’t interested in preaching, prosecuting, or politicking. He’s a scientist devoted to the truth. When I asked him how he stays in that mode, he said he refuses to let his beliefs become part of his identity. “I change my mind at a speed that drives my collaborators crazy,” he explained. “My attachment to my ideas is provisional. There’s no unconditional love for them.” Attachment. That’s what keeps us from recognizing when our opinions are off the mark and rethinking them. To unlock the joy of being wrong, we need to detach. I’ve learned that two kinds of detachment are especially useful: detaching your present from your past and detaching your opinions from your identity. Let’s start with detaching your present from your past. In psychology, one way of measuring the similarity between the person you are right now and your former self is to ask: which pair of circles best describes how you see yourself? In the moment, separating your past self from your current self can be unsettling. Even positive changes can lead to negative emotions; evolving your identity can leave you feeling derailed and disconnected. Over time, though, rethinking who you are appears to become mentally healthy—as long as you can tell a coherent story about how you got from past to present you. In one study, when people felt detached from their past selves, they became less depressed over the course of the year. When you feel as if your life is changing direction, and you’re in the process of shifting who you are, it’s easier to walk away from foolish beliefs you once held. My past self was Mr. Facts—I was too fixated on knowing. Now I’m more interested in finding out what I don’t know. 
As Bridgewater founder Ray Dalio told me, “If you don’t look back at yourself and think, ‘Wow, how stupid I was a year ago,’ then you must not have learned much in the last year.”

The second kind of detachment is separating your opinions from your identity. I’m guessing you wouldn’t want to see a doctor whose identity is Professional Lobotomist, send your kids to a teacher whose identity is Corporal Punisher, or live in a town where the police chief’s identity is Stop-and-Frisker. Once upon a time, all of these practices were seen as reasonable and effective.

Most of us are accustomed to defining ourselves in terms of our beliefs, ideas, and ideologies. This can become a problem when it prevents us from changing our minds as the world changes and knowledge evolves. Our opinions can become so sacred that we grow hostile to the mere thought of being wrong, and the totalitarian ego leaps in to silence counterarguments, squash contrary evidence, and close the door on learning.

Who you are should be a question of what you value, not what you believe. Values are your core principles in life—they might be excellence and generosity, freedom and fairness, or security and integrity. Basing your identity on these kinds of principles enables you to remain open-minded about the best ways to advance them. You want the doctor whose identity is protecting health, the teacher whose identity is helping students learn, and the police chief whose identity is promoting safety and justice. When they define themselves by values rather than opinions, they buy themselves the flexibility to update their practices in light of new evidence.

THE YODA EFFECT: “YOU MUST UNLEARN WHAT YOU HAVE LEARNED”

On my quest to find people who enjoy discovering they were wrong, a trusted colleague told me I had to meet Jean-Pierre Beugoms. He’s in his late forties, and he’s the sort of person who’s honest to a fault; he tells the truth even if it hurts.
When his son was a toddler, they were watching a space documentary together, and Jean-Pierre casually mentioned that the sun would one day turn into a red giant and engulf the Earth. His son was not amused. Between tears, he cried, “But I love this planet!” Jean-Pierre felt so terrible that he decided to bite his tongue instead of mentioning threats that could prevent the Earth from even lasting that long. Back in the 1990s, Jean-Pierre had a hobby of collecting the predictions that pundits made on the news and scoring his own forecasts against them. Eventually he started competing in forecasting tournaments—international contests hosted by Good Judgment, where people try to predict the future. It’s a daunting task; there’s an old saying that historians can’t even predict the past. A typical tournament draws thousands of entrants from around the world to anticipate big political, economic, and technological events. The questions are time-bound, with measurable, specific results. Will the current president of Iran still be in office in six months? Which soccer team will win the next World Cup? In the following year, will an individual or a company face criminal charges for an accident involving a self-driving vehicle? Participants don’t just answer yes or no; they have to give their odds. It’s a systematic way of testing whether they know what they don’t know. They get scored months later on accuracy and calibration—earning points not just for giving the right answer, but also for having the right level of conviction. The best forecasters have confidence in their predictions that come true and doubt in their predictions that prove false. On November 18, 2015, Jean-Pierre registered a prediction that stunned his opponents. A day earlier, a new question had popped up in an open forecasting tournament: in July 2016, who would win the U.S. Republican presidential primary? 
The options were Jeb Bush, Ben Carson, Ted Cruz, Carly Fiorina, Marco Rubio, Donald Trump, and none of the above. With eight months to go before the Republican National Convention, Trump was largely seen as a joke. His odds of becoming the Republican nominee were only 6 percent according to Nate Silver, the celebrated statistician behind the website FiveThirtyEight. When Jean-Pierre peered into his crystal ball, though, he decided Trump had a 68 percent chance of winning. Jean-Pierre didn’t just excel in predicting the results of American events. His Brexit forecasts hovered in the 50 percent range when most of his competitors thought the referendum had little chance of passing. He successfully predicted that the incumbent would lose a presidential election in Senegal, even though the base rates of reelection were extremely high and other forecasters were expecting a decisive win. And he had, in fact, pegged Trump as the favorite long before pundits and pollsters even considered him a viable contender. “It’s striking,” Jean-Pierre wrote early on, back in 2015, that so many forecasters are “still in denial about his chances.” Based on his performance, Jean-Pierre might be the world’s best election forecaster. His advantage: he thinks like a scientist. He’s passionately dispassionate. At various points in his life, Jean-Pierre has changed his political ideologies and religious beliefs.* He doesn’t come from a polling or statistics background; he’s a military historian, which means he has no stake in the way things have always been done in forecasting. The statisticians were attached to their views about how to aggregate polls. Jean-Pierre paid more attention to factors that were hard to measure and overlooked. 
For Trump, those included:

- “Mastery at manipulating the media”
- “Name recognition”
- “A winning issue (i.e., immigration and ‘the wall’)”

Even if forecasting isn’t your hobby, there’s a lot to be learned from studying how forecasters like Jean-Pierre form their opinions. My colleague Phil Tetlock finds that forecasting skill is less a matter of what we know than of how we think. When he and his collaborators studied a host of factors that predict excellence in forecasting, grit and ambition didn’t rise to the top. Neither did intelligence, which came in second. There was another factor that had roughly triple the predictive power of brainpower. The single most important driver of forecasters’ success was how often they updated their beliefs. The best forecasters went through more rethinking cycles. They had the confident humility to doubt their judgments and the curiosity to discover new information that led them to revise their predictions.

A key question here is how much rethinking is necessary. Although the sweet spot will always vary from one person and situation to the next, the averages can give us a clue. A few years into their tournaments, typical competitors updated their predictions about twice per question. The superforecasters updated their predictions more than four times per question.

Think about how manageable that is. Better judgment doesn’t necessarily require hundreds or even dozens of updates. Just a few more efforts at rethinking can move the needle. It’s also worth noting, though, how unusual that level of rethinking is. How many of us can even remember the last time we admitted being wrong and revised our opinions accordingly? As journalist Kathryn Schulz observes, “Although small amounts of evidence are sufficient to make us draw conclusions, they are seldom sufficient to make us revise them.” That’s where the best forecasters excelled: they were eager to think again.
They saw their opinions more as hunches than as truths—as possibilities to entertain rather than facts to embrace. They questioned ideas before accepting them, and they were willing to keep questioning them even after accepting them. They were constantly seeking new information and better evidence—especially disconfirming evidence.

On Seinfeld, George Costanza famously said, “It’s not a lie if you believe it.” I might add that it doesn’t become the truth just because you believe it. It’s a sign of wisdom to avoid believing every thought that enters your mind. It’s a mark of emotional intelligence to avoid internalizing every feeling that enters your heart.

[Cartoon: Ellis Rosen/The New Yorker Collection/The Cartoon Bank]

Another of the world’s top forecasters is Kjirste Morrell. She’s obviously bright—she has a doctorate from MIT in mechanical engineering—but her academic and professional experience wasn’t exactly relevant to predicting world events. Her background was in human hip joint mechanics, designing better shoes, and building robotic wheelchairs. When I asked Kjirste what made her so good at forecasting, she replied, “There’s no benefit to me for being wrong for longer. It’s much better if I change my beliefs sooner, and it’s a good feeling to have that sense of a discovery, that surprise—I would think people would enjoy that.”

Kjirste hasn’t just figured out how to erase the pain of being wrong. She’s transformed it into a source of pleasure. She landed there through a form of classical conditioning, like when Pavlov’s dog learned to salivate at the sound of a bell. If being wrong repeatedly leads us to the right answer, the experience of being wrong itself can become joyful.

That doesn’t mean we’ll enjoy it every step of the way. One of Kjirste’s biggest misses was her forecast for the 2016 U.S. presidential election, where she bet on Hillary Clinton to beat Donald Trump.
Since she wasn’t a Trump supporter, the prospect of being wrong was painful—it was too central to her identity. She knew a Trump presidency was possible, but she didn’t want to think it was probable, so she couldn’t bring herself to forecast it. That was a common mistake in 2016. Countless experts, pollsters, and pundits underestimated Trump—and Brexit—because they were too emotionally invested in their past predictions and identities. If you want to be a better forecaster today, it helps to let go of your commitment to the opinions you held yesterday. Just wake up in the morning, snap your fingers, and decide you don’t care. It doesn’t matter who’s president or what happens to your country. The world is unjust and the expertise you spent decades developing is obsolete! It’s a piece of cake, right? About as easy as willing yourself to fall out of love. Somehow, Jean-Pierre Beugoms managed to pull it off. When Donald Trump first declared his candidacy in the spring of 2015, Jean-Pierre gave him only a 2 percent chance of becoming the nominee. As Trump began rising in the August polls, Jean-Pierre was motivated to question himself. He detached his present from his past, acknowledging that his original prediction was understandable, given the information he had at the time. Detaching his opinions from his identity was harder. Jean-Pierre didn’t want Trump to win, so it would’ve been easy to fall into the trap of desirability bias. He overcame it by focusing on a different goal. “I wasn’t so attached to my original forecast,” he explained, because of “the desire to win, the desire to be the best forecaster.” He still had a stake in the outcome he actually preferred, but he had an even bigger stake in not making a mistake. His values put truth above tribe: “If the evidence strongly suggests that my tribe is wrong on a particular issue, then so be it. I consider all of my opinions tentative. 
When the facts change, I change my opinions.”

Research suggests that identifying even a single reason why we might be wrong can be enough to curb overconfidence. Jean-Pierre went further; he made a list of all the arguments that pundits were making about why Trump couldn’t win and went looking for evidence that they (and he) were wrong. He found that evidence within the polls: in contrast with widespread claims that Trump was a factional candidate with narrow appeal, Jean-Pierre saw that Trump was popular across key Republican demographic groups. By mid-September, Jean-Pierre was an outlier, putting Trump’s odds of becoming the nominee over 50 percent.

“Accept the fact that you’re going to be wrong,” Jean-Pierre advises. “Try to disprove yourself. When you’re wrong, it’s not something to be depressed about. Say, ‘Hey, I discovered something!’”

MISTAKES WERE MADE . . . MOST LIKELY BY ME

As prescient as Jean-Pierre’s bet on Trump was, he still had trouble sticking to it in the face of his feelings. In the spring of 2016, he identified the media coverage of Hillary Clinton’s emails as a red flag, and kept predicting a Trump victory for two months more. By the summer, though, as he contemplated the impending possibility of a Trump presidency, he found himself struggling to sleep at night. He changed his forecast to Clinton.

Looking back, Jean-Pierre isn’t defensive about his decision. He freely admits that despite being an experienced forecaster, he made the rookie mistake of falling victim to desirability bias, allowing his preference to cloud his judgment. He focused on the forces that would enable him to predict a Clinton win because he desperately wanted a Trump loss. “That was just a way of me trying to deal with this unpleasant forecast I had issued,” he says.

Then he does something unexpected: he laughs at himself. If we’re insecure, we make fun of others. If we’re comfortable being wrong, we’re not afraid to poke fun at ourselves.
Laughing at ourselves reminds us that although we might take our decisions seriously, we don’t have to take ourselves too seriously. Research suggests that the more frequently we make fun of ourselves, the happier we tend to be.* Instead of beating ourselves up about our mistakes, we can turn some of our past misconceptions into sources of present amusement. Being wrong won’t always be joyful. The path to embracing mistakes is full of painful moments, and we handle those moments better when we remember they’re essential for progress. But if we can’t learn to find occasional glee in discovering we were wrong, it will be awfully hard to get anything right. I’ve noticed a paradox in great scientists and superforecasters: the reason they’re so comfortable being wrong is that they’re terrified of being wrong. What sets them apart is the time horizon. They’re determined to reach the correct answer in the long run, and they know that means they have to be open to stumbling, backtracking, and rerouting in the short run. They shun rose-colored glasses in favor of a sturdy mirror. The fear of missing the mark next year is a powerful motivator to get a crystal-clear view of last year’s mistakes. “People who are right a lot listen a lot, and they change their mind a lot,” Jeff Bezos says. “If you don’t change your mind frequently, you’re going to be wrong a lot.” Jean-Pierre Beugoms has a favorite trick for catching himself when he’s wrong. When he makes a forecast, he also makes a list of the conditions in which it should hold true—as well as the conditions under which he would change his mind. He explains that this keeps him honest, preventing him from getting attached to a bad prediction. What forecasters do in tournaments is good practice in life. When you form an opinion, ask yourself what would have to happen to prove it false. Then keep track of your views so you can see when you were right, when you were wrong, and how your thinking has evolved. 
“I started out just wanting to prove myself,” Jean-Pierre says. “Now I want to improve myself—to see how good I can get.”

It’s one thing to admit to ourselves that we’ve been wrong. It’s another thing to confess that to other people. Even if we manage to overthrow our inner dictator, we run the risk of facing outer ridicule. In some cases we fear that if others find out we were wrong, it could destroy our reputations. How do people who accept being wrong cope with that?

In the early 1990s, the British astrophysicist Andrew Lyne published a major discovery in the world’s most prestigious science journal. He presented the first evidence that a planet could orbit a neutron star—a star that had exploded into a supernova. Several months later, while preparing to give a presentation at an astronomy conference, he noticed that he hadn’t adjusted for the fact that the Earth moves in an elliptical orbit, not a circular one. He was embarrassingly, horribly wrong. The planet he had discovered didn’t exist.

In front of hundreds of colleagues, Andrew walked onto the ballroom stage and admitted his mistake. When he finished his confession, the room exploded in a standing ovation. One astrophysicist called it “the most honorable thing I’ve ever seen.”

Andrew Lyne is not alone. Psychologists find that admitting we were wrong doesn’t make us look less competent. It’s a display of honesty and a willingness to learn. Although scientists believe it will damage their reputation to admit that their studies failed to replicate, the reverse is true: they’re judged more favorably if they acknowledge the new data rather than deny them. After all, it doesn’t matter “whose fault it is that something is broken if it’s your responsibility to fix it,” actor Will Smith has said. “Taking responsibility is taking your power back.”

When we find out we might be wrong, a standard defense is “I’m entitled to my opinion.” I’d like to modify that: yes, we’re entitled to hold opinions inside our own heads.
If we choose to express them out loud, though, I think it’s our responsibility to ground them in logic and facts, share our reasoning with others, and change our minds when better evidence emerges. This philosophy takes us back to the Harvard students who had their worldviews attacked in that unethical study by Henry Murray. If I had to guess, I’d say the students who enjoyed the experience had a mindset similar to that of great scientists and superforecasters. They saw challenges to their opinions as an exciting opportunity to develop and evolve their thinking. The students who found it stressful didn’t know how to detach. Their opinions were their identities. An assault on their worldviews was a threat to their very sense of self. Their inner dictator rushed in to protect them. Take it from the student with the code name Lawful. He felt he had been damaged emotionally by the study. “Our adversary in the debate subjected us to various insults,” Lawful reflected four decades later. “It was a highly unpleasant experience.” Today, Lawful has a different code name, one that’s familiar to most Americans. He’s known as the Unabomber. Ted Kaczynski became a math professor turned anarchist and domestic terrorist. He mailed bombs that killed three people and injured twenty-three more. An eighteen-year-long FBI investigation culminated in his arrest after The New York Times and The Washington Post published his manifesto and his brother recognized his writing. He is now serving life in prison without parole. The excerpt I quoted earlier was from Kaczynski’s manifesto. If you read the entire document, you’re unlikely to be unsettled by the content or the structure. What’s disturbing is the level of conviction. Kaczynski displays little consideration of alternative views, barely a hint that he might be wrong. Consider just the opening: The Industrial Revolution and its consequences have been a disaster for the human race. . . . 
They have destabilized society, have made life unfulfilling. . . . The continued development of technology will worsen the situation. It will certainly subject human beings to greater indignities and inflict greater damage on the natural world. . . . If the system survives, the consequences will be inevitable: There is no way of reforming or modifying the system. . . .

Kaczynski’s case leaves many questions about his mental health unanswered. Still, I can’t help but wonder: If he had learned to question his opinions, would he still have been able to justify resorting to violence? If he had developed the capacity to discover that he was wrong, would he still have ended up doing something so wrong?

Every time we encounter new information, we have a choice. We can attach our opinions to our identities and stand our ground in the stubbornness of preaching and prosecuting. Or we can operate more like scientists, defining ourselves as people committed to the pursuit of truth—even if it means proving our own views wrong.

CHAPTER 4

The Good Fight Club

The Psychology of Constructive Conflict

Arguments are extremely vulgar, for everybody in good society holds exactly the same opinions.

—OSCAR WILDE

As the two youngest boys in a big family, the bishop’s sons did everything together. They launched a newspaper and built their own printing press together. They opened a bicycle shop and then started manufacturing their own bikes together. And after years of toiling away at a seemingly impossible problem, they invented the first successful airplane together.

Wilbur and Orville Wright first caught the flying bug when their father brought home a toy helicopter. After it broke, they built one of their own. As they advanced from playing together to working together to rethinking human flight together, there was no trace of sibling rivalry between them.
Wilbur even said they “thought together.” Even though it was Wilbur who launched the project, the brothers shared equal credit for their achievement. When it came time to decide who would pilot their historic flight at Kitty Hawk, they just flipped a coin. New ways of thinking often spring from old bonds. The comedic chemistry of Tina Fey and Amy Poehler can be traced back to their early twenties, when they immediately hit it off in an improv class. The musical harmony of the Beatles started even earlier, when they were in high school. Just minutes after a mutual friend introduced them, Paul McCartney was teaching John Lennon how to tune a guitar. Ben and Jerry’s Ice Cream grew out of a friendship between the two founders that began in seventh-grade gym class. It seems that to make progress together, we need to be in sync. But the truth, like all truths, is more complicated. One of the world’s leading experts on conflict is an organizational psychologist in Australia named Karen “Etty” Jehn. When you think about conflict, you’re probably picturing what Etty calls relationship conflict—personal, emotional clashes that are filled not just with friction but also with animosity. I hate your stinking guts. I’ll use small words so that you’ll be sure to understand, you warthog-faced buffoon. You bob for apples in the toilet . . . and you like it. But Etty has identified another flavor called task conflict—clashes about ideas and opinions. We have task conflict when we’re debating whom to hire, which restaurant to pick for dinner, or whether to name our child Gertrude or Quasar. The question is whether the two types of conflict have different consequences. A few years ago I surveyed hundreds of new teams in Silicon Valley on conflict several times during their first six months working together. Even if they argued constantly and agreed on nothing else, they agreed on what kind of conflict they were having. 
When their projects were finished, I asked their managers to evaluate each team’s effectiveness. The teams that performed poorly started with more relationship conflict than task conflict. They entered into personal feuds early on and were so busy disliking one another that they didn’t feel comfortable challenging one another. It took months for many of the teams to make real headway on their relationship issues, and by the time they did manage to debate key decisions, it was often too late to rethink their directions. What happened in the high-performing groups? As you might expect, they started with low relationship conflict and kept it low throughout their work together. That didn’t stop them from having task conflict at the outset: they didn’t hesitate to surface competing perspectives. As they resolved some of their differences of opinion, they were able to align on a direction and carry out their work until they ran into new issues to debate. All in all, more than a hundred studies have examined conflict types in over eight thousand teams. A meta-analysis of those studies showed that relationship conflict is generally bad for performance, but some task conflict can be beneficial: it’s been linked to higher creativity and smarter choices. For example, there’s evidence that when teams experience moderate task conflict early on, they generate more original ideas in Chinese technology companies, innovate more in Dutch delivery services, and make better decisions in American hospitals. As one research team concluded, “The absence of conflict is not harmony, it’s apathy.” Relationship conflict is destructive in part because it stands in the way of rethinking. When a clash gets personal and emotional, we become self-righteous preachers of our own views, spiteful prosecutors of the other side, or single-minded politicians who dismiss opinions that don’t come from our side. 
Task conflict can be constructive when it brings diversity of thought, preventing us from getting trapped in overconfidence cycles. It can help us stay humble, surface doubts, and make us curious about what we might be missing. That can lead us to think again, moving us closer to the truth without damaging our relationships. Although productive disagreement is a critical life skill, it’s one that many of us never fully develop. The problem starts early: parents disagree behind closed doors, fearing that conflict will make children anxious or somehow damage their character. Yet research shows that how often parents argue has no bearing on their children’s academic, social, or emotional development. What matters is how respectfully parents argue, not how frequently. Kids whose parents clash constructively feel more emotionally safe in elementary school, and over the next few years they actually demonstrate more helpfulness and compassion toward their classmates. Being able to have a good fight doesn’t just make us more civil; it also develops our creative muscles. In a classic study, highly creative architects were more likely than their technically competent but less original peers to come from homes with plenty of friction. They often grew up in households that were “tense but secure,” as psychologist Robert Albert notes: “The creative person-to-be comes from a family that is anything but harmonious, one with a ‘wobble.’” The parents weren’t physically or verbally abusive, but they didn’t shy away from conflict, either. Instead of telling their children to be seen but not heard, they encouraged them to stand up for themselves. The kids learned to dish it out—and take it. That’s exactly what happened to Wilbur and Orville Wright. When the Wright brothers said they thought together, what they really meant is that they fought together. Arguing was the family business. 
Although their father was a bishop in the local church, he included books by atheists in his library—and encouraged the children to read and debate them. They developed the courage to fight for their ideas and the resilience to lose a disagreement without losing their resolve. When they were solving problems, they had arguments that lasted not just for hours but for weeks and months at a time. They didn’t have such incessant spats because they were angry. They kept quarreling because they enjoyed it and learned from the experience. “I like scrapping with Orv,” Wilbur reflected. As you’ll see, it was one of their most passionate and prolonged arguments that led them to rethink a critical assumption that had prevented humans from soaring through the skies.

THE PLIGHT OF THE PEOPLE PLEASER

As long as I can remember, I’ve been determined to keep the peace. Maybe it’s because my group of friends dropped me in middle school. Maybe it’s genetic. Maybe it’s because my parents got divorced. Whatever the cause, in psychology there’s a name for my affliction. It’s called agreeableness, and it’s one of the major personality traits around the world. Agreeable people tend to be nice. Friendly. Polite. Canadian.*

My first impulse is to avoid even the most trivial of conflicts. When I’m riding in an Uber and the air-conditioning is blasting, I struggle to bring myself to ask the driver to turn it down—I just sit there shivering in silence until my teeth start to chatter. When someone steps on my shoe, I’ve actually apologized for inconveniently leaving my foot in his path. When students fill out course evaluations, one of their most common complaints is that I’m “too supportive of stupid comments.”

Disagreeable people tend to be more critical, skeptical, and challenging—and they’re more likely than their peers to become engineers and lawyers. They’re not just comfortable with conflict; it energizes them.
If you’re highly disagreeable, you might be happier in an argument than in a friendly conversation. That quality often comes with a bad rap: disagreeable people get stereotyped as curmudgeons who complain about every idea, or Dementors who suck the joy out of every meeting. When I studied Pixar, though, I came away with a dramatically different view. In 2000, Pixar was on fire. Their teams had used computers to rethink animation in their first blockbuster, Toy Story, and they were fresh off two more smash hits. Yet the company’s founders weren’t content to rest on their laurels. They recruited an outside director named Brad Bird to shake things up. Brad had just released his debut film, which was well reviewed but flopped at the box office, so he was itching to do something big and bold. When he pitched his vision, the technical leadership at Pixar said it was impossible: they would need a decade and $500 million to make it. Brad wasn’t ready to give up. He sought out the biggest misfits at Pixar for his project—people who were disagreeable, disgruntled, and dissatisfied. Some called them black sheep. Others called them pirates. When Brad rounded them up, he warned them that no one believed they could pull off the project. Just four years later, his team didn’t only succeed in releasing Pixar’s most complex film ever; they actually managed to lower the cost of production per minute. The Incredibles went on to gross upwards of $631 million worldwide and won the Oscar for Best Animated Feature. Notice what Brad didn’t do. He didn’t stock his team with agreeable people. Agreeable people make for a great support network: they’re excited to encourage us and cheerlead for us. Rethinking depends on a different kind of network: a challenge network, a group of people we trust to point out our blind spots and help us overcome our weaknesses. 
Their role is to activate rethinking cycles by pushing us to be humble about our expertise, doubt our knowledge, and be curious about new perspectives. The ideal members of a challenge network are disagreeable, because they’re fearless about questioning the way things have always been done and holding us accountable for thinking again. There’s evidence that disagreeable people speak up more frequently—especially when leaders aren’t receptive—and foster more task conflict. They’re like the doctor in the show House or the boss in the film The Devil Wears Prada. They give the critical feedback we might not want to hear, but need to hear. Harnessing disagreeable people isn’t always easy. It helps if certain conditions are in place. Studies in oil drilling and tech companies suggest that dissatisfaction promotes creativity only when people feel committed and supported—and that cultural misfits are most likely to add value when they have strong bonds with their colleagues.* Before Brad Bird arrived, Pixar already had a track record of encouraging talented people to push boundaries. But the studio’s previous films had starred toys, bugs, and monsters, which were relatively simple to animate. Since making a whole film with lifelike human superheroes was beyond the capabilities of computer animation at the time, the technical teams balked at Brad’s vision for The Incredibles. That’s when he created his challenge network. He enlisted his band of pirates to foster task conflict and rethink the process. Brad gathered the pirates in Pixar’s theater and told them that although a bunch of bean counters and corporate suits might not believe in them, he did. After rallying them he went out of his way to seek out their ideas. “I want people who are disgruntled because they have a better way of doing things and they are having trouble finding an avenue,” Brad told me. “Racing cars that are just spinning their wheels in a garage rather than racing. 
You open that garage door, and man, those people will take you somewhere.” The pirates rose to the occasion, finding economical alternatives to expensive techniques and easy workarounds for hard problems. When it came time to animate the superhero family, they didn’t toil over the intricate contours of interlocking muscles. Instead they figured out that sliding simple oval shapes against one another could become the building blocks of complex muscles. When I asked Brad how he recognized the value of pirates, he told me it was because he is one. Growing up, when he went to dinner at friends’ houses, he was taken aback by the polite questions their parents asked about their day at school. Bird family dinners were more like a food fight, where they all vented, debated, and spoke their minds. Brad found the exchanges contentious but fun, and he brought that mentality into his first dream job at Disney. From an early age, he had been mentored and trained by a group of old Disney masters to put quality first, and he was frustrated that their replacements—who now supervised the new generation at the studio—weren’t upholding the same standards. Within a few months of launching his animation career at Disney, Brad was criticizing senior leaders for taking on conventional projects and producing substandard work. They told him to be quiet and do his job. When he refused, they fired him. I’ve watched too many leaders shield themselves from task conflict. As they gain power, they tune out boat-rockers and listen to bootlickers. They become politicians, surrounding themselves with agreeable yes-men and becoming more susceptible to seduction by sycophants. Research reveals that when their firms perform poorly, CEOs who indulge flattery and conformity become overconfident. They stick to their existing strategic plans instead of changing course—which sets them on a collision course with failure. 
We learn more from people who challenge our thought process than those who affirm our conclusions. Strong leaders engage their critics and make themselves stronger. Weak leaders silence their critics and make themselves weaker. This reaction isn’t limited to people in power. Although we might be on board with the principle, in practice we often miss out on the value of a challenge network. In one experiment, when people were criticized rather than praised by a partner, they were over four times more likely to request a new partner. Across a range of workplaces, when employees received tough feedback from colleagues, their default response was to avoid those coworkers or drop them from their networks altogether—and their performance suffered over the following year. Some organizations and occupations counter those tendencies by building challenge networks into their cultures. From time to time the Pentagon and the White House have used aptly named “murder boards” to stir up task conflict, enlisting tough-minded committees to shoot down plans and candidates. At X, Google’s “moonshot factory,” there’s a rapid evaluation team that’s charged with rethinking proposals: members conduct independent assessments and only advance the ones that emerge as both audacious and achievable. In science, a challenge network is often a cornerstone of the peer-review process. We submit articles anonymously, and they’re reviewed blindly by independent experts. I’ll never forget the rejection letter I once received in which one of the reviewers encouraged me to go back and read the work of Adam Grant. Dude, I am Adam Grant. When I write a book, I like to enlist my own challenge network. I recruit a group of my most thoughtful critics and ask them to tear each chapter apart. I’ve learned that it’s important to consider their values along with their personalities—I’m looking for disagreeable people who are givers, not takers. 
Disagreeable givers often make the best critics: their intent is to elevate the work, not feed their own egos. They don’t criticize because they’re insecure; they challenge because they care. They dish out tough love.* Ernest Hemingway once said, “The most essential gift for a good writer is a built-in, shock-proof sh*t detector.” My challenge network is my sh*t detector. I think of it as a good fight club. The first rule: avoiding an argument is bad manners. Silence disrespects the value of your views and our ability to have a civil disagreement. Brad Bird lives by that rule. He has legendary arguments with his long-standing producer, John Walker. When making The Incredibles, they fought about every character detail, right down to their hair—from how receding the hairline should be on the superhero dad to whether the teenage daughter’s hair should be long and flowing. At one point, Brad wanted the baby to morph into goo, taking on a jellylike shape, but John put his foot down. It would be too difficult to animate, and they were too far behind schedule. “I’m just trying to herd you toward the finish,” John said, laughing. “I’m just trying to get us across the line, man.” Pounding his fist, Brad shot back: “I’m trying to get us across the line in first place.” Eventually John talked Brad out of it, and the goo was gone. “I love working with John, because he’ll give me the bad news straight to my face,” Brad says. “It’s good that we disagree. It’s good that we fight it out. It makes the stuff stronger.” Those fights have helped Brad win two Oscars—and made him a better learner and a better leader. For John’s part, he didn’t flat-out refuse to animate a gooey baby. He just told Brad he would have to wait a little bit. Sure enough, when they got around to releasing a sequel to The Incredibles fourteen years later, the baby got into a fight with a raccoon and transformed into goo. That scene might be the hardest I’ve ever seen my kids laugh. 
DON’T AGREE TO DISAGREE

Hashing out competing views has potential downsides—risks that need to be managed. On the first Incredibles film, a rising star named Nicole Grindle had managed the simulation of the hair, watching John and Brad’s interactions from a distance. When Nicole came in to produce the sequel with John, one of her concerns was that the volume of the arguments between the two highly accomplished leaders might drown out the voices of people who were less comfortable speaking up: newcomers, introverts, women, and minorities. It’s common for people who lack power or status to shift into politician mode, suppressing their dissenting views in favor of conforming to the HIPPO—the HIghest Paid Person’s Opinion. Sometimes they have no other choice if they want to survive. To make sure their desire for approval didn’t prevent them from introducing task conflict, Nicole encouraged new people to bring their divergent ideas to the table. Some voiced them directly to the group; others went to her for feedback and support. Although Nicole wasn’t a pirate, as she found herself advocating for different perspectives she became more comfortable challenging Brad on characters and dialogue. “Brad is still the ornery guy who first came to Pixar, so you have to be ready for a spirited debate when you put forward a contrary point of view.” The notion of a spirited debate captures something important about how and why good fights happen. If you watch Brad argue with his colleagues—or the pirates fight with one another—you can quickly see that the tension is intellectual, not emotional. The tone is vigorous and feisty rather than combative or aggressive. They don’t disagree just for the sake of it; they disagree because they care. 
“Whether you disagree loudly, or quietly yet persistently put forward a different perspective,” Nicole explains, “we come together to support the common goal of excellence—of making great films.” After seeing their interactions up close, I finally understood what had long felt like a contradiction in my own personality: how I could be highly agreeable and still cherish a good argument. Agreeableness is about seeking social harmony, not cognitive consensus. It’s possible to disagree without being disagreeable. Although I’m terrified of hurting other people’s feelings, when it comes to challenging their thoughts, I have no fear. In fact, when I argue with someone, it’s not a display of disrespect—it’s a sign of respect. It means I value their views enough to contest them. If their opinions didn’t matter to me, I wouldn’t bother. I know I have chemistry with someone when we find it delightful to prove each other wrong. Agreeable people don’t always steer clear of conflict. They’re highly attuned to the people around them and often adapt to the norms in the room. My favorite demonstration is an experiment by my colleagues Jennifer Chatman and Sigal Barsade. Agreeable people were significantly more accommodating than disagreeable ones—as long as they were in a cooperative team. When they were assigned to a competitive team, they acted just as disagreeably as their disagreeable teammates. That’s how working with Brad Bird influenced John Walker. John’s natural tendency is to avoid conflict: at restaurants, if the waiter brings him the wrong dish, he just goes ahead and eats it anyway. “But when I’m involved in something bigger than myself,” he observes, “I feel like I have an opportunity, a responsibility really, to speak up, speak out, debate. Fight like hell when the morning whistle blows, but go out for a beer after the one at five o’clock.” That adaptability was also visible in the Wright brothers’ relationship. In Wilbur, Orville had a built-in challenge network. 
Wilbur was known to be highly disagreeable: he was unfazed by other people’s opinions and had a habit of pouncing on anyone else’s idea the moment it was raised. Orville was known as gentle, cheerful, and sensitive to criticism. Yet those qualities seemed to vanish in his partnership with his brother. “He’s such a good scrapper,” Wilbur said. One sleepless night Orville came up with an idea to build a rudder that was movable rather than fixed. The next morning at breakfast, as he got ready to pitch the idea to Wilbur, Orville winked at a colleague of theirs, expecting Wilbur to go into challenge mode and demolish it. Much to his surprise, Wilbur saw the potential in the idea immediately, and it became one of their major discoveries. Disagreeable people don’t just challenge us to think again. They also make agreeable people comfortable arguing, too. Instead of fleeing from friction, our grumpy colleagues engage it directly. By making it clear that they can handle a tussle, they create a norm for the rest of us to follow. If we’re not careful, though, what starts as a scuffle can turn into a brawl. How can we avoid that slippery slope?

GETTING HOT WITHOUT GETTING MAD

A major problem with task conflict is that it often spills over into relationship conflict. One minute you’re disagreeing about how much seasoning to put on the Thanksgiving turkey, and the next minute you find yourself yelling “You smell!” Although the Wright brothers had a lifetime of experience discovering each other’s hot buttons, that didn’t mean they always kept their cool. Their last grand challenge before liftoff was their single hardest problem: designing a propeller. They knew their airplane couldn’t take flight without one, but the right kind didn’t exist. As they struggled with various approaches, they argued back and forth for hours at a time, often raising their voices. 
The feuding lasted for months as each took turns preaching the merits of his own solutions and prosecuting the other’s points. Eventually their younger sister, Katharine, threatened to leave the house if they didn’t stop fighting. They kept at it anyway, until one night it culminated in what might have been the loudest shouting match of their lives. Strangely, the next morning, they came into the shop and acted as if nothing had happened. They picked up the argument about the propeller right where they had left off—only now without the yelling. Soon they were both rethinking their assumptions and stumbling onto what would become one of their biggest breakthroughs. The Wright brothers were masters at having intense task conflict without relationship conflict. When they raised their voices, it reflected intensity rather than hostility. As their mechanic marveled, “I don’t think they really got mad, but they sure got awfully hot.” Experiments show that simply framing a dispute as a debate rather than as a disagreement signals that you’re receptive to considering dissenting opinions and changing your mind, which in turn motivates the other person to share more information with you. A disagreement feels personal and potentially hostile; we expect a debate to be about ideas, not emotions. Starting a disagreement by asking, “Can we debate?” sends a message that you want to think like a scientist, not a preacher or a prosecutor—and encourages the other person to think that way, too. The Wright brothers had the benefit of growing up in a family where disagreements were seen as productive and enjoyable. When arguing with others, though, they often had to go out of their way to reframe their behavior. “Honest argument is merely a process of mutually picking the beams and motes out of each other’s eyes so both can see clearly,” Wilbur once wrote to a colleague whose ego was bruised after a fiery exchange about aeronautics. 
Wilbur stressed that it wasn’t personal: he saw arguments as opportunities to test and refine their thinking. “I see that you are back at your old trick of giving up before you are half beaten in an argument. I feel pretty certain of my own ground but was anticipating the pleasure of a good scrap before the matter was settled. Discussion brings out new ways of looking at things.” When they argued about the propeller, the Wright brothers were making a common mistake. Each was preaching about why he was right and why the other was wrong. When we argue about why, we run the risk of becoming emotionally attached to our positions and dismissive of the other side’s. We’re more likely to have a good fight if we argue about how. When social scientists asked people why they favor particular policies on taxes, health care, or nuclear sanctions, they often doubled down on their convictions. Asking people to explain how those policies would work in practice—or how they’d explain them to an expert—activated a rethinking cycle. They noticed gaps in their knowledge, doubted their conclusions, and became less extreme; they were now more curious about alternative options. Psychologists find that many of us are vulnerable to an illusion of explanatory depth. Take everyday objects like a bicycle, a piano, or earbuds: how well do you understand them? People tend to be overconfident in their knowledge: they believe they know much more than they actually do about how these objects work. We can help them see the limits of their understanding by asking them to unpack the mechanisms. How do the gears on a bike work? How does a piano key make music? How do earbuds transmit sound from your phone to your ears? People are surprised by how much they struggle to answer those questions and quickly realize how little they actually know. That’s what happened to the Wright brothers after their yelling match. The next morning, the Wright brothers approached the propeller problem differently. 
Orville showed up at the shop first and told their mechanic that he had been wrong: they should design the propeller Wilbur’s way. Then Wilbur arrived and started arguing against his own idea, suggesting that Orville might be right. As they shifted into scientist mode, they focused less on why different solutions would succeed or fail, and more on how those solutions might work. Finally they identified problems with both of their approaches, and realized they were both wrong. “We worked out a theory of our own on the subject, and soon discovered,” Orville wrote, “that all the propellers built heretofore are all wrong.” He exclaimed that their new design was “all right (till we have a chance to test them down at Kitty Hawk and find out differently).” Even after building a better solution, they were still open to rethinking it. At Kitty Hawk, they found that it was indeed the right one. The Wright brothers had figured out that their airplane didn’t need a propeller. It needed two propellers, spinning in opposite directions, to function like a rotating wing. That’s the beauty of task conflict. In a great argument, our adversary is not a foil, but a propeller. With twin propellers spinning in divergent directions, our thinking doesn’t get stuck on the ground; it takes flight.

PART II
Interpersonal Rethinking
Opening Other People’s Minds

CHAPTER 5
Dances with Foes
How to Win Debates and Influence People

Exhausting someone in argument is not the same as convincing him.
—TIM KREIDER

At thirty-one, Harish Natarajan has won three dozen international debate tournaments. He’s been told it’s a world record. But his opponent today presents a unique challenge. Debra Jo Prectet is a prodigy hailing from Haifa, Israel. She’s just eight years old, and although she made her first foray into public debating only last summer, she’s been preparing for this moment for years. 
Debra has absorbed countless articles to accumulate knowledge, closely studied speechwriting to hone her clarity, and even practiced her delivery to incorporate humor. Now she’s ready to challenge the champion himself. Her parents are hoping she’ll make history. Harish was a wunderkind too. By the time he was eight, he was outmaneuvering his own parents in dinner-table debates about the Indian caste system. He went on to become the European debate champion and a grand finalist in the world debate championship, and coached the Filipino national school debate team at the world championship. I was introduced to Harish by an unusually bright former student who used to compete against him, and remembers having lost “many (likely all)” of their debates. Harish and Debra are facing off in San Francisco in February 2019 in front of a large crowd. They’ve been kept in the dark about the debate topic. When they walk onstage, the moderator announces the subject: should preschools be subsidized by the government? After just fifteen minutes of preparation, Debra will present her strongest arguments in favor of subsidies, and Harish will marshal his best case against them. Their goal is to win the audience over to their side on preschool subsidies, but their impact on me will be much broader: they’ll end up changing my view of what it takes to win a debate. Debra kicks off with a joke, drawing laughter from the crowd by telling Harish that although he may hold the world record in debate wins, he’s never debated someone like her. Then she goes on to summarize an impressive number of studies—citing her sources—about the academic, social, and professional benefits of preschool programs. For good measure, she quotes a former prime minister’s argument about preschool being a smart investment. Harish acknowledges the facts that Debra presented, but then makes his case that subsidizing preschools is not the appropriate remedy for the damage caused by poverty. 
He suggests that the issue should be evaluated on two grounds: whether preschool is currently underprovided and underconsumed, and whether it helps those who are the least fortunate. He argues that in a world full of trade-offs, subsidizing preschool is not the best use of taxpayer money. Going into the debate, 92 percent of the audience has already made up their minds. I’m one of them: it didn’t take me long to figure out where I stood on preschool subsidies. In the United States, public education is free from kindergarten through high school. I’m familiar with evidence that early access to education in the first few years of children’s lives may be even more critical to helping them escape poverty than anything they learn later. I believe education is a fundamental human right, like access to water, food, shelter, and health care. That puts me on Team Debra. As I watch the debate, her early arguments strike a chord. Here are some highlights:

Debra: Research clearly shows that a good preschool can help kids overcome the disadvantages often associated with poverty.

Data for the win! Be still, my beating heart.

Debra: You will possibly hear my opponent talk today about different priorities . . . he might say that subsidies are needed, but not for preschools. I would like to ask you, Mr. Natarajan . . . why don’t we examine the evidence and the data and decide accordingly?

If Harish has an Achilles’ heel, my former student has told me, it’s that his brilliant arguments aren’t always grounded in facts.

Harish: Let me start by examining the main claim . . . that if we believe preschools are good in principle, surely it is worth giving money to subsidize those—but I don’t think that is ever enough of a justification for subsidies.

Debra has clearly done her homework. She didn’t just nail Harish on data—she anticipated his counterargument.

Debra: The state budget is a big one, and there is room in it to subsidize preschools and invest in other fields. 
Therefore, the idea that there are more important things to spend on is irrelevant, because the different subsidies are not mutually exclusive.

Way to debunk Harish’s case for trade-offs. Bravo.

Harish: Maybe the state has the budget to do all the good things. Maybe the state has the budget to provide health care. Maybe it has the budget to provide welfare payments. Maybe it has the budget to provide running water as well as preschool. I would love to live in that world, but I don’t think that is the world we live in. I think we live in a world where there are real constraints on what governments can spend money on—and even if those are not real, those are nonetheless political.

D’oh! Valid point. Even if a program has the potential to pay for itself, it takes a lot of political capital to make it happen—capital that could be invested elsewhere.

Debra: Giving opportunities to the less fortunate should be a moral obligation of any human being, and it is a key role for the state. To be clear, we should find the funding for preschools and not rely on luck or market forces. This issue is too important to not have a safety net.

Yes! This is more than a political or an economic question. It’s a moral question.

Harish: I want to start by noting what [we] agree on. We agree that poverty is terrible. It is terrible when individuals do not have running water. It is terrible when . . . they are struggling to feed their family. It is terrible when they cannot get health care. . . . That is all terrible, and those are all things we need to address, and none of those are addressed just because you are going to subsidize preschool. Why is that the case?

Hmm. Can Debra argue otherwise?

Debra: Universal full-day preschool creates significant economic savings in health care as well as decreased crime, welfare dependence, and child abuse.

Harish: High-quality preschools will reduce crime. Maybe, but so would other measures in terms of crime prevention. 
Debra: High-quality preschool boosts high school graduation rates.

Harish: High-quality preschools can lead to huge improvements in individuals’ lives. Maybe, but I’m not sure if you massively increase the number of people going to preschool, they’re all gonna be the ones going to the high-quality preschools.

Uh-oh. Harish is right: there’s a risk that children from the poorest families will end up in the worst preschools. I’m starting to rethink my position.

Harish: Even when you subsidize preschools, it doesn’t mean that all individuals go. . . . The question is, who do you help? And the people you don’t help are those individuals who are the poorest. You give unfair and exaggerated gains to those individuals who are in the middle class.

Point taken. Since preschool won’t be free, the underprivileged still might not be able to afford it. Now I’m torn about where I stand. You’ve seen arguments from both sides. Before I tell you who won, consider your own position: what was your opinion of preschool subsidies going into the debate, and how many times did you end up rethinking that opinion? If you’re like me, you reconsidered your views multiple times. Changing your mind doesn’t make you a flip-flopper or a hypocrite. It means you were open to learning. Looking back, I’m disappointed in myself for forming an opinion before the debate even started. Sure, I’d read some research on early child development, but I was clueless about the economics of subsidies and the alternative ways those funds could be invested. Note to self: on my next trip to the top of Mount Stupid, remember to take a selfie. In the audience poll after the debate, the number of undecided people was the same, but the balance of opinion shifted away from Debra’s position, toward Harish’s. Support for preschool subsidies dropped from 79 to 62 percent, and opposition more than doubled from 13 to 30 percent. 
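The poll figures quoted in the text hang together: with 92 percent of the audience decided beforehand, the undecided share works out to 8 percent both before and after, and 13 to 30 percent is indeed more than double. A quick sanity check (the dictionaries below are just my transcription of the numbers in the text):

```python
# Audience poll figures quoted in the chapter, in percent of the audience.
pre = {"support": 79, "oppose": 13}    # before the debate
post = {"support": 62, "oppose": 30}   # after the debate

# Undecided share = whatever is left after support and opposition.
undecided_pre = 100 - sum(pre.values())
undecided_post = 100 - sum(post.values())

print(undecided_pre, undecided_post)        # 8 8 -> undecided share unchanged
print(post["oppose"] > 2 * pre["oppose"])   # True -> opposition more than doubled
```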
Debra not only had more data, better evidence, and more evocative imagery—she had the audience on her side going into the debate. Yet Harish convinced a number of us to rethink our positions. How did he do it, and what can we learn from him about the art of debate? This section of the book is about convincing other people to rethink their opinions. When we’re trying to persuade people, we frequently take an adversarial approach. Instead of opening their minds, we effectively shut them down or rile them up. They play defense by putting up a shield, play offense by preaching their perspectives and prosecuting ours, or play politics by telling us what we want to hear without changing what they actually think. I want to explore a more collaborative approach—one in which we show more humility and curiosity, and invite others to think more like scientists. THE SCIENCE OF THE DEAL A few years ago a former student named Jamie called me for advice on where to go to business school. Since she was already well on her way to building a successful career, I told her it was a waste of time and money. I walked her through the lack of evidence that a graduate degree would make a tangible difference in her future, and the risk that she’d end up overqualified and underexperienced. When she insisted that her employer expected an MBA for promotions, I told her that I knew of exceptions and pointed out that she probably wouldn’t spend her whole career at that firm anyway. Finally, she hit back: “You’re a logic bully!” A what? “A logic bully,” Jamie repeated. “You just overwhelmed me with rational arguments, and I don’t agree with them, but I can’t fight back.” At first I was delighted by the label. It felt like a solid description of one of my roles as a social scientist: to win debates with the best data. Then Jamie explained that my approach wasn’t actually helpful. The more forcefully I argued, the more she dug in her heels. 
Suddenly I realized I had instigated that same kind of resistance many times before.

[Cartoon: David Sipress/The New Yorker Collection/The Cartoon Bank; © Condé Nast]

Growing up, I was taught by my karate sensei never to start a fight unless I was prepared to be the only one standing at the end. That’s how I approached debates at work and with friends: I thought the key to victory was to go into battle armed with airtight logic and rigorous data. The harder I attacked, though, the harder my opponents fought back. I was laser-focused on convincing them to accept my views and rethink theirs, but I was coming across like a preacher and a prosecutor. Although those mindsets sometimes motivated me to persist in making my points, I often ended up alienating my audience. I was not winning. For centuries, debating has been prized as an art form, but there’s now a growing science of how to do it well. In a formal debate your goal is to change the mind of your audience. In an informal debate, you’re trying to change the mind of your conversation partner. That’s a kind of negotiation, where you’re trying to reach an agreement about the truth. To build my knowledge and skills about how to win debates, I studied the psychology of negotiations and eventually used what I’d learned to teach bargaining skills to leaders across business and government. I came away convinced that my instincts—and what I’d learned in karate—were dead wrong. A good debate is not a war. It’s not even a tug-of-war, where you can drag your opponent to your side if you pull hard enough on the rope. It’s more like a dance that hasn’t been choreographed, negotiated with a partner who has a different set of steps in mind. If you try too hard to lead, your partner will resist. If you can adapt your moves to hers, and get her to do the same, you’re more likely to end up in rhythm. In a classic study, a team of researchers led by Neil Rackham examined what expert negotiators do differently. 
They recruited one group of average negotiators and another group of highly skilled ones, who had significant track records of success and had been rated as effective by their counterparts. To compare the participants’ techniques, they recorded both groups doing labor and contract negotiations. In a war, our goal is to gain ground rather than lose it, so we’re often afraid to surrender a few battles. In a negotiation, agreeing with someone else’s argument is disarming. The experts recognized that in their dance they couldn’t stand still and expect the other person to make all the moves. To get in harmony, they needed to step back from time to time. One difference was visible before anyone even arrived at the bargaining table. Prior to the negotiations, the researchers interviewed both groups about their plans. The average negotiators went in armed for battle, hardly taking note of any anticipated areas of agreement. The experts, in contrast, mapped out a series of dance steps they might be able to take with the other side, devoting more than a third of their planning comments to finding common ground. As the negotiators started discussing options and making proposals, a second difference emerged. Most people think of arguments as being like a pair of scales: the more reasons we can pile up on our side, the more it will tip the balance in our favor. Yet the experts did the exact opposite: They actually presented fewer reasons to support their case. They didn’t want to water down their best points. As Rackham put it, “A weak argument generally dilutes a strong one.” The more reasons we put on the table, the easier it is for people to discard the shakiest one. Once they reject one of our justifications, they can easily dismiss our entire case. That happened regularly to the average negotiators: they brought too many different weapons to battle. 
They lost ground not because of the strength of their most compelling point, but because of the weakness of their least compelling one. These habits led to a third contrast: the average negotiators were more likely to enter into defend-attack spirals. They dismissively shot down their opponents’ proposals and doubled down on their own positions, which prevented both sides from opening their minds. The skilled negotiators rarely went on offense or defense. Instead, they expressed curiosity with questions like “So you don’t see any merit in this proposal at all?” Questions were the fourth difference between the two groups. Of every five comments the experts made, at least one ended in a question mark. They appeared less assertive, but much like in a dance, they led by letting their partners step forward. Recent experiments show that having even one negotiator who brings a scientist’s level of humility and curiosity improves outcomes for both parties, because she will search for more information and discover ways to make both sides better off. She isn’t telling her counterparts what to think. She’s asking them to dance. Which is exactly what Harish Natarajan does in a debate. DANCING TO THE SAME BEAT Since the audience started out favoring preschool subsidies, there was more room for change in Harish’s direction—but he also had the more difficult task of advocating for the unpopular position. He opened the audience’s mind by taking a page out of the playbook of expert negotiators. Harish started by emphasizing common ground. When he took the stage for his rebuttal, he immediately drew attention to his and Debra’s areas of agreement. “So,” he began, “I think we disagree on far less than it may seem.” He called out their alignment on the problem of poverty—and on the validity of some of the studies—before objecting to subsidies as a solution. We won’t have much luck changing other people’s minds if we refuse to change ours. 
We can demonstrate openness by acknowledging where we agree with our critics and even what we’ve learned from them. Then, when we ask what views they might be willing to revise, we’re not hypocrites.

Convincing other people to think again isn’t just about making a good argument—it’s about establishing that we have the right motives in doing so. When we concede that someone else has made a good point, we signal that we’re not preachers, prosecutors, or politicians trying to advance an agenda. We’re scientists trying to get to the truth. “Arguments are often far more combative and adversarial than they need to be,” Harish told me. “You should be willing to listen to what someone else is saying and give them a lot of credit for it. It makes you sound like a reasonable person who is taking everything into account.” Being reasonable literally means that we can be reasoned with, that we’re open to evolving our views in light of logic and data.

So in the debate with Harish, why did Debra neglect to do that—why did she overlook common ground? It’s not because Debra is eight years old. It’s because she isn’t human.

Debra Jo Prectet is an anagram I invented. Her official name is Project Debater, and she’s a machine. More specifically, an artificial intelligence developed by IBM to do for debate what Watson did for Jeopardy! They first dreamed the idea up in 2011 and started working intensively on it in 2014. Just a few years later, Project Debater had developed the remarkable ability to conduct an intelligent debate in public, complete with facts, coherent sentences, and even counterarguments. Her knowledge corpus consists of 400 million articles, largely from credible newspapers and magazines, and her claim detection engine is designed to locate key arguments, identify their boundaries, and weigh the evidence.
For any debate topic, she can instantaneously search her knowledge graph for relevant data points, mold them into a logical case, and deliver it clearly—even entertainingly—in a female voice within the time constraints. Her first words in the preschool subsidy debate were, “Greetings, Harish. I’ve heard you hold the world record in debate competition wins against humans, but I suspect you’ve never debated a machine. Welcome to the future.” Of course, it’s possible that Harish won because the audience was biased against the computer and rooting for the human. It’s worth noting, though, that Harish’s approach in that debate is the same one that he’s used to defeat countless humans on international stages. What amazes me is that the computer was able to master multiple complex capabilities while completely missing this crucial one. After studying 10 billion sentences, a computer was able to say something funny—a skill that’s normally thought to be confined to sentient beings with high levels of social and emotional intelligence. The computer had learned to make a logical argument and even anticipate the other side’s counterargument. Yet it hadn’t learned to agree with elements of the other side’s argument, apparently because that behavior was all too rarely deployed across 400 million articles by humans. They were usually too busy preaching their arguments, prosecuting their enemies, or politicking for audience support to grant a valid point from the other side. When I asked Harish how to improve at finding common ground, he offered a surprisingly practical tip. Most people immediately start with a straw man, poking holes in the weakest version of the other side’s case. He does the reverse: he considers the strongest version of their case, which is known as the steel man. A politician might occasionally adopt that tactic to pander or persuade, but like a good scientist, Harish does it to learn. 
Instead of trying to dismantle the argument that preschool is good for kids, Harish accepted that the point was valid, which allowed him to relate to his opponent’s perspective—and to the audience’s. Then it was perfectly fair and balanced for him to express his concerns about whether a subsidy would give the most underprivileged kids access to preschool.

Drawing attention to common ground and avoiding defend-attack spirals weren’t the only ways in which Harish resembled expert negotiators. He was also careful not to come on too strong.

DON’T STEP ON THEIR TOES

Harish’s next advantage stemmed from one of his disadvantages. He would never have access to as many facts as the computer. When the audience was polled afterward about who taught them more, the overwhelming majority said they learned more from the computer than from Harish. But it was Harish who succeeded in swaying their opinions. Why?

The computer piled on study after study to support a long list of reasons in favor of preschool subsidies. Like a skilled negotiator, Harish focused on just two reasons against them. He knew that making too many points could come at the cost of developing, elaborating, and reinforcing his best ones. “If you have too many arguments, you’ll dilute the power of each and every one,” he told me. “They are going to be less well explained, and I don’t know if any of them will land enough—I don’t think the audience will believe them to be important enough. Most top debaters aren’t citing a lot of information.”

Is this always the best way to approach a debate? The answer is—like pretty much everything else in social science—it depends. The ideal number of reasons varies from one circumstance to another. There are times when preaching and prosecuting can make us more persuasive.
Research suggests that the effectiveness of these approaches hinges on three key factors: how much people care about the issue, how open they are to our particular argument, and how strong-willed they are in general. If they’re not invested in the issue or they’re receptive to our perspective, more reasons can help: people tend to see quantity as a sign of quality. The more the topic matters to them, the more the quality of reasons matters. It’s when audiences are skeptical of our view, have a stake in the issue, and tend to be stubborn that piling on justifications is most likely to backfire. If they’re resistant to rethinking, more reasons simply give them more ammunition to shoot our views down. It’s not just about the number of reasons, though. It’s also how they fit together. A university once approached me to see if I could bring in donations from alumni who had never given a dime. My colleagues and I ran an experiment testing two different messages meant to convince thousands of resistant alumni to give. One message emphasized the opportunity to do good: donating would benefit students, faculty, and staff. The other emphasized the opportunity to feel good: donors would enjoy the warm glow of giving. The two messages were equally effective: in both cases, 6.5 percent of the stingy alumni ended up donating. Then we combined them, because two reasons are better than one. Except they weren’t. When we put the two reasons together, the giving rate dropped below 3 percent. Each reason alone was more than twice as effective as the two combined. The audience was already skeptical. When we gave them different kinds of reasons to donate, we triggered their awareness that someone was trying to persuade them—and they shielded themselves against it. A single line of argument feels like a conversation; multiple lines of argument can become an onslaught. The audience tuned out the preacher and summoned their best defense attorney to refute the prosecutor. 
As important as the quantity and quality of reasons might be, the source matters, too. And the most convincing source is often the one closest to your audience. A student in one of my classes, Rachel Breuhaus, noticed that although top college basketball teams have rabid fans, there are usually empty seats in their arenas. To study strategies for motivating more fans to show up, we launched an experiment in the week before an upcoming game targeting hundreds of season ticket holders. When left to their own devices, 77 percent of these supposedly die-hard fans actually made it to the game. We decided that the most persuasive message would come from the team itself, so we sent fans an email with quotes from players and coaches about how part of the home-court advantage stems from the energy of a packed house of cheering fans. It had no effect whatsoever: attendance in that group was 76 percent. What did move the needle was an email with a different approach. We simply asked fans one question: are you planning to attend? Attendance climbed to 85 percent. The question gave fans the freedom to make their own case for going. Psychologists have long found that the person most likely to persuade you to change your mind is you. You get to pick the reasons you find most compelling, and you come away with a real sense of ownership over them. That’s where Harish’s final edge came in. In every round he posed more questions to contemplate. The computer spoke in declarative sentences, asking just a single question in the opening statement—and directing it at Harish, rather than at the audience. In his opening, Harish asked six different questions for the audience to ponder. 
Within the first minute, he asserted that just because preschools are good doesn’t mean that they should be funded by the government, and then inquired, “Why is that the case?” He went on to ask whether preschools were underprovided, whether they did help the most disadvantaged—and then why they didn’t, why they were so costly, and who they actually helped instead.

Taken together, these techniques increase the odds that during a disagreement, other people will abandon an overconfidence cycle and engage in a rethinking cycle. When we point out that there are areas where we agree and acknowledge that they have some valid points, we model confident humility and encourage them to follow suit. When we support our argument with a small number of cohesive, compelling reasons, we encourage them to start doubting their own opinion. And when we ask genuine questions, we leave them intrigued to learn more. We don’t have to convince them that we’re right—we just need to open their minds to the possibility that they might be wrong. Their natural curiosity might do the rest.

That said, these steps aren’t always enough. No matter how nicely we ask, other people don’t always want to dance. Sometimes they’re so attached to their beliefs that the mere suggestion of getting in sync feels like an ambush. What do we do then?

DR. JEKYLL AND MR. HOSTILE

Some years ago, a Wall Street firm brought me in to consult on a project to attract and retain junior analysts and associates. After two months of research I submitted a report with twenty-six data-driven recommendations. In the middle of my presentation to the leadership team, one of the members interrupted and asked, “Why don’t we just pay them more?” I told him money alone probably wouldn’t make a difference. Many studies across a range of industries have shown that once people are earning enough to meet their basic needs, paying them more doesn’t stop them from leaving bad jobs and bad bosses.
The executive started arguing with me: “That’s not what I’ve found in my experience.” I fired back in prosecutor mode: “Yes, that’s why I brought you randomized, controlled experiments with longitudinal data: to learn rigorously from many people’s experiences, not idiosyncratically from yours.” The executive pushed back, insisting that his company was different, so I rattled off some basic statistics from his own employees. In surveys and interviews, a grand total of zero had even mentioned compensation. They were already well paid (read: overpaid), and if that could have solved the problem, it already would have.* But the executive still refused to budge. Finally I became so exasperated that I did something out of character. I shot back, “I’ve never seen a group of smart people act so dumb.”

In the hierarchy of disagreement created by computer scientist Paul Graham, the highest form of argument is refuting the central point, and the lowest is name-calling. In a matter of seconds I’d devolved from logic bully to playground bully. If I could do that session over, I’d start with common ground and fewer data points. Instead of attacking their beliefs with my research, I’d ask them what would open their minds to my data.

A few years later, I had a chance to test that approach. During a keynote speech on creativity, I cited evidence that Beethoven and Mozart didn’t have higher hit rates than some of their peers; they generated a larger volume of work, which gave them more shots at greatness. A member of the audience interrupted. “Bullsh*t!” he shouted. “You’re disrespecting the great masters of music. You’re totally ignorant—you don’t know what you’re talking about!” Instead of reacting right then, I waited a few minutes until a scheduled break and then made my way to my heckler.

Me: You’re welcome to disagree with the data, but I don’t think that’s a respectful way to express your opinion. It’s not how I was trained to have an intellectual debate. Were you?
Music man: Well, no . . . I just think you’re wrong.

Me: It’s not my opinion—it’s the independent finding of two different social scientists. What evidence would change your mind?

Music man: I don’t believe you can quantify a musician’s greatness, but I’d like to see the research.

When I sent him the study, he responded with an apology. I don’t know if I succeeded in changing his mind, but I had done a better job of opening it. When someone becomes hostile, if you respond by viewing the argument as a war, you can either attack or retreat. If instead you treat it as a dance, you have another option—you can sidestep. Having a conversation about the conversation shifts attention away from the substance of the disagreement and toward the process for having a dialogue. The more anger and hostility the other person expresses, the more curiosity and interest you show. When someone is losing control, your tranquility is a sign of strength. It takes the wind out of their emotional sails. It’s pretty rare for someone to respond by screaming “SCREAMING IS MY PREFERRED MODE OF COMMUNICATION!”

This is a fifth move that expert negotiators made more often than average negotiators. They were more likely to comment on their feelings about the process and test their understanding of the other side’s feelings:

I’m disappointed in the way this discussion has unfolded—are you frustrated with it?

I was hoping you’d see this proposal as fair—do I understand correctly that you don’t see any merit in this approach at all?

Honestly, I’m a little confused by your reaction to my data—if you don’t value the kind of work I do, why did you hire me?

In a heated argument, you can always stop and ask, “What evidence would change your mind?” If the answer is “nothing,” then there’s no point in continuing the debate. You can lead a horse to water, but you can’t make it think.

THE STRENGTH OF WEAK OPINIONS

When we hit a brick wall in a debate, we don’t have to stop talking altogether.
“Let’s agree to disagree” shouldn’t end a discussion. It should start a new conversation, with a focus on understanding and learning rather than arguing and persuading. That’s what we’d do in scientist mode: take the long view and ask how we could have handled the debate more effectively. Doing so might land us in a better position to make the same case to a different person—or to make a different case to the same person on a different day. When I asked one of the Wall Street executives for advice on how to approach debates differently in the future, he suggested expressing less conviction. I could easily have countered that I was uncertain about which of my twenty-six recommendations might be relevant. I could also have conceded that although money didn’t usually solve the problem, I’d never seen anyone test the effect of million-dollar retention bonuses. That would be a fun experiment to run, don’t you think? A few years ago, I argued in my book Originals that if we want to fight groupthink, it helps to have “strong opinions, weakly held.” Since then I’ve changed my mind—I now believe that’s a mistake. If we hold an opinion weakly, expressing it strongly can backfire. Communicating it with some uncertainty signals confident humility, invites curiosity, and leads to a more nuanced discussion. Research shows that in courtrooms, expert witnesses and deliberating jurors are more credible and more persuasive when they express moderate confidence, rather than high or low confidence.* And these principles aren’t limited to debates—they apply in a wide range of situations where we’re advocating for our beliefs or even for ourselves. In 2014, a young woman named Michele Hansen came across a job opening for a product manager at an investment company. She was excited about the position but she wasn’t qualified for it: she had no background in finance and lacked the required number of years of experience. 
If you were in her shoes and you decided to go for it, what would you say in your cover letter? The natural starting point would be to emphasize your strengths and downplay your weaknesses. As Michael Scott deadpanned on The Office, “I work too hard, I care too much, and sometimes I can be too invested in my job.” But Michele Hansen did the opposite, taking a page out of the George Costanza playbook on Seinfeld: “My name is George. I’m unemployed and I live with my parents.”

Rather than trying to hide her shortcomings, Michele opened with them. “I’m probably not the candidate you’ve been envisioning,” her cover letter began. “I don’t have a decade of experience as a Product Manager nor am I a Certified Financial Planner.” After establishing the drawbacks of her case, she emphasized a few reasons to hire her anyway:

But what I do have are skills that can’t be taught. I take ownership of projects far beyond my pay grade and what is in my defined scope of responsibilities. I don’t wait for people to tell me what to do and go seek for myself what needs to be done. I invest myself deeply in my projects and it shows in everything I do, from my projects at work to my projects that I undertake on my own time at night. I’m entrepreneurial, I get things done, and I know I would make an excellent right hand for the co-founder leading this project. I love breaking new ground and starting with a blank slate. (And any of my previous bosses would be able to attest to these traits.)

A week later a recruiter contacted her for a phone screen, and then she had another phone screen with the team. On the calls, she asked about experiments they’d run recently that had surprised them. The question itself surprised the team—they ended up talking about times when they were sure they were right but were later proven wrong. Michele got the job, thrived, and was promoted to lead product development.
This success isn’t unique to her: there’s evidence that people are more interested in hiring candidates who acknowledge legitimate weaknesses as opposed to bragging or humblebragging. Even after recognizing that she was fighting an uphill battle, Michele didn’t go on defense or offense. She didn’t preach her qualifications or prosecute the problems with the job description. By agreeing with the argument against her in her cover letter, she preempted knee-jerk rejection, demonstrating that she was self-aware enough to discern her shortcomings and secure enough to admit them. An informed audience is going to spot the holes in our case anyway. We might as well get credit for having the humility to look for them, the foresight to spot them, and the integrity to acknowledge them. By emphasizing a small number of core strengths, Michele avoided argument dilution, focusing attention on her strongest points. And by showing curiosity about times the team had been wrong, she may have motivated them to rethink their criteria. They realized that they weren’t looking for a set of skills and credentials—they were looking to hire a human being with the motivation and ability to learn. Michele knew what she didn’t know and had the confidence to admit it, which sent a clear signal that she could learn what she needed to know. By asking questions rather than thinking for the audience, we invite them to join us as a partner and think for themselves. If we approach an argument as a war, there will be winners and losers. If we see it more as a dance, we can begin to choreograph a way forward. By considering the strongest version of an opponent’s perspective and limiting our responses to our few best steps, we have a better chance of finding a rhythm. 
CHAPTER 6

Bad Blood on the Diamond

Diminishing Prejudice by Destabilizing Stereotypes

I hated the Yankees with all my heart, even to the point of having to confess in my first holy confession that I wished harm to others—namely that I wished various New York Yankees would break arms, legs and ankles. . . .

—DORIS KEARNS GOODWIN

One afternoon in Maryland in 1983, Daryl Davis arrived at a lounge to play the piano at a country music gig. It wasn’t his first time being the only Black man in the room. Before the night was out, it would be his first time having a conversation with a white supremacist.

After the show, an older white man in the audience walked up to Daryl and told him that he was astonished to see a Black musician play like Jerry Lee Lewis. Daryl replied that he and Lewis were, in fact, friends, and that Lewis himself had acknowledged that his style was influenced by Black musicians. Although the man was skeptical, he invited Daryl to sit down for a drink. Soon the man was admitting that he’d never had a drink with a Black person before. Eventually he explained to Daryl why. He was a member of the Ku Klux Klan, the white supremacist hate group that had been murdering African Americans for over a century and had lynched a man just two years earlier.

If you found yourself sitting down with someone who hated you and all people who shared your skin color, your instinctive options might be fight, flight, or freeze—and rightfully so. Daryl had a different reaction: he burst out laughing. When the man pulled out his KKK membership card to show he wasn’t joking, Daryl returned to a question that had been on his mind since he was ten years old. In the late 1960s, he was marching in a Cub Scout parade when white spectators started throwing cans, rocks, and bottles at him.
It was the first time he remembers facing overt racism, and although he could justifiably have gotten angry, he was bewildered: “How can you hate me when you don’t even know me?”

At the end of the conversation, the Klansman handed Daryl his phone number and asked if he would call him whenever he was playing locally. Daryl followed up, and the next month the man showed up with a bunch of his friends to see Daryl perform. Over time a friendship grew, and the man ended up leaving the KKK. That was a turning point in Daryl’s life, too. It wasn’t long before Daryl was sitting down with Imperial Wizards and Grand Dragons—the Klan’s highest officers—to ask his question. Since then, Daryl has convinced many white supremacists to leave the KKK and abandon their hatred. I wanted to understand how that kind of change happens—how to break overconfidence cycles that are steeped in stereotypes and prejudice about entire groups of people. Strangely enough, my journey started at a baseball game.

HATE ME OUT AT THE BALLGAME

“Yankees suck! Yankees suck!”

It was a summer night at Fenway Park, my first and only time at a Boston Red Sox baseball game. In the seventh inning, without warning, 37,000 people erupted into a chant. The entire stadium was dissing the New York Yankees in perfect harmony. I knew the two teams had a century-long rivalry, widely viewed as the most heated in all of American professional sports. I took it for granted that the Boston fans would root against the Yankees. I just didn’t expect it to happen that day, because the Yankees weren’t even there. The Red Sox were playing against the Oakland A’s. The Boston fans were booing a team that was hundreds of miles away. It was as if Burger King fans were going head-to-head against Wendy’s in a taste test and started chanting “McDonald’s sucks!” I started to wonder if Red Sox fans hate the Yankees more than they love their own team.
Boston parents have been known to teach their kids to flip the bird at the Yankees and detest anything in pinstripes, and YANKEES SUCK is apparently among the most popular T-shirts in Boston history. When asked how much money it would take to get them to taunt their own team, Red Sox fans requested an average of $503. To root for the Yankees, they wanted even more: $560. The feelings run so deep that neuroscientists can watch them light up people’s minds: when Red Sox fans see the Yankees fail, they show immediate activation in brain regions linked to reward and pleasure. Those feelings extend well beyond Boston: in a 2019 analysis of tweets, the Yankees were the most hated baseball team in twenty-eight of the fifty U.S. states, which may explain the popularity of anti-Yankees T-shirts.

I recently called a friend who’s a die-hard Red Sox fan with a simple question: what would it take to get him to root for the Yankees? Without pausing, he said, “If they were playing Al Qaeda . . . maybe.”

It’s one thing to love your team. It’s another to hate your rivals so much that you’d consider rooting for terrorists to crush them. If you despise a particular sports team—and its fans—you’re harboring some strong opinions about a group of people. Those beliefs are stereotypes, and they often spill over into prejudice. The stronger your attitudes become, the less likely you are to rethink them.

Rivalries aren’t unique to sports. A rivalry exists whenever we reserve special animosity for a group we see as competing with us for resources or threatening our identities. In business, the rivalry between footwear companies Puma and Adidas was so intense that for generations, families self-segregated based on their allegiance to the brands—they went to different bakeries, pubs, and shops, and even refused to date people who worked for the rival firm.
In politics, you probably know some Democrats who view Republicans as being greedy, ignorant, heartless cretins, and some Republicans who regard Democrats as lazy, dishonest, hypersensitive snowflakes. As stereotypes stick and prejudice deepens, we don’t just identify with our own group; we disidentify with our adversaries, coming to define who we are by what we’re not. We don’t just preach the virtues of our side; we find self-worth in prosecuting the vices of our rivals. When people hold prejudice toward a rival group, they’re often willing to do whatever it takes to elevate their own group and undermine their rivals—even if it means doing harm or doing wrong. We see people cross those lines regularly in sports rivalries.* Aggression extends well beyond the playing field: from Barcelona to Brazil, fistfights frequently break out between soccer fans. Cheating scandals are rampant, too, and they aren’t limited to athletes or coaches. When students at The Ohio State University were paid to participate in an experiment, they learned that if they were willing to lie to a student from a different school, their own pay would double and the other student’s compensation would be cut in half. Their odds of lying quadrupled if the student attended the University of Michigan—their biggest rival—rather than Berkeley or Virginia. Why do people form stereotypes about rival groups in the first place, and what does it take to get them to rethink them?