Chapter 3 Summary
Last Updated on October 6, 2022, by eNotes Editorial. Word Count: 545
In chapter 3, Grant outlines an ethically dubious study conducted by Henry Murray, a psychologist who served as a lieutenant colonel in the Office of Strategic Services during World War II. In 1959, Murray recruited Harvard sophomores and allotted them a month to write out their personal ideals and philosophies of life. He then invited these participants to debate their personal philosophies with a law student, whose task, unbeknownst to them, was to launch an aggressive attack on their ideals. Decades later, many of the participants described the experiment as traumatic, while others remembered it as enjoyable or even fun. In the following paragraphs, Grant explores these varied reactions to being proven wrong.
He introduces the concept of the totalitarian ego, whose job is to safeguard our identities and self-image when our core beliefs are threatened. Neuroscientists have found that when these beliefs are challenged, the amygdala triggers a "fight-or-flight" response. The totalitarian ego responds to such destabilization by preaching, politicking, or prosecuting, sometimes all three at once. Grant maintains that this traps us in a dangerous overconfidence cycle, one reinforced by filter bubbles and echo chambers.
Grant recounts his encounter with Nobel Prize–winning psychologist Daniel Kahneman, who, when proven wrong, responded with genuine delight. Kahneman explained that his attachment to his ideas was only provisional, an attitude Grant identifies as integral to the rethinking cycle. Grant holds that one must detach one's present self from one's past self and detach one's opinions from one's self-identity. He concludes that the key to overcoming the totalitarian ego and continually adapting our beliefs, ideas, and ideologies is to prioritize one's values over one's beliefs.
In 2015, Jean-Pierre Beugoms, a regular in international forecasting tournaments, calculated that Donald Trump had a sixty-eight percent chance of winning the 2016 US Republican presidential primary, a prediction that, at the time, stunned his fellow forecasters. What sets Beugoms apart as one of the world's top election forecasters is that he thinks like a scientist, constantly rethinking his beliefs and assumptions. In fact, Grant explains, research has found that one of the most telling indicators of forecasters' success is how frequently they engage in rethinking.
Although Beugoms correctly predicted that Trump would win the Republican primary, he fell victim to desirability bias and changed his forecast to Clinton as the presidential election neared. Instead of getting defensive about this decision, Beugoms was able to admit that a Trump win had been too personally unpleasant for him to forecast. Grant finds that those who can treat their mistakes this lightly often have a steadfast commitment to reaching the right answer. Indeed, he explains that the willingness to backtrack and reroute is crucial to avoiding being wrong in the long run.
In the last section, Grant returns to one participant in Murray's Harvard study who called it "a highly unpleasant experience." Years later, this participant would come to be known as the Unabomber, the domestic terrorist who mailed bombs and published a manifesto decrying the Industrial Revolution and its consequences. Grant maintains that such a level of conviction in one's beliefs is disturbing and that the Unabomber would have benefited from the ability to accept that he might be wrong.