Right Now | Cognitive Biases
Willing to War
Our minds favor hawks over doves, argues a recent Foreign Policy article. Past psychological research has shown that when it comes to the question of war, cognitive biases in the way people process information and evaluate risk predispose political leaders to military action over diplomatic solutions. According to the article’s authors—graduate student in government Jonathan Renshon, author of Why Leaders Choose War: The Psychology of Prevention, and Daniel Kahneman, professor of psychology at Princeton—such impulses “incline national leaders to exaggerate the evil intentions of adversaries, to misjudge how adversaries perceive them, to be overly sanguine when hostilities start, and overly reluctant to make necessary concessions in negotiations. In short, these biases have the effect of making wars more likely to begin and more difficult to end.”
Our “aversion to certain losses” offers a prime example cited in the article. Given a choice between (a) a sure loss of $890, or (b) a 10 percent chance of no loss with a 90 percent chance of a $1,000 loss, most of us will gamble on (b). The sure loss may be the better bet, but it triggers what psychologists call a “predictable error” in how we interpret situations. “The Iraq debate is a great real-world corollary,” explains Renshon. “It tends to be framed as, ‘Well, if we pull out, we’re definitely losing and we’ll have to admit defeat. But if we send in more troops, we still have the possibility of some sort of victory in the end.’” For those in favor of more troops, the potential payoff is worth the greater degree of risk.
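The arithmetic behind the example is worth making explicit. A minimal sketch (not from the article, just the expected-value calculation implied by its numbers):

```python
def expected_loss(outcomes):
    """Expected loss given (probability, loss) pairs."""
    return sum(p * loss for p, loss in outcomes)

# Option (a): a certain loss of $890.
sure_loss = expected_loss([(1.0, 890)])
# Option (b): 10% chance of losing nothing, 90% chance of losing $1,000.
gamble = expected_loss([(0.10, 0), (0.90, 1000)])

print(sure_loss)  # 890.0
print(gamble)     # 900.0
```

The gamble costs $900 on average, $10 more than the sure loss, yet most people prefer it: the certainty of the loss, not its size, drives the choice.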
Because most people process gains and losses differently, cognitive bias also influences military negotiations. According to the findings of prospect theory (for which Kahneman received a Nobel Prize in economics), our aversion to a loss of $10 is greater than the happiness we expect from a gain of $10. “It has around 2.5 times more impact psychologically,” notes Renshon: to compensate for the subjective impact of a $10 loss requires a gain of $25. If you’re negotiating for missile reductions, he explains, “An equivalent reduction on both sides won’t seem fair to either side, because the loss of 100 missiles has more impact than the comparative gain that you feel from the other side cutting their missiles by 100.” The resultant “concession aversion” makes negotiations much more difficult.
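The “concession aversion” Renshon describes can be sketched numerically. The following is an illustrative toy model, not the article’s own: it assumes a simplified linear prospect-theory value function and uses the loss-aversion coefficient of roughly 2.5 that Renshon cites.

```python
# Assumed coefficient: losses weigh ~2.5x as much as equal gains (per the article).
LOSS_AVERSION = 2.5

def subjective_value(x):
    """Felt value of a gain (x > 0) or loss (x < 0), linear for simplicity."""
    return x if x >= 0 else LOSS_AVERSION * x

# An "equal" missile reduction of 100 on each side, as each side experiences it:
felt_loss = subjective_value(-100)   # giving up your own 100 missiles
felt_gain = subjective_value(+100)   # the adversary cutting 100 of theirs
print(felt_loss + felt_gain)         # -150.0
```

On these assumptions, a numerically even swap feels like a net loss of 150 “value units” to each side, which is why equivalent concessions can strike both parties as unfair.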
One of the most common cognitive failures is also the most pernicious in conflict situations. People attribute the behavior of others to some cause, either to the context or to their disposition, says Renshon. Yet “even when you know for a fact that behavior is dictated by situational constraints, you still tend to overattribute behavior to disposition.” To illustrate, he cites a famous 1960s experiment involving three groups of students: the first group was required to give pro-Castro speeches, the second group to give anti-Castro speeches, and the third to act as the audience. Asked to assess the real political views of the speakers, audience members overwhelmingly rated pro-Castro speakers as more leftist, even though they knew the positions had been randomly assigned.
This bias is so widespread and so empirically “robust” that social psychologists have dubbed it the fundamental attribution error. “In situations when you don’t know for sure what the behavior is caused by,” says Renshon, “this bias is even more pronounced.” During tense interactions between representatives of foreign governments, for instance, policymakers often attribute the aggressive behavior of the other side to deep hostilities, while excusing their own provocations as the result of being “pushed into a corner.” Renshon and Kahneman point to the historic example of World War I, noting that “the leaders of every one of the nations that would soon be at war perceived themselves as significantly less hostile than their adversaries.”
The same leaders predicted quick and easy victory. Such overconfidence, explains Renshon, arises from three optimism biases: exaggerating one’s strengths, possessing an “illusion of control,” and forming an unrealistic view of probable outcomes. In the more recent case of Iraq, “It’s almost inarguable that there was an excessive amount of optimism in the prewar planning,” says Renshon. He and Kahneman call the initial claims that the war would be a “cakewalk” “just the latest in a long string of bad hawkish predictions” in U.S. military history.
“The ultimate goal, obviously, is to help people make better decisions,” says Renshon. But “de-biasing” has had only intermittent success. One experiment at Harvard Business School compared two groups doing negotiation simulations. The group that was informed about overconfidence exhibited less of that bias and did much better than the group that wasn’t informed. But an experiment that involved making people more aware of the fundamental attribution error, notes Renshon, “had the effect of making the bias more pronounced.”
Should cognitive bias make us more pessimistic about the prospect of peace? “We’re not saying that war is the default option,” Renshon makes clear. “Rivalries end—even long rivalries—and not every crisis ends in a war. We don’t think that cognitive biases are the main explanation for decisions to go to war, or the main explanation for others to make peace or negotiate. Beliefs, values, material constraints, ideology, national values, image—all of these things have a very important effect in policymaking. Thankfully,” he concludes, “cognitive biases don’t decide. People decide.”
~Harbour Fraser Hodder
Jonathan Renshon e-mail address: email@example.com