ED vs. COGNITIVE BIASES, PART 3: MISPERCEIVING RISK

Posted in Ed vs. Cognitive Biases on May 12th, 2008 by Ed

Let's play a game, courtesy of Nobel Laureate Daniel Kahneman and Amos Tversky. It's more fun if you answer honestly. You must pick one of these two choices:

A: A sure gain of $250
B: A 30% chance to gain $1000 and a 70% chance to gain nothing

Now pick one of these two:

C: A sure loss of $750
D: A 70% chance to lose $1000 and a 30% chance to lose nothing

Kahneman and Tversky ran the experiment and found that 84% of respondents chose A in the first scenario, while 87% chose D in the second. Classic economic theory (expected utility, which in a simple case like this boils down to expected value) says you should choose B and D. In the first problem, the expected value of A is $250 and of B is $300 (30% of $1000). In the second, C = -$750 and D = -$700. So why do people get it "right" in the second problem but not the first?
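If you want to check the arithmetic yourself, here is a minimal sketch in Python (my illustration, not anything from the original study) that just multiplies probabilities by payoffs:

```python
# Quick check of the expected values behind the four options above.
# Each option is a list of (probability, payoff-in-dollars) pairs.
options = {
    "A": [(1.0, 250)],              # sure gain of $250
    "B": [(0.3, 1000), (0.7, 0)],   # 30% chance to gain $1000
    "C": [(1.0, -750)],             # sure loss of $750
    "D": [(0.7, -1000), (0.3, 0)],  # 70% chance to lose $1000
}

for name, outcomes in options.items():
    ev = sum(p * x for p, x in outcomes)
    print(f"{name}: expected value = {ev:+.0f}")

# Prints +250, +300, -750, -700: B beats A and D beats C,
# which is exactly what expected-value reasoning recommends.
```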

Well, Kahneman and Tversky developed Prospect Theory as an alternative to Expected Utility Theory, winning fame and fortune in the process. I won't pretend to do it justice here; what it essentially shows (note that both men were psychologists, not economists) is that people are risk averse when confronted with gains and risk seeking when gambling with losses. We prefer the sure thing, even though it's smaller, when we stand to gain ("A bird in the hand is worth two in the bush") but will gamble on a larger loss for the chance to lose nothing. This is why in the stock market, for example, people sell very quickly when their investments are up (~20% profit) but hold onto bad investments for years, riding them to 80-90% losses in some cases, waiting for things to turn around.

And now the point.

Let's take this out of the realm of economic decision-making and into the realm of social issues. Kahneman and Tversky did. In a second experiment, they asked participants to imagine that a new virus breaks out in Asia and the CDC must prepare for an outbreak in the U.S. that is predicted (assume for a moment that it can be predicted accurately) to kill 600 people. There are two potential response plans, and opinion polling is conducted to see how the public will react. The first group of test subjects saw these two choices:

A: 200 people will be saved
B: A 1/3 chance that all 600 people will be saved but a 2/3 chance that no one will be saved.

A second group saw different options:

C: 400 people will die
D: A 1/3 chance that no one will die and a 2/3 chance that 600 people will die

72% chose A and 78% chose D. But literally nothing has changed. These are the exact same options, worded differently: 200 live and 400 die under either A or C, while B and D both carry a 2/3 chance that everyone dies. On social/moral/political questions like this, framing is stupendously important. Although the odds are identical, "200 people will be saved" triggers the bias in favor of the sure thing, whereas "400 people will die" activates risk seeking: we gamble rather than accept a certain loss.
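And if the equivalence isn't obvious at a glance, here's a similar quick sketch (again mine, purely illustrative) confirming that both framings describe the same gamble:

```python
# Expected deaths (out of 600) under each framing, to show that the
# "saved" wording and the "die" wording describe identical gambles.
TOTAL = 600

options = {
    "A (200 saved for sure)":        [(1.0, TOTAL - 200)],
    "B (1/3 all saved, 2/3 none)":   [(1 / 3, 0), (2 / 3, TOTAL)],
    "C (400 die for sure)":          [(1.0, 400)],
    "D (1/3 none die, 2/3 all die)": [(1 / 3, 0), (2 / 3, TOTAL)],
}

for name, outcomes in options.items():
    expected_deaths = sum(p * d for p, d in outcomes)
    print(f"{name}: expected deaths = {expected_deaths:.0f}")

# A and C both come out to 400 expected deaths, and B and D are the
# same 2/3-everyone-dies gamble written twice. Only the wording differs.
```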

So, let's apply this to the deeply held conviction of McCain, Lieberman, and their followers that we should double down in Iraq. What these people are doing is gravitating toward choice "D" in the last problem: a small chance that things will work out perfectly and a large chance that things will go completely to shit and get far worse. This is preferred to "C," cutting our current (and, more importantly, certain) losses. It doesn't matter that there's only a 5% chance that Iraq will turn into an idyllic paradise of stability; our cognitive wiring suggests that even a glimmer of hope is enough. The average person will take the Hail Mary pass, risking a huge loss for a minuscule chance at total victory, over a smaller but certain loss any day.