In the 1970s, two Israeli psychologists at the Hebrew University of Jerusalem, Amos Tversky and Daniel Kahneman, ran an experiment in which they posed pairs of slightly differently worded hypothetical questions to large sample groups. Their most famous pair of questions goes like this:
The United States is preparing for the outbreak of an unusual Asian disease, which is expected to kill six hundred people. Two different programs to combat the disease have been proposed. Assume that the exact scientific estimates of the consequences of the programs are as follows: If program A is adopted, two hundred people will be saved. If program B is adopted, there is a one-third probability that six hundred people will be saved and a two-thirds probability that no people will be saved. Which of the two programs would you favor?
When this question was tested with a large number of physicians, 72% chose program A and only 28% chose program B. In other words, the majority of physicians would rather save a certain number of people for sure than risk saving no one at all. However, consider the following adjustment to the question:
If program C is adopted, four hundred people will die. If program D is adopted, there is a one-third probability that nobody will die and a two-thirds probability that six hundred people will die. Which of the two programs would you favor?
When asked this version of the question, even though programs C and D are statistically identical to A and B, only 22% chose program C and 78% chose program D. The physicians completely reversed their previous decisions, rejecting a guaranteed outcome in order to take a risky gamble.
The options are the same, yet the sampled group responded very differently depending on how they were presented. When the question was framed in terms of survivors, the physicians made the rational choice and went with the safe strategy, but when the two programs were framed in terms of deaths, the physicians were suddenly willing to take risks to avoid four hundred deaths, even though that is exactly the same outcome as saving two hundred lives.
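If you want to convince yourself that the two framings really are equivalent, a quick back-of-the-envelope calculation does it. The sketch below (in Python, purely for illustration, using the numbers from the question above) computes the expected number of survivors under each program:

```python
# Quick check that all four programs describe the same expected outcome.
TOTAL = 600

# Framed as lives saved
program_a = 200                        # 200 saved for certain
program_b = (1/3) * 600 + (2/3) * 0    # expected number saved

# Framed as deaths
program_c = TOTAL - 400                          # 400 die, so 200 survive
program_d = TOTAL - ((1/3) * 0 + (2/3) * 600)    # expected survivors

print(program_a, program_b, program_c, program_d)  # 200 200.0 200 200.0
```

Every program works out to an expected two hundred survivors; only the wording changes.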
What this demonstrates is a concept called loss aversion. In short, the pain of a loss is more powerful than the pleasure of a gain. We are willing to go to great lengths to avoid losses, much farther than we'll go to get an equivalent gain.
Loss aversion is an innate flaw rooted in the emotional brain. When a loss seems imminent, we experience strong negative emotions that compel us to make irrational decisions to try to avoid the loss at almost any cost. You might see loss aversion used in marketing and advertising to exploit the emotions of consumers: for example, an offer may be described as avoiding a $10 surcharge rather than as gaining a $10 credit.
So how can we avoid being tricked into making illogical decisions by loss aversion? Everyone who experiences emotions is vulnerable to its effects – the only way to avoid loss aversion is to know about the concept. (“Knowing is half the battle!”)
I should note that the majority of the information for this post was collected from these two sources:
The Psychology of Video Games, a great blog which connects concepts of psychology and neuroscience to video games.
http://www.psychologyofgames.com/
And…
How We Decide, by Jonah Lehrer, a fantastic book about conscious and unconscious decision making.
http://www.jonahlehrer.com/books-1