January 6, 2009
The Ultimatum Game Is A Trap
Evolutionary psychology, at a newsstand near you.
The Ultimatum Game:
One round only, and anonymously. Player 1 is given a sum of money to divide between himself and the unknown Player 2. Player 2 can either accept or reject the deal; no negotiation, no second chance. If Player 2 rejects the deal, no one gets anything.
What's the right division?
Answer a:
Homo economicus, the self-preserving man, would attempt to maximize his gain. For Player 1 that means offering Player 2 the least possible; for Player 2, it means accepting anything greater than zero, because anything is better than zero.
If we were all playing for monetary gain, then Player 1 would offer 99% for himself, and 1% for the other guy, because 1% is better than nothing to the other guy.
But this deal is usually rejected. In fact, anything less than about 30% is usually rejected. So monetary gain isn't the only variable here-- people do not always choose for their best economic advantage.
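To make the two predictions concrete, here's a minimal sketch (my own illustration, not part of the original argument), assuming a $100 pot and treating the reported rejection pattern as a hard ~30% cutoff:

```python
# Illustrative sketch only (not from the post): the homo economicus prediction
# vs. the empirically reported pattern of rejecting offers below roughly 30%.
# The pot size and the 30% threshold are assumptions for the example.

POT = 100  # dollars Player 1 gets to divide

def rational_responder(offer):
    """Homo economicus: anything greater than zero beats nothing."""
    return offer > 0

def typical_responder(offer, threshold=0.30):
    """Observed behavior: offers below ~30% of the pot usually get rejected."""
    return offer >= threshold * POT

for offer in (1, 10, 25, 30, 50):
    print(f"${offer}: rational accepts={rational_responder(offer)}, "
          f"typical subject accepts={typical_responder(offer)}")
```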
Answer b:
The Economist, an excellent magazine which offers excellent analysis of complex political and economic questions, yet still manages to be on the wrong side of history every single time, explains the now accepted "evolutionary psychology" answer:
(from Darwinism: Why we are, as we are) What is curious about this game is that, in order to punish the first player for his selfishness, the second player has deliberately made himself worse off by not accepting the offer. Many evolutionary biologists feel that the sense of justice this illustrates, and the willingness of one player to punish the other, even at a cost to himself, are among the things that have allowed humans to become such a successful, collaborative species. In the small social world in which humans evolved, people dealt with the same neighbours over and over again. Punishing a cheat has desirable long-term consequences for the person doing the punishing, as well as for the wider group. In future, the cheat will either not deal with him or will do so more honestly. Evolution will favour the development of emotions that make such reactions automatic.
It takes less than a moment's thought to realize this is specious, not to mention wrong. Why is the sacrifice an example of an evolved sense of justice or fairness, and not an example of unevolved envy? Like a child who smashes the toy his brother got for Xmas?
To illustrate this, let's make the pot $10 billion. He keeps $9.99B for himself, $10M for you. Now what? Obviously, you're taking the $10M, fairness, justice and Darwin be damned.
In fact, Player 2 will likely accept the $10M no matter what Player 1 gets to keep, even as Player 1's share approaches infinity.
Evidently, what matters isn't the relative inequality of the deal, but rather how much money Player 2 gets. If he's paid enough, he doesn't care how unfair the deal actually is.
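Stated as a rule (again my own sketch, with an arbitrary $1M cutoff standing in for "paid enough"), this reading says Player 2 looks only at his own absolute payoff and ignores the split entirely:

```python
# Sketch of the "absolute amount" reading above (illustrative assumption:
# a $1M cutoff stands in for "paid enough"). The split itself is ignored.

def absolute_responder(my_share, cutoff=1_000_000):
    """Accept whenever the absolute payoff clears the cutoff, however unfair the split."""
    return my_share >= cutoff

# $10M out of a $10B pot is a 0.1% share, but this responder still takes it.
print(absolute_responder(10_000_000))  # True
print(absolute_responder(1))           # False
```

Section II argues that even this rule breaks down once you account for who is holding the pot.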
II.
The question then is: is my explanation right?
Nope.
My counterexample is a trick, and I use it to show the complete and total impossibility of interpreting behavior from hypotheticals. Hypotheticals measure identity-- who you think you are-- not who you actually are, which is your behavior.
I'm going to show you now how Player 2 would reject the $10M, consistently, as consistently as if he were offered $1 from a $100 pot.
If this experiment happened right now, in real life, the second person would more than likely refuse the $10M, because in real life there is a third person in the Game that we are not considering: the experimenter with the original pot of $10B. If such a person has $10B lying around to do this trivial experiment, let alone the money he has to repeat the experiment on other people, then $10M isn't worth anything to anybody.
Don't frown; if this were simply a hypothetical question, then none of the dollar values would have any meaning at all, especially at large numbers (what's the hypothetical difference between $10B and $100M?) You're simply asking people, "what are your general beliefs about fairness?" We all have a belief in our levels of bravery, honesty, greediness, which we will use to answer the questions. At some big dollar value, we'll believe that it compensates us for our sense of injustice. We're honest enough with ourselves to admit that we'd take the hypothetical $10M-- but that $10M is being compared to your ordinary economic world. In real life, you would still refuse the deal, because you'd be living in a world of hyperinflation.
III.
What does this all mean?
It means the Ultimatum Game is not a question of behavioral economics, it is a magic trick. Magic tricks play differently to different audiences, and you cannot generalize about how humans respond to this magic trick based on how it plays in Vegas. Worse, you cannot generalize to humans based on how a group of people say they would hypothetically respond.
The Ultimatum Game yields different results in different cultures: a Mongolian group and a group of ethnic Russians (Tartars and Yakuts) both reliably offered 50/50 splits, and they even rejected offers that were in their favor (e.g., 30/70). Evolution wouldn't account for this. Are Mongolians a more just people? Do the Tartars have less envy? Or do they suspect that any third person who has the power to set up such a Game should not be completely trusted?
Prior to being seated, subjects were handed a consent form and asked to read it. Our subjects were loathe to sign anything. They were guaranteed anonymity and they did not want to leave behind any signature that they felt could be turned over to authorities.
Meanwhile, the Machiguenga people of the Peruvian Amazon, when asked by the experimenters to play "a fun game played for money," were "eager to play." The average offer was a 75/25 split, and almost no one rejected any offer, no matter how low.
The real question in the Game is whether it is worth it to you to play at all. When you reject the offer, you don't just get nothing; you are not back where you started, because there is a hidden cost in the act of playing the Game. You'll never know what that cost is, and it will be different for everyone and at different times. Playing the Game hypothetically does not in any way reflect real Game play.
War is an Ultimatum Game, and winning may be losing and losing may be winning, and individual soldiers are all playing their own version, and anyway, imagining how brave you'll be in the thick of battle does not reassure me.
There's almost an Uncertainty Principle to this Game: observing it changes the outcome. Or, more accurately, you can't know both the "real results" and the "real costs" at the same time.
Let us dispense with the belief that this Game has anything to do with evolutionary psychology, or much else.
(A follow-up here.)