Science & Technology

A Natural Sense of Justice

Three simple games reveal when and why humans cooperate.

Look around you. If you're reading this, you are embedded in a world-spanning network of cooperation. It's embodied in the computer and the screen you're using. The machine probably comes from somewhere in Asia, and its parts were designed, manufactured, and assembled in factories employing thousands of people scattered across the globe. Those parts are made from minerals and plastics that likely come from all six inhabited continents. The electricity powering your desktop or laptop streams in through a network of wires connecting you to a power plant that was, again, designed, built, and fueled by thousands of people you'll never meet.

We take these vast interlocking cooperative networks more or less for granted. But how could all of this cooperation have evolved? Natural selection is all about competition. You know: "Nature, red in tooth and claw." It turns out that the willingness to beat up cheaters and fraudsters is the key to fostering cooperation.

As a fascinating new article in Science notes: "The scale of cooperation in both contemporary and past human societies remains a puzzle for evolutionary biology and social sciences, because neither kin selection nor reciprocity appears to readily explain altruism in very large groups of unrelated individuals."

Evolutionary biologists have devised explanations for the levels of altruism and reciprocity found among some creatures. For example, natural selection can favor genes that encourage relatives to help one another. If I sacrifice a bit of food for relatives or defend them from attack, I am helping some of my own genes survive. Genes simply want to replicate, and in this case genes that encourage me to sacrifice for a relative help them to do that. This kind of kinship altruism is well known.

Some animals also engage in reciprocity—literally, you scratch my back and I'll scratch yours. For example, some vampire bats will share a blood meal with other bats in their colonies. However, it is very hard to find examples of animals engaging in much reciprocity with non-kin.

Reciprocal altruism is not selflessness—one organism provides a benefit to another fully expecting eventual reciprocation. However, evolutionary biologists struggle to square the existence of altruistic genes with the difficulty of spreading them. A population of cooperators (chumps) who share food would be out-reproduced by non-cooperators (cheaters) who eat and run. One possible explanation for the large size of human brains is that we need them in order to keep tabs on the cooperators and cheaters amongst us. That works very well for relatively small communities where everyone's reputation precedes them. Yet we live in a world filled with cooperating strangers. How can that be?

Game theory researchers have long noted that in order to sustain networks of cooperation, there also has to be some way for at least some cooperators to punish defectors. The new study in Science (not available online) by Emory University anthropologist Joseph Henrich and his colleagues looked at 15 small-scale societies in Africa, South America, Asia, and Oceania to see whether this game-theoretic insight has an empirical basis. The researchers enrolled members of these societies in behavioral experiments designed to test their willingness to punish. They adapted three economic games, usually administered to undergraduates, in which participants decide how to divvy up a sum of money. To make it interesting, the researchers pegged the stake of each game to one day's wage in the local economy.
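
To see why punishment matters, consider a minimal replicator-dynamics sketch in Python. This is not from the study; the payoff numbers (B, C, K, F) are illustrative assumptions, chosen so that the fine on a cheater outweighs the temptation to cheat. Without punishment, defectors steadily displace cooperators; with it, cooperation takes over.

```python
# Minimal sketch, not from the study: replicator dynamics with and without
# costly punishment. All payoff values are illustrative assumptions.

B, C = 3.0, 1.0   # benefit a cooperator confers on a partner; cost of conferring it
K, F = 1.0, 4.0   # punisher's cost to punish; fine levied on a defector

def payoffs(x, punish):
    """Average payoffs for cooperators and defectors when a fraction x of
    randomly matched partners cooperates. If punish is True, cooperators
    also fine any defector they meet."""
    coop = x * (B - C) + (1 - x) * (-C - (K if punish else 0.0))
    defect = x * (B - (F if punish else 0.0))
    return coop, defect

def run(x=0.9, punish=False, generations=200):
    for _ in range(generations):
        pc, pd = payoffs(x, punish)
        x += 0.1 * x * (1 - x) * (pc - pd)   # discrete replicator step
        x = min(max(x, 0.0), 1.0)
    return x

print("cooperators, no punishment:  ", round(run(punish=False), 3))  # ~0.0: cheaters win
print("cooperators, with punishment:", round(run(punish=True), 3))   # ~1.0: cooperation holds
```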

The first experiment is the ultimatum game, in which two anonymous players are allotted a sum of money in a one-time interaction. The first player offers the second player a portion of the sum, in 10 percent increments (including nothing at all). The second player must decide in advance the minimum offer he will accept. If the first player offers less than that, both go home with nothing. The "rational" thing for the second player to do is accept any positive offer.
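
The payoff rule is simple enough to sketch in a few lines of Python. This is just an illustration of the rule described above, using the strategy method (the responder commits to a threshold in advance); the stake and the choices are made-up numbers, not the study's data.

```python
# Sketch of the ultimatum game's payoff rule; amounts are illustrative.

def ultimatum(stake, offer_pct, min_acceptable_pct):
    """Return (proposer_take, responder_take) for one anonymous round.
    Offers come in 10-percent increments; the responder's minimum
    acceptable offer is committed to in advance."""
    if offer_pct < min_acceptable_pct:
        return 0, 0                      # rejected: both go home empty-handed
    offer = stake * offer_pct // 100
    return stake - offer, offer

print(ultimatum(100, 50, 30))   # accepted -> (50, 50)
print(ultimatum(100, 10, 30))   # lowball rejected -> (0, 0)
```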

However, it doesn't work that way. In all societies, some players refused offers they regarded as too low and went home with nothing, happy in the knowledge that the cheapskate on the other side went home empty-handed too. Overall, 56 percent of second players rejected 10 percent offers, though there was considerable variation among societies: in five of them only 15 percent of players rejected such offers, while in four of them 60 percent did. Still, the researchers found a "universal pattern, with an increasing proportion of individuals from every society choosing to punish as offers approach zero."

A second economic experiment was a third-party punishment game, in which a sum of money is allotted to players one and two, and a third player gets an endowment of half that amount. Player one must decide what portion of the stake to give to player two, who makes no decisions. Meanwhile, player three must decide in advance whether to punish player one should he turn out to be stingy: player three can pay 20 percent of his endowment to strip player one of 30 percent of the stake. For example, if player one gives only $10 of a $100 stake to player two, keeping $90 for himself, and player three has already decided to punish such mingy behavior, then player one takes home $60, player two gets $10, and player three walks away with $40. Had player three chosen not to punish, the take-home amounts would have been $90, $10, and $50, respectively. It's no skin off player three's nose if player one stiffs player two, yet the researchers found that, overall, two-thirds of player threes would pay 20 percent of their endowment to punish player ones who allocated nothing to player two. This varied by society, with a low of 28 percent in one and a high of 90 percent in two others.
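
The arithmetic is easy to get lost in, so here is a minimal Python sketch of the payoffs that reproduces the worked $100 example above (the function name and the dollar figures are just for illustration):

```python
# Sketch of the third-party punishment game's payoffs, matching the
# worked example: punishing costs player three 20 percent of his
# endowment and strips player one of 30 percent of the stake.

def third_party(stake, gift, punish):
    """Return take-home amounts (player1, player2, player3)."""
    p1 = stake - gift
    p3 = stake // 2             # player three's endowment: half the stake
    if punish:
        p3 -= p3 // 5           # cost of punishing: 20% of the endowment
        p1 -= 3 * stake // 10   # fine on player one: 30% of the stake
    return p1, gift, p3

print(third_party(100, 10, punish=True))    # -> (60, 10, 40)
print(third_party(100, 10, punish=False))   # -> (90, 10, 50)
```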

Finally, the researchers conducted a third game, the dictator game, which is exactly like the one-shot anonymous ultimatum game except that player two can't reject what is offered. Player one walks away with whatever he wants to keep. Again, the "rational" thing for player one to do is keep all the money; he doesn't know the guy in the next room, and this deal will never present itself again. The researchers argue that the dictator game measures "a kind of behavioral altruism that is not directly linked to kinship, reciprocity, reputation, or immediate threat of punishment." In other words, it gauges just how altruistic each society is.
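
In code, the dictator game is the ultimatum sketch minus the veto; again, the numbers are illustrative:

```python
# Sketch of the dictator game's payoff rule: no rejection is possible.

def dictator(stake, gift):
    return stake - gift, gift   # player two must take whatever is offered

print(dictator(100, 10))   # -> (90, 10)
```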

The results are intriguing. It turns out that the societies in which player ones in the dictator game gave more to player twos were also the societies in which people were more willing to punish less generous players in the other two games. In other words, the societies that punished most vigorously also had the strongest altruistic impulses. The moral of the story: if you want to live in a world of caring, generous, cooperative people, make sure that you thoroughly thrash all the greedy, chiseling scoundrels you come across. It may cost you, but the world will be a better place.