In a previous post, I proposed a game called Commons Poker, where all players would win big if all cooperated, but cooperation was not guaranteed and selfish behavior would be rewarded. It was a useful little thought experiment, but would it stand up to rigorous computer simulation?
Luckily, the world doesn’t have to rely on my programming skills to learn the answer to that question.
My game is a more communal version of something called The Prisoner’s Dilemma. The dilemma goes like this: you and your accomplice have been arrested for a crime. The evidence against you isn’t strong, but the prosecutor wants a conviction. Here are your choices:
- If you agree to testify against your accomplice, you’ll get a six-month sentence and your accomplice five years.
- You know that the prosecutor has offered your accomplice the same deal; if your accomplice testifies against you, you’ll be the one serving five years.
- If neither of you testifies against the other, you’ll each likely get a one-year sentence.
What’s your move? Clearly, the best collective outcome is for neither of you to testify against the other. But you have no guarantee that your partner will keep quiet.
The branch of economics called Game Theory gives us tools to figure out the probabilities and outcomes in situations like this. And because it’s math and probabilities, Game Theory can be simulated with computers.
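To make that concrete, here’s a minimal sketch in Python of the dilemma above as a payoff table, measured in years of jail (lower is better). The sentences for one-sided testimony and for mutual silence come straight from the scenario; the scenario doesn’t spell out what happens if you both testify, so the three-year figure below is just a placeholder assumption to complete the table.

```python
# The dilemma above as a payoff table, in years of jail (lower is better).
# The "both testify" entry is an assumption -- the scenario doesn't specify it.
SENTENCES = {
    # (my move, their move): (my years, their years)
    ("testify", "silent"):  (0.5, 5.0),   # I take the deal, they stay quiet
    ("silent",  "testify"): (5.0, 0.5),   # they take the deal, I stay quiet
    ("silent",  "silent"):  (1.0, 1.0),   # neither of us testifies
    ("testify", "testify"): (3.0, 3.0),   # assumed: we both take the deal
}

def sentence(my_move, their_move):
    """Return (my_years, their_years) for one play of the dilemma."""
    return SENTENCES[(my_move, their_move)]

print(sentence("silent", "silent"))    # (1.0, 1.0) -- best collective outcome
print(sentence("testify", "silent"))   # (0.5, 5.0) -- best for me, worst for them
```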
For his book The Evolution of Cooperation, Robert Axelrod hosted tournaments in which computer programs played the Prisoner’s Dilemma against each other using a variety of strategies. Across repeated competitions, one strategy emerged as the consistent winner: Tit for Tat.
Tit for Tat is a simple strategy (sketched in code below):
- Start by cooperating, that is, making the move that is best for the group.
- On subsequent moves, copy the last move of your opponent.
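Part of its appeal is how little it takes to describe. Here’s a minimal sketch, assuming moves are just the strings "cooperate" and "defect" and that the strategy can see the opponent’s full history:

```python
def tit_for_tat(opponent_history):
    """Cooperate on the first move, then copy the opponent's last move."""
    if not opponent_history:
        return "cooperate"              # start by cooperating
    return opponent_history[-1]         # then mirror whatever they did last

# Example: the strategy always runs one step behind the opponent.
opponent_moves = ["cooperate", "cooperate", "defect", "cooperate"]
for i in range(len(opponent_moves)):
    print(tit_for_tat(opponent_moves[:i]))
# cooperate, cooperate, cooperate, defect
```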
If we all played this strategy in life, we would all cooperate. That might be a good thing. The moment one person stopped cooperating, though, we would revert to more selfish behavior, at least until someone cooperated again.
Axelrod found that winning strategies had four qualities:
- Nice: they were never the first to not cooperate
- Retaliatory: they met noncooperation with noncooperation
- Forgiving: they rewarded a return to cooperative behavior
- Not Greedy: they didn’t try to score more than their opponent
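To get a feel for how a tournament like Axelrod’s works, here’s a rough round-robin sketch in Python. This is not his actual code or his actual entrants, just a handful of common strategies, including two added for illustration: a “grudger” that never forgives a defection, and a “suspicious” Tit for Tat that defects first. Scoring uses the conventional tournament payoffs: 3 points each for mutual cooperation, 1 each for mutual defection, and 5 versus 0 when one player defects against a cooperator.

```python
C, D = "cooperate", "defect"

# (my move, their move) -> (my points, their points)
PAYOFF = {
    (C, C): (3, 3),
    (C, D): (0, 5),
    (D, C): (5, 0),
    (D, D): (1, 1),
}

# Each strategy sees only the opponent's history of moves.
def tit_for_tat(opp):
    return opp[-1] if opp else C          # cooperate first, then mirror

def always_cooperate(opp):
    return C

def always_defect(opp):
    return D

def grudger(opp):
    return D if D in opp else C           # never forgives a defection

def suspicious_tit_for_tat(opp):
    return opp[-1] if opp else D          # like Tit for Tat, but defects first

def match(strat_a, strat_b, rounds=200):
    """Play one repeated game and return the two total scores."""
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a, move_b = strat_a(hist_b), strat_b(hist_a)
        pts_a, pts_b = PAYOFF[(move_a, move_b)]
        score_a, score_b = score_a + pts_a, score_b + pts_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

strategies = [tit_for_tat, always_cooperate, always_defect,
              grudger, suspicious_tit_for_tat]

totals = {s.__name__: 0 for s in strategies}
for i, a in enumerate(strategies):
    for b in strategies[i + 1:]:          # each pairing plays once
        sa, sb = match(a, b)
        totals[a.__name__] += sa
        totals[b.__name__] += sb

for name, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{name:25s} {total}")
```

With this particular hand-picked field, Tit for Tat happens to finish first, but how well any strategy does depends heavily on who else shows up to play, which is part of why Axelrod’s much larger tournaments are the interesting result.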
To me, this is strong reinforcement of our social nature: mathematical evidence that we all win when we all cooperate with one another.
As you play through your day, what strategy are you using?
(The Evolution of Cooperation and other books mentioned in this blog are available in the bookstore.)