
NICE GUYS FINISH FIRST

Robert Axelrod, a political scientist, became fascinated by a simple game called the Prisoner's Dilemma. Its simplicity is deceptive: shelf-loads of books have been written about it. The Prisoner's Dilemma has attracted not only game theorists and evolutionists, but military strategists as well. Many people think it holds the key to strategic defence planning and that we should study it to prevent another major world war. Our version is as follows:

          THE PRISONER'S DILEMMA


Adam and Eve are in isolation cells, suspected of a significant crime. ‘The Governor’ visits them in turn, inviting each to grass on the other. Neither knows what the other has done. There are four possible outcomes.

Adam: Eve did it, and I’ve got proof. BETRAYAL (of Eve)
Eve: I’m not saying anything. CO-OPERATION (with Adam)
Adam is released. Eve gets the maximum sentence

Adam: I’m saying nothing. CO-OPERATION (with Eve)
Eve: It was Adam who did it. BETRAYAL (of Adam)
Eve is released. Adam gets the maximum sentence

Adam: Eve did it, and I’ve got proof. BETRAYAL (of Eve)
Eve: It was Adam who did it; I’ve got proof. BETRAYAL (of Adam)
Both receive medium sentences, rewarded for giving evidence

Adam: I’m saying nothing. CO-OPERATION (with Eve)
Eve: I’m not saying anything. CO-OPERATION (with Adam)
Both receive negligible punishment, through lack of evidence

The prisoners’ respective self-interest dictates that both see betrayal as a way out; but both are therefore likely to betray and suffer. If both co-operated they would suffer less. ‘The Governor’ hears them muttering ‘You bastard!’, or perhaps, ‘If only I’d known you’d say nothing!’ – but trust in these circumstances is hard to come by.

Translated into a two-card ten-round gambling game, the dilemma is more marked.

A co-operates     B co-operates     Banker gives both a modest sum
A sells out       B co-operates     Banker gives A a large sum and B a fine
A sells out       B sells out       Banker takes a small fine from both
A co-operates     B sells out       Banker gives B a large sum and A a fine


A bright player will quickly see that it pays to play the ‘Sell Out’ card every time. If the opponent is bright too, they’ll both play ‘Sell Out’ until such time as they can swallow hard and play ‘Co-operate’ – hard to do when you know that if your opponent hasn’t played that card too, he will be in the money and you will pay the fine.
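As a rough illustration, here is a minimal Python sketch of the banker game. The text gives only qualitative payoffs (‘modest sum’, ‘large sum’, ‘fine’), so the numbers below are assumptions chosen simply to preserve that ordering; the point is that, whatever the opponent does, ‘Sell Out’ scores more in any single round.

# Illustrative payoffs for the two-card banker game described above.
# The text gives only qualitative amounts, so these numbers are assumptions:
# "modest sum" = 3, "large sum" = 5, "small fine" = -1, heavier fine = -2.
PAYOFFS = {
    ("co-operate", "co-operate"): (3, 3),    # banker gives both a modest sum
    ("sell out",   "co-operate"): (5, -2),   # A gets a large sum, B pays a fine
    ("co-operate", "sell out"):   (-2, 5),   # B gets a large sum, A pays a fine
    ("sell out",   "sell out"):   (-1, -1),  # banker fines both
}

# Whatever B does, A scores more in a single round by playing "sell out":
for b_move in ("co-operate", "sell out"):
    a_if_coop, _ = PAYOFFS[("co-operate", b_move)]
    a_if_sell, _ = PAYOFFS[("sell out", b_move)]
    print(f"If B plays {b_move!r}: A gets {a_if_coop} by co-operating, {a_if_sell} by selling out")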

Game theorists set to work with their computers to see what strategies might be devised to crack the dilemma. They enjoyed themselves co-operating in this enterprise. Since the idea was to win, the result was unexpected: the most successful strategy IF ENOUGH ROUNDS WERE PLAYED – Tit for Tat, sketched after the list below – was one in which the player (a computer)

a. was never the first to play ‘Sell Out’
b. quickly forgot that the opponent had played ‘Sell Out’ (so no continuing reprisals)
c. did not care if it won fewer points/less money than its opponent
d. co-operated with the other player to fleece the ‘bank’ rather than each other
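Here is a minimal Python sketch of that winning strategy, Tit for Tat, using the same illustrative payoffs as before. It is not Axelrod's tournament code, just an indication of how such a strategy plays out over ten rounds against an unconditional co-operator and an unconditional defector.

# Same illustrative numbers as in the earlier sketch.
PAYOFFS = {
    ("co-operate", "co-operate"): (3, 3),
    ("sell out",   "co-operate"): (5, -2),
    ("co-operate", "sell out"):   (-2, 5),
    ("sell out",   "sell out"):   (-1, -1),
}

def tit_for_tat(opponent_history):
    # (a) never the first to sell out; (b) retaliates only for the opponent's
    # most recent move, then forgets – no continuing reprisals
    if not opponent_history:
        return "co-operate"
    return opponent_history[-1]

def always_co_operate(opponent_history):
    return "co-operate"

def always_sell_out(opponent_history):
    return "sell out"

def play(strategy_a, strategy_b, rounds=10):
    """Play the two-card game for a number of rounds; return both totals."""
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_b)   # each player sees only the other's past moves
        move_b = strategy_b(history_a)
        pay_a, pay_b = PAYOFFS[(move_a, move_b)]
        score_a += pay_a
        score_b += pay_b
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b

# Two nice players fleece the bank for 30 apiece; against a persistent
# defector, Tit for Tat loses only the first round (with these numbers,
# -11 against the defector's -4) and never escalates.
print(play(tit_for_tat, always_co_operate))
print(play(tit_for_tat, always_sell_out))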

The informal truces in the First World War are a clear example of this kind of reciprocity. The soldiers co-operated in the interests of everyone’s survival, and against the interests of their commanders (or ‘war’ itself).
Biologists point to many examples of species that work together on the same principle – that co-operation is best for all, because its price is lower than the cost of competition.

The popular view is that ‘nice guys’ finish last. Whether it is in the jungle or the ‘real’ world, the aggressive go-getters always win, always come first, whilst those that co-operate – the nice guys – get left behind. Axelrod devised a series of computer tournaments to test this and invited people to submit computer programmes to play the game. Unsurprisingly, the majority were based on the assumption that selfishness, as opposed to co-operation, will win. As it happens, the opposite was true: the longer each game was played, the better the co-operative strategy paid off. The length of play (or of involvement in a ‘real life’ situation) is important.

Axelrod called this the ‘shadow of the future’ and drew a moving illustration of its importance from a remarkable phenomenon that grew up during the First World War, the so-called ‘live and let live’ system.

While many know about the brief fraternisation at Christmas between British and German troops in no-man's land, less well known is the fact that unofficial and unspoken non-aggression pacts, a 'live and let live' system, flourished all up and down the front lines until at least 1917. A senior British officer, on a visit to the trenches, was astonished to observe German soldiers walking about within rifle range behind their own line. 'Our men appeared to take no notice. I privately made up my mind to do away with that sort of thing when we took over; such things should not be allowed. These people evidently did not know there was a war on. Both sides apparently believed in the policy of "live and let live".'

The theory of games and the Prisoner's Dilemma had not been invented in those days but, with hindsight, we can see pretty clearly that, in the entrenched warfare of those times, the shadow of the future for each platoon was long. That is to say, each dug-in group of British soldiers could expect to be facing the same dug-in group of Germans for many months. The shadow of the future was quite long enough, and indeterminate enough, to foster the development of a Tit for Tat type of cooperation. Provided, that is, that the situation was equivalent to a game of Prisoner's Dilemma.

To qualify as a true Prisoner's Dilemma, the payoffs have to follow a particular rank order. Both sides must see mutual cooperation as preferable to mutual defection. Defection while the other side cooperates is even better if you can get away with it. Cooperation while the other side defects is worst of all. Mutual defection is what the general staff would like to see. They want to see their own chaps, keen as mustard, potting Jerries or Tommies whenever the opportunity arises.
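In the standard game-theory shorthand (not used in the text itself), those four payoffs are usually written T (temptation: defect while the other cooperates), R (reward for mutual cooperation), P (punishment for mutual defection) and S (the sucker's payoff: cooperate while the other defects), and the rank order just described is T > R > P > S. A small check, using the same illustrative numbers as the sketches above:

# Illustrative numbers only, matching the earlier sketches.
T, R, P, S = 5, 3, -1, -2

# The rank order the paragraph describes: temptation beats reward,
# reward beats mutual punishment, and the sucker's payoff is worst of all.
assert T > R > P > S

# For the repeated game, the standard treatment also assumes 2R > T + S,
# so that taking turns exploiting each other never beats steady mutual
# cooperation – a condition the text does not discuss, but which these
# numbers satisfy.
assert 2 * R > T + S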

Mutual cooperation was undesirable from the generals' point of view, because it wasn't helping them to win the war. But it was highly desirable from the point of view of the individual soldiers on both sides. They didn't want to be shot. Admittedly - and this takes care of the other payoff conditions needed to make the situation a true Prisoner's Dilemma - they probably agreed with the generals in preferring to win the war rather than lose it. But that is not the choice that faces an individual soldier. The outcome of the entire war is unlikely to be materially affected by what he, as an individual, does. Mutual cooperation with the particular enemy soldiers facing you across no-man's-land most definitely does affect your own fate, and is greatly preferable to mutual defection, even though you might, for patriotic or disciplinary reasons, marginally prefer to defect (DC - defecting while the other side cooperates) if you could get away with it. It seems that the situation was a true Prisoner's Dilemma. Something like Tit for Tat could be expected to grow up, and it did.

You can find out more about these issues in the following:
The Evolution of Cooperation. R. Axelrod (Basic Books)
The Selfish Gene. Richard Dawkins (chapter 12, ‘Nice guys finish first’, deals with these issues)
Winners and other Losers in Peace and War. Arnold Arnold (Paladin)
