Paradoxes and Unplayable Games

Jonathan R. Partington

(Based on a talk given to the Quintics society on October 22nd 1987)

It is often observed that Mathematicians spend a lot of time playing games, and indeed many parts of Mathematics have their foundations in the playing of games. My object in this talk is to give some examples of games which either can't be played (though that needn't stop you), or, if they can be played, will at least cause a few surprises.

One old favourite along these lines is the St Petersburg Paradox. Player A tosses a coin repeatedly until he gets a Head. If it takes him until the n'th toss he pays player B the sum of 2^n pounds (or roubles?). A quick example: I toss, get Tails; I toss again, get Heads. So I pay 2^2 = 4 pounds. Right, let's calculate whether we've been lucky or unlucky. What is the expected win? Well, a quick calculation puts it at (1/2).2 + (1/2^2).4 + (1/2^3).8 + ... = 1 + 1 + 1 + ..., which is infinite. How can this be, since you never do win an infinite amount?

This problem puzzled several people in the 18th and 19th centuries. D'Alembert asked Lagrange; Poisson said there was a contradiction because under some circumstances A will be unable to pay his debts; finally Bertrand said that it depended on how many games you play. If you play 2^k games, you would expect half of them to finish first time, a quarter to last two moves, and so on, until you would expect one to last k moves. As a result your average losses per game are about k; that is, if you play N games the expected cost is of the order of log N. This is borne out by experimental evidence.
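Bertrand's claim is easy to test by simulation. A minimal Python sketch (the function names and the sample sizes are mine, not part of the talk):

```python
import math
import random

def one_game():
    """Toss until the first Head; the payout is 2^n after n tosses."""
    n = 1
    while random.random() < 0.5:  # Tails, with probability 1/2
        n += 1
    return 2 ** n

def average_payout(games, seed=0):
    random.seed(seed)
    return sum(one_game() for _ in range(games)) / games

# The average over N games grows roughly like log2(N),
# even though the theoretical expectation is infinite.
for N in (100, 10_000, 1_000_000):
    print(N, round(average_payout(N), 1), round(math.log2(N), 1))
```

The averages are noisy (the payout distribution has a heavy tail), but they creep upwards with N rather than settling down, exactly as Bertrand's argument predicts.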

This leads us to a slogan:

RARE EVENTS DON'T HAPPEN

(or at least they don't happen often).

The same sort of considerations apply in the game called Football Pools, for which no winning strategy is known. There is no point betting on `likely' draws -- i.e. the ones that the newspaper pundits recommend, or the ones that seem probable to you. If you do win, you won't win very much, because other people will have had the same idea. Therefore you should bet on a less likely set of draws and really clean up! Similarly it makes no difference to you if Littlewoods put up their first prize from £1,000,000 to £2,000,000. After all you're not going to win it, are you?

Talking of Littlewoods, the mathematician Prof. J.E. Littlewood (of Trinity College) has a nice game in his ``Miscellany'' which, as he says, does not stand up to serious analysis but may amuse a friendly audience. You have a large supply of cards with numbers on both sides. One is numbered 1 on one side, 2 on the other; the next is numbered 2 on one side, 3 on the other; the next is numbered 3--4; and so on. A card is now drawn at random so that each player sees a different side of it. The one who sees the higher of the two numbers wins. However the rules allow you to veto the game after seeing your number, if you wish (i.e. if you think you are going to lose), in which case it isn't played.

We can show by induction that the game is never played. For if you see a 1, then the other side must say 2, and you veto the game. If you see a 2, then either your opponent sees a 1, in which case he vetoes the game, or else he doesn't veto, in which case he must have seen a 3 and you'd better veto. If you see a 3, then you know your opponent will veto if he sees a 2, so you'd better veto in case he has a 4. And so on (?)

In case you thought this was just a problem to do with having infinitely many cards, consider what happens if the cards only go up to 99/100. The argument works until we get to vetoing on 99. But if you have 100 you would win anyway, so the game still never gets played.
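The finite version lends itself to a mechanical check. A sketch of the backward induction in Python (the `veto` table is my own bookkeeping), assuming both players are rational and know it:

```python
# Cards are (1,2), (2,3), ..., (99,100); veto[v] records whether a
# rational player who sees the number v should veto the game.
TOP = 100
veto = {1: True}              # the other side must be 2: a certain loss
for v in range(2, TOP + 1):
    if v == TOP:
        veto[v] = False       # seeing the top number is a certain win
    else:
        # The opponent sees either v-1 or v+1.  If a rational opponent
        # holding v-1 would veto, then whenever the game actually gets
        # played the opponent must hold v+1 -- so veto.
        veto[v] = veto[v - 1]

print(all(veto[v] for v in range(1, TOP)))  # every number below 100 is vetoed
print(veto[TOP])                            # 100 would happily play, but never gets the chance
```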

A game with a similar analysis is ``Lucky Pierre''. There are now three players and each chooses a positive integer. The middle one wins (or if there is a tie then we agree a draw). Again we argue: there's no point choosing 1, as that can't win; there's no point choosing 2 as that can't win (for nobody will choose 1!); there's no point choosing 3, ... Of course if players are allowed to co-operate one gets a different game, and I'll be discussing such things later.
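The elimination argument can be sketched mechanically if we (artificially) cap the choices at some K; in the real game the integers are unbounded, so the pruning never ends. A hypothetical helper:

```python
def eliminate(K):
    """Iteratively discard choices that cannot win, from {1, ..., K}."""
    live = set(range(1, K + 1))
    rounds = 0
    while len(live) > 1:
        # The smallest surviving number can never be the strict middle
        # of three choices, so no one should pick it.
        live.remove(min(live))
        rounds += 1
    return live, rounds

print(eliminate(10))  # the lone survivor only yields a three-way tie, i.e. a draw
```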

Back to two-player games. Players A and B are going to think of a real number, and they're going to alternate in choosing its decimal digits. So, if the number is 0.a_1 b_1 a_2 b_2 a_3 ..., player A chooses a_1, then B chooses b_1, then A chooses a_2, and so on. Player A wins if the final number lies in some pre-agreed set A, and player B wins if it lies in a set B, the complement of A. (We assume some arrangement by which all the choices are made in a finite time.) For the set A = Q, the set of rational numbers, the analysis is easy (B has only to avoid a countable set of points, so he does them one at a time). For more complicated sets the answer as to who (if anyone) has a winning strategy depends on what axioms of Set theory you assume.
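B's strategy against the rationals can be made concrete: enumerate them as q_1, q_2, ..., and let B's k-th digit differ from the corresponding digit of q_k. A sketch, with a short illustrative list standing in for a full enumeration (picking digits in 1..8 sidesteps the double-expansion subtlety, e.g. 0.4999... = 0.5000...):

```python
from fractions import Fraction

def digit(q, pos):
    """The pos-th decimal digit of a rational q in [0, 1)."""
    return (q.numerator * 10 ** pos // q.denominator) % 10

# Any enumeration of the rationals works; this short list is illustrative.
rationals = [Fraction(1, 3), Fraction(1, 2), Fraction(2, 7), Fraction(5, 9)]

digits = []
for k, q in enumerate(rationals, start=1):
    digits.append(7)                    # A's move: anything at all
    pos = 2 * k                         # B controls the even decimal places
    d = digit(q, pos)
    digits.append(1 if d != 1 else 2)   # a digit in 1..8 differing from q's

# The number being built already differs from each q_k at position 2k.
for k, q in enumerate(rationals, start=1):
    assert digits[2 * k - 1] != digit(q, 2 * k)
print("B dodges every listed rational:", digits)
```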

Player A then invites player B to dinner. (This is a game known as Come to Dinner). B makes the excuse ``It is late''. A replies, ``Oh, we've plenty of food in the house''. B replies ``But, I'm talking to the Quintics tomorrow.'' To which A retorts, ``Oh you can prepare the talk tomorrow''... and so forth. A has the object of avoiding giving dinner, and B has the object of getting dinner, but there is a concept of honour involved which encourages the game to go on for several rounds. A wants to say ``Some other time, then,'' but as late as possible, and B wants to say, ``All right, I'll come,'' as late as possible.

With the politeness removed, this game reduces to another, first known amongst mathematicians as Finchley Central, and amongst devotees of Radio 4 Comedy as Mornington Crescent. The strategies are isomorphic so I'll just describe the first. Players take turns to name London Underground stations: the first one to say ``Finchley Central'' wins. Again, an opening move of ``Finchley Central'' is too much of a cheat, and you might wish to start with, say, Liverpool Street, which, assuming that your opponent isn't rude enough to reply with Finchley Central, leaves you with a mate on your second move (though you would probably prefer to stall by playing, say, Bank, in the hopes of a more spectacular win later).

As someone in the journal Manifold remarked, this game is not dissimilar to the game World War III. The first person to drop the Bomb `wins', but it might be better to wait a little before doing this...

There are several dubious games that require more than two players. One such is the game of Auctioning a Pound Note (or modern equivalent), which is suitable for when the players are less than alert, e.g. at a party. Players bid for this note, as in a usual auction, and the one bidding highest gets it. However the one who bid the second highest sum also has to pay the amount he bid. This game has several phases: when the bidding reaches 50p the players begin to realise that you, the auctioneer, are going to make a profit. When it reaches £1 the winning player is also going to make a loss, and so on.
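The phases are just arithmetic. An illustrative helper, with bids in pence (not part of the original talk):

```python
def auction_outcome(bids):
    """Winner pays his bid and gets the 100p note; the runner-up pays and gets nothing."""
    top, second = sorted(bids, reverse=True)[:2]
    return {
        "auctioneer profit": top + second - 100,
        "winner net": 100 - top,
        "runner-up net": -second,
    }

print(auction_outcome([55, 60]))    # both bids past 50p: the auctioneer is already ahead
print(auction_outcome([105, 110]))  # past 100p: even the winner makes a loss
```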

One very famous game which leads to a paradox is the Prisoner's Dilemma (I'm sure it'll be new to someone!) This takes place in a rough and lawless place, either the sheriff's office in the Wild West, or else the University Court of Discipline at Cambridge. There are two prisoners, who are partners. The gaoler goes to one and says ``Well, we don't have the evidence to convict you, but you'd better confess anyway. If you don't and your pal does, then you'll get 5 years. Even if he doesn't we'll probably give you both a 2 year sentence. On the other hand if you confess and he doesn't, then we'll let you off and throw the book at him. Of course if you both confess we'll still give you 4 years, but it's still better than being made to look a mug by your pal!''

In other words, the payoffs are as follows (yours first):

                     His action
               Cooperates  Defects

You cooperate     -2/-2      -5/0

You defect         0/-5      -4/-4

So whatever he does, it's best for you to defect, and you do so. Your partner argues similarly and so you both get sent down for 4 years. Now if instead you had both replied ``I ain't sayin' nuffin'', then you'd have got off with 2 years each. What was wrong with the logic?
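The dominance argument can be checked line by line against the table. A minimal sketch:

```python
# (your move, his move) -> (your payoff, his payoff), in negated years,
# exactly as in the table above (C = cooperate, D = defect).
payoff = {
    ("C", "C"): (-2, -2),
    ("C", "D"): (-5, 0),
    ("D", "C"): (0, -5),
    ("D", "D"): (-4, -4),
}

# Whatever he does, defecting is strictly better for you...
for his in ("C", "D"):
    assert payoff[("D", his)][0] > payoff[("C", his)][0]

# ...and yet mutual defection is worse for both than mutual cooperation.
assert all(d < c for d, c in zip(payoff[("D", "D")], payoff[("C", "C")]))
print("Defection dominates, but (D, D) is worse for everyone than (C, C).")
```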

This problem is one about which much, possibly too much, has been written. There is a difference here between what happens when you play a one-off game and when you iterate the game.

For instance, see a book by Axelrod, ``The Evolution of Cooperation'', which describes what happened when a tournament was launched for computer programs, each playing each of the others several times. Clearly if you want to do well in the long term, you ought to be co-operating (with the other prisoner, that is) as much as possible, though you shouldn't get the reputation of being a soft touch. The programs included several with complicated strategies that tried to analyse the opposing program and out-think it; there were also less subtle programs such as ``Always Defect'' and ``Random''.

The winning program was in fact one called ``Tit for tat''. This one co-operates until its opponent defects, when it then retaliates by defecting on the next move only. Against `nice' programs -- ones which don't defect until they have already been defected against -- it scores maximally, and doesn't allow less nice programs to rip it off.
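A minimal iterated version, pitting Tit for Tat against itself and against Always Defect (the payoffs are the ones from the table above; the round count is an arbitrary choice of mine):

```python
PAYOFF = {("C", "C"): (-2, -2), ("C", "D"): (-5, 0),
          ("D", "C"): (0, -5), ("D", "D"): (-4, -4)}

def tit_for_tat(opponent_history):
    """Cooperate first; thereafter copy the opponent's last move."""
    return "C" if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return "D"

def play(p1, p2, rounds=10):
    h1, h2, s1, s2 = [], [], 0, 0
    for _ in range(rounds):
        m1, m2 = p1(h2), p2(h1)    # each sees only the other's past moves
        a, b = PAYOFF[(m1, m2)]
        s1, s2 = s1 + a, s2 + b
        h1.append(m1)
        h2.append(m2)
    return s1, s2

print(play(tit_for_tat, tit_for_tat))    # mutual cooperation throughout
print(play(tit_for_tat, always_defect))  # one betrayal, then mutual defection
```

Against another nice program Tit for Tat cooperates for ever; against Always Defect it loses only the first round and matches it thereafter.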

Another game of co-operation and defection is the Luring Lottery, first launched by Hofstadter in Scientific American. Here Hofstadter offered $1,000,000 to his readers, at least nominally. They were invited to write in claiming shares in the kitty. Each would then receive n/N of a kitty of $1,000,000/N, where n was the number of shares he claimed and N the total number of shares claimed.

In an ideal world, just one or two people would write in, claiming one share, and thus become immensely rich; in fact Hofstadter received some enormous bids (whole postcards full of factorial signs), so nobody got anything.
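The payoff rule punishes greed quadratically: claiming n of N total shares pays n x $1,000,000 / N^2. A sketch:

```python
from fractions import Fraction

def payout(n, N):
    """n/N of a kitty of $1,000,000/N, i.e. n * 1_000_000 / N**2 dollars."""
    return Fraction(n * 1_000_000, N * N)

print(payout(1, 1))                  # one modest entrant takes the lot
print(payout(1, 2))                  # two entrants with a share each: $250,000 apiece
print(float(payout(10**6, 10**9)))   # enormous bids shrink everyone's winnings to micro-dollars
```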

I ran a similar lottery in Cambridge, changing the rules slightly, and offering rather less money. The results are recorded elsewhere. (I didn't have to pay out much, either.)

A few more curious games to conclude with, that have no connexion with the above.

First, The Lion and the Christian. The lion, L, and the Christian, C, are in a circular arena (unit radius), and L is chasing C with the object of eating him. Both move at unit speed, and are conveniently regarded as points. If L is at the centre, and C stays on the circumference, then it is not hard to show that L catches C in a time of pi/2, travelling on a pursuit curve.

However C can do better (as Besicovitch showed). He moves towards L to start with, then makes a sequence of zig-zag steps, each at right angles to the direction from him to the lion at the start of that step: if the step lengths are chosen to be d_1, d_2, d_3, ..., where the sum of the d_i is infinite (so he is travelling for infinite time) and the sum of the d_i^2 is small and finite (so that he doesn't end up on the circumference again), C can evade L indefinitely.
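The two conditions on the step lengths are easy to exhibit: d_i = c/i will do, since the harmonic series diverges while the sum of its squares converges (to c^2 pi^2/6). A numerical sketch, where the constant c = 0.1 is an arbitrary illustration:

```python
def partial_sums(c, terms):
    """Partial sums of d_i = c/i and of d_i^2, for i = 1..terms."""
    s = sum(c / i for i in range(1, terms + 1))           # grows like c * log(terms)
    s2 = sum((c / i) ** 2 for i in range(1, terms + 1))   # bounded by c**2 * pi**2 / 6
    return s, s2

for terms in (10**2, 10**4, 10**6):
    s, s2 = partial_sums(0.1, terms)
    print(terms, round(s, 2), round(s2, 5))
```

The first column of sums keeps growing (infinite travelling time); the second settles down well below the distance to the boundary, so the zig-zag path never reaches the circumference.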

Finally, the game of Undergraduate and Porter (not based on personal research). Here the Undergraduate U, who is in the middle of a circular lawn in the middle of a court (cf. New Court, Trinity), is trying to avoid a collection of point porters P_1, ..., P_n. The porters are not allowed to walk on the grass, and move round the edge of the lawn. It was shown by Croft that U can in fact avoid a countable sequence of porters by taking a sequence of small zigzags. However if the porters were sensible enough to patrol the sides of a square containing the circle, then 4 of them would be able to catch U (one per side).

This knowledge will, I trust, be found useful: it is surely a branch of Applied Mathematics. However I accept no liability for any losses you sustain in attempting to play this, or any of the games.

Last updated on December 7th 2002 by Jonathan Partington

