# Monty Hall problem: The probability puzzle that makes your head melt

• 12 September 2013
• From the section Magazine

A reference in a recent Magazine article to the Monty Hall problem - where a contestant has to pick one of three boxes - left readers scratching their heads. Why does this probability scenario hurt everyone's brain so much, asks maths lecturer Dr John Moriarty.

Imagine Deal or No Deal with only three sealed red boxes.

The three cash prizes, one randomly inserted into each box, are 50p, £1 and £10,000. You pick a box, let's say box two, and the dreaded telephone rings.

The Banker tempts you with an offer but this one is unusual. Box three is opened in front of you revealing the £1 prize, and he offers you the chance to change your mind and choose box one. Does switching improve your chances of winning the £10,000?

Each year at my university we hold open days for hordes of keen A-level students. We want to sell them a place on our mathematics degree, and I unashamedly have an ulterior motive - to excite the best students about probability using this problem, usually referred to as the Monty Hall Problem.

People just can't seem to wrap their heads around it.

This mind-melter was alluded to in an AL Kennedy piece on change this week and dates back to Steve Selvin in 1975, when it was published in the academic journal The American Statistician.

It imagines a TV game show not unlike Deal or No Deal in which you choose one of three closed doors and win whatever is behind it.

One door conceals a Cadillac - behind the other two doors are goats. The game show host, Monty Hall (of Let's Make a Deal fame), knows where the Cadillac is and opens one of the doors that you did not choose. You are duly greeted by a goat, and then offered the chance to switch your choice to the other remaining door.

Most people will think that with two choices remaining and one Cadillac, the chances are 50-50.

The most eloquent reasoning I could find is from Emerson Kamarose of San Jose, California (from the Chicago Reader's Straight Dope column in 1991): "As any fool can plainly see, when the game-show host opens a door you did not pick and then gives you a chance to change your pick, he is starting a new game. It makes no difference whether you stay or switch, the odds are 50-50."

But the inconvenient truth here is that it's not 50-50 - in fact, switching doubles your chances of winning. Why?

Let's not get confused by the assumptions. To be clear, Monty Hall knows the location of the prize, he always opens a different door from the one you chose, and he will only open a door that does not conceal the prize.

For the purists, we also assume that you prefer Cadillacs to goats. There is a beautiful logical point here and, as the peddler of probability, I really don't want you to miss it.

In the game you will either stick or switch. If you stick with your first choice, you will end up with the Caddy if and only if you initially picked the door concealing the car. If you switch, you will win that beautiful automobile if and only if you initially picked one of the two doors with goats behind them.

If you can accept this logic then you're home and dry, because working out the odds is now as easy as pie - sticking succeeds 1/3 of the time, while switching works 2/3 of the time.
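If you would rather see the logic in action than take it on trust, a quick simulation bears it out. The sketch below (in Python, not part of the original article) plays the game many times under the stated assumptions: the host knows where the car is, always opens a door you did not choose, and never reveals the car.

```python
import random

def play(switch: bool) -> bool:
    """Play one round of Monty Hall; return True if the car is won."""
    doors = [0, 1, 2]
    car = random.choice(doors)    # car placed uniformly at random
    pick = random.choice(doors)   # contestant's initial choice
    # Host opens a door that is neither the contestant's pick nor the car
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Move to the one remaining closed door
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

trials = 100_000
stick_rate = sum(play(False) for _ in range(trials)) / trials
switch_rate = sum(play(True) for _ in range(trials)) / trials
print(f"stick ~ {stick_rate:.3f}, switch ~ {switch_rate:.3f}")
```

Run it and the stick rate settles near 1/3 while the switch rate settles near 2/3, just as the argument above predicts.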

Kamarose was wrong because he fell for the deception - after opening the door, the host is not starting a new 50-50 game. The actions of the host have already stacked the odds in favour of switching.

The mistake is to think that two choices always means a 50-50 chance. If Manchester United play Accrington Stanley in the Cup then, with the greatest respect to proud Stanley, it's more likely that United will progress to the next round.

Still not convinced? You are in good company. The paradox of the Monty Hall Problem has been incredibly powerful, busting the brains of scientists since 1975.

In 1990 the problem and a solution were published in Parade magazine in the US, generating thousands of furious responses from readers, many with distinguished scientific credentials.

Part of the difficulty was that, as usual, there was fault on both sides as the published solution was arguably unclear in stating its assumptions. Subtly changing the assumptions can change the conclusion, and as a result this topic has attracted sustained interest from mathematicians and riddlers alike.

Even Paul Erdős, an eccentric and brilliant Hungarian mathematician and one-time guest lecturer at Manchester, was taken in.

So what happens on our university's open days? We do a Monty Hall flash mob. The students split into hosts and contestants and pair up. While the hosts set up the game, half the contestants are asked to stick and the other half to switch.

The switchers are normally roughly twice as successful. Last time we had 60 pairs: in 30 of them the contestants were always stickers, and in the other 30 always switchers:

• Among the 30 switcher contestants, the Cadillac was won 18 times out of 30 - a strike rate of 60%
• Among the 30 sticker contestants, there were 11 successes out of 30, a strike rate of about 37%

So switching proved to be nearly twice as successful in our rough and ready experiment and I breathed a sigh of relief.

I had calculated beforehand the chances of ending up with egg on my face and the team of 30 stickers beating the 30 switchers. It was a risk worth taking, but one shouldn't play Russian Roulette too often.
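That "egg on my face" chance can be worked out exactly. The sketch below (my own reconstruction, not from the article) models the sticker wins as Binomial(30, 1/3) and the switcher wins as an independent Binomial(30, 2/3), and counts embarrassment as the stickers doing at least as well as the switchers; both modelling choices are assumptions.

```python
from math import comb

def binom_pmf(n: int, k: int, p: float) -> float:
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n = 30
# Sticker wins ~ Bin(30, 1/3); switcher wins ~ Bin(30, 2/3), independent
p_embarrassment = sum(
    binom_pmf(n, s, 1/3) * binom_pmf(n, w, 2/3)
    for s in range(n + 1)
    for w in range(n + 1)
    if s >= w  # stickers do at least as well as switchers
)
print(f"P(stickers >= switchers) ~ {p_embarrassment:.4f}")
```

The probability comes out well under 1%, so the demonstration was indeed a safe bet.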

Next year we will need something different, perhaps the false positive paradox. Imagine that 1% of people have a certain disease.

A diagnostic test has been developed which performs as follows - if you have the disease, the test has a 99% chance of giving the result "positive", while if you do not have the disease, the test has 2% chance of (falsely) giving the result "positive".

A randomly chosen person takes the test. If they get the result "positive", what is the probability that they actually have the disease? The answer, 1/3, is perhaps surprisingly low.
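The surprising answer drops straight out of Bayes' theorem. A minimal sketch (mine, not the article's) using the numbers given above:

```python
# Bayes' theorem: P(disease | positive test)
p_disease = 0.01             # prevalence: 1% of people have the disease
p_pos_given_disease = 0.99   # test detects the disease 99% of the time
p_pos_given_healthy = 0.02   # false positive rate for healthy people

# Total probability of testing positive
p_positive = (p_disease * p_pos_given_disease
              + (1 - p_disease) * p_pos_given_healthy)

p_disease_given_positive = p_disease * p_pos_given_disease / p_positive
print(p_disease_given_positive)  # 0.0099 / 0.0297 = 1/3
```

The intuition: in a group of 10,000 people, about 99 sick people test positive, but so do about 198 healthy ones, so only a third of the positives are genuine.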