Tuesday, September 9, 2008

The Monty Hall Problem Explained! Clearly explained!

I finally came across a clear explanation of "The Monty Hall Problem" that explains why you should always change your pick.

The "Monty Hall Problem" is:
[Y]ou are given the opportunity to select one closed door of three, behind one of which there is a prize. The other two doors hide “goats” (or some other such “non–prize”), or nothing at all. Once you have made your selection, Monty Hall will open one of the remaining doors, revealing that it does not contain the prize. He then asks you if you would like to switch your selection to the other unopened door, or stay with your original choice. Here is the problem: Does it matter if you switch?
The obvious answer is "No": while you originally had a 1 in 3 chance of picking correctly, after Monty opens one of the remaining doors the odds seem to improve to 50/50, so it shouldn't matter whether you switch or not.

The thinking that leads you to that conclusion is in fact wrong, and by switching you actually increase your odds of winning to 2 in 3. The link above gives a nice readable explanation of how this works, but I just want to cheat here and give you an example that I think will help you more quickly grasp how making the switch improves your odds.

Suppose that instead of three doors there were a hundred, so your odds of originally picking the right door would be 1/100. Now Monty has Carol or Vanna or whoever open all but one of the remaining doors, leaving just two: the one you chose and one other. The chance that you originally chose the right door was 1 in 100, but now that there are only two unopened doors, do you really think your odds have somehow magically improved to 50/50? It's the same principle whether you started with a hundred doors, fifty, twenty, or...three.
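If the argument still feels slippery, here's a quick simulation sketch (in Python; the function name, door counts, and trial count are just my choices for illustration) that plays the game many times with each strategy and counts the wins:

import random

def play(num_doors, switch):
    """Play one round; return True if the contestant wins the prize."""
    prize = random.randrange(num_doors)
    pick = random.randrange(num_doors)
    # Monty opens all but one of the remaining doors: if the first pick was
    # wrong, the door he leaves closed must be the prize door; if the first
    # pick was right, he leaves a random goat door closed.
    if pick == prize:
        remaining = random.choice([d for d in range(num_doors) if d != pick])
    else:
        remaining = prize
    final = remaining if switch else pick
    return final == prize

trials = 100_000
for doors in (3, 100):
    stay = sum(play(doors, False) for _ in range(trials)) / trials
    swap = sum(play(doors, True) for _ in range(trials)) / trials
    print(f"{doors} doors: stay wins {stay:.3f}, switch wins {swap:.3f}")

With 3 doors the switch strategy wins about two times in three, and with 100 doors it wins about 99 times in 100, which is exactly the intuition above.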

So, are you going to switch?

Deal or no deal?

(Unrelated side note: Looking at Carol Merrill's bio I see that she was born about 30 miles from where I grew up. Small world.)

3 comments:

astrobe said...

The best way to convince oneself that it is better to switch is to calculate the average gain with and without switching. The average gain is the sum of the gains over all possibilities, divided by the number of possibilities. Quickly:

- without switching: one wins 1 time in 3 = 33%

- when one always switches:

if one picks the prize, one switches away from it and gains 0.
if one picks a goat, one switches and gains the prize (1).
if one picks the "other" goat, one switches and again gains (1).
2 wins in 3 cases = 66%

- when one never switches:
if one picks the prize, one gains 1.
if one picks a goat, Monty reveals the second goat and one gains 0.
if one picks the "other" goat, same thing, one gains 0.
That again leaves a 33% average chance of winning.

When one sometimes switches, of course, the average chance of winning falls between 33% and 66% - in any case less than with the "always switch" strategy, which is therefore the best one.

Feel free to rephrase in plain and correct English.
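In code, that enumeration comes out the same way (a small Python sketch; the case labels are just illustrative):

from fractions import Fraction

# The first pick lands, with equal probability, on the prize or on one of
# the two goats. Staying wins only in the first case; switching wins in
# the other two.
cases = ["prize", "goat 1", "goat 2"]
stay = Fraction(sum(pick == "prize" for pick in cases), len(cases))
switch = Fraction(sum(pick != "prize" for pick in cases), len(cases))
print("always stay:  ", stay)    # 1/3
print("always switch:", switch)  # 2/3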

Anonymous said...

I understand the math behind the Monty Hall problem. Let me pose another: suppose you are on Let's Make a Deal. You pick door #1 of 3. Monty opens door 3 and there is a goat. Now you have the chance to switch, but I come on stage and shoot you. Now Monty asks me to pick a door. Do I have an advantage by picking door 3?

Anonymous said...

There's a good explanation of the Monty Hall Problem on this blog, Whiz Think.
http://whizthink.blogspot.com/2011/07/monty-hall-problem.html