View Full Version : Quirky math.

Indon

2007-07-09, 11:50 AM

In the spirit of the 2=1 thread, I present to you a perfectly logical and yet odd bit of probability.

You're on a game show and given the choice of three doors. Behind one of the doors is a big prize (be it a new car, luxury getaway, lifetime supply of xp-in-a-can, whatever) and behind the other two are hungry velociraptors (it's a Japanese game show).

After picking a door, the game show host (from behind his raptorproof shield) opens up one of the _raptor_ doors, and trained handlers drag the raptor off. Now there are only two doors, and the host asks if you want to stick with your old choice, or switch to the remaining door.

The question: What impact would switching doors have on your chances to get the prize?

Answer and reference:

It increases your chances from 33% to 66%.

Applicable link enclosed. (http://mathworld.wolfram.com/MontyHallProblem.html)

Post other math quirks on this thread, we'll start something that can get anyone's head scratching.
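For the skeptical, the jump from 33% to 66% is easy to verify by brute force. A minimal Monte Carlo sketch (Python; the function and variable names are my own, not from the linked page):

```python
import random

def play(switch, trials=100_000):
    """Simulate the game show; return the fraction of wins for the given strategy."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)          # door hiding the prize
        pick = random.randrange(3)           # contestant's first choice
        # The host opens a door that is neither the pick nor the prize.
        opened = next(d for d in range(3) if d != pick and d != prize)
        if switch:
            # Switching means taking the one remaining unopened door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += pick == prize
    return wins / trials

print(f"stay:   {play(switch=False):.3f}")   # close to 1/3
print(f"switch: {play(switch=True):.3f}")    # close to 2/3
```

Run it a few times: staying hovers around 0.333 and switching around 0.667.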

I'm da Rogue!

2007-07-09, 05:39 PM

I can't think of any right now, but I saw this thread dangerously rolling toward the bottom of the page, and I thought it's far too nice a topic to be forgotten :smallfrown:

Come on people, give us your best!

Lucky

2007-07-09, 05:43 PM

That's wrong.

Doesn't it increase your chances to 50% since you now only have two choices, the third having been eliminated? Apologies for wordiness, I'm a Classics major and have been translating Greek all day. Also apologies if I'm completely off.

Cheers,

Syka

Lucky

2007-07-09, 05:58 PM

It doesn't increase anything by re-guessing. After the first door with the raptors is opened, your chances have increased to 50%. Changing will keep it at 50%. No improvement.

It's like flipping a coin twice. You flip a coin, it lands on heads. Now you flip another, what are the chances of it being tails? Would it not improve since you flipped heads last time?

No, it would not. It would still be 50%.

The probability of guessing right is equal to the number of right choices available, divided by the number of choices, times 100%.

1/2 x 100%= 50%

That's how it works.

I'm da Rogue!

2007-07-09, 05:59 PM

That's wrong.

Ahhh...

OK. I'll write something. It's 1:53 in the morning here, but OK.

Riddle:

You have three vases: one vase containing two white pearls, one vase containing one white and one black pearl, and one vase containing two black pearls. From one of these vases, a pearl is taken. This pearl turns out to be white. What is the probability that the other pearl in the same vase is also white?

The answer:There are three pearls that can be the white pearl that was taken from the chosen vase:

* The first pearl from the vase with two white pearls: in this case, the other pearl is also white.

* The second pearl from the vase with two white pearls: in this case, the other pearl is also white.

* The white pearl from the vase with one white and one black pearl: in this case, the other pearl is black.

The probability that the other pearl in the same vase is also white, is therefore 2/3.
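If counting cases feels slippery, the 2/3 also falls out of brute force; a minimal Monte Carlo sketch (Python, vase contents as in the riddle):

```python
import random

# The three vases: two white, mixed, two black.
vases = [["white", "white"], ["white", "black"], ["black", "black"]]

white_first = both_white = 0
for _ in range(100_000):
    vase = random.choice(vases)[:]       # pick a vase at random (copy it)
    random.shuffle(vase)                 # draw one of its two pearls at random
    drawn, other = vase
    if drawn == "white":                 # condition on the drawn pearl being white
        white_first += 1
        both_white += other == "white"

print(both_white / white_first)          # close to 2/3, not 1/2
```

The conditioning step is the whole trick: games where a black pearl came out first simply never reach the question.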

The Great Skenardo

2007-07-09, 05:59 PM

Actually, it's right. I can sort of put together an intuitive proof for you to follow along, if you like.

There are only three configurations the doors can be in:

(:smallsmile: = Quesadillas n' Root Beer

:smalleek: = Raptors)

You can have

1__2__3

:smallsmile: :smalleek: :smalleek:

:smalleek: :smallsmile: :smalleek:

:smalleek: :smalleek: :smallsmile:

and no others, right?

Let's say you pick door #3, initially. The host reveals that Door #1 has a raptor behind it. Now, this tells you that there are two possible configurations, and it's this that gives the impression that you have 50/50 chances. Two options, so an equal chance, right?

Well, not quite. If you pick a door at random at the beginning, you have a 2/3 chance of picking a raptor, right? If you select a raptor, then the host must open the door revealing the last raptor, meaning the other door has the quesadillas and root beer behind it. If you switch doors in these two cases, you get the food!

If, however, you pick the correct door right off (a 1/3 chance), then the host reveals a raptor, but the other one is still hidden. If you switch, you will get it and die.

Make sense?

Skippy

2007-07-09, 06:00 PM

No, it doesn't... There are two doors now, but remember that those doors are part of a set of three. Switching now wins 2/3 of the time. When you had all three doors, you only had a 1/3 chance to get the right door...

Lucky

2007-07-09, 06:09 PM

Actually, it's right. I can sort of put together an intuitive proof for you to follow along, if you like.

There are only three configurations the doors can be in:

(:smallsmile: = Quesadillas n' Root Beer

:smalleek: = Raptors)

You can have

1__2__3

:smallsmile: :smalleek: :smalleek:

:smalleek: :smallsmile: :smalleek:

:smalleek: :smalleek: :smallsmile:

and no others, right?

Let's say you pick door #3, initially. The host reveals that Door #1 has a raptor behind it. Now, this tells you that there are two possible configurations, and it's this that gives the impression that you have 50/50 chances. Two options, so an equal chance, right?

Well, not quite. If you pick a door at random at the beginning, you have a 2/3 chance of picking a raptor, right? If you select a raptor, then the host must open the door revealing the last raptor, meaning the other door has the quesadillas and root beer behind it. If you switch doors in these two cases, you get the food!

If, however, you pick the correct door right off (a 1/3 chance), then the host reveals a raptor, but the other one is still hidden. If you switch, you will get it and die.

Make sense?

Hmm. So it seems I am wrong. Very well explained good sir! Have a cookie!

@V Simu-pwned 3 times over. :smalltongue:

Silkenfist

2007-07-09, 06:19 PM

It doesn't increase anything by re-guessing. After the first door with the raptors is opened, your chances have increased to 50%. Changing will keep it at 50%. No improvement.

It's like flipping a coin twice. You flip a coin, it lands on heads. Now you flip another, what are the chances of it being tails? Would it not improve since you flipped heads last time?

No, it would not. It would still be 50%.

The probability of guessing right is equal to the number of right choices available, divided by the number of choices, times 100%.

1/2 x 100%= 50%

That's how it works.

Uhmm...no. And I am willing to bet you large amounts of money on it.

Long explanation

The problem is that you are ignoring one part of the game show. It is correct that in the final situation there are two doors with one prize and one raptor. If you had not been given any information earlier, it would come down to the flip of a coin. But - fortunately - you have been given information. You have been shown one door that does not hide the prize.

How does this affect the game? It becomes evident if you rephrase the problem. You have one door that was chosen earlier. Now, if there are only two options, switching from one option to the other will invert your chances.

You have chosen your door at the beginning of the show, when there were still three options present - two raptors, one prize. The chance of choosing correctly is - under those circumstances - 1/3. After you have chosen your door, the host opens another door and shows a raptor. Note that this doesn't affect the chance of your current door being the correct one (the tricky bit). The point is that the game host will ALWAYS show you a raptor, since he will ALWAYS have a door with a raptor left. There is no case whatsoever when - at this point - he will not be able to reveal a raptor.

Previously the chances were 1/3 "I have the correct door" and 2/3 "The correct door is one of the two others"

After revealing a raptor, this has not changed. The chances are still 1/3 "I have the correct door" and 2/3 "One of the others is correct"

Now we return to the final stage of the game: two doors left. But the choice of doors left is already affected by the earlier part of the game. The door you had chosen earlier has a chance of 1/3 to be correct. The other one has the inverse chance, read: 2/3.

Short explanation (or if you are still not believing me):

We have three doors: A, B and C. Let's say C has the prize and A & B have raptors. Now we start the game show and bring in one candidate. Let's say he chooses his door randomly.

1/3 chance, he chooses door A

1/3 chance, he chooses B

1/3 chance, he chooses C

Now let's calculate his winning chances if he chooses to keep his previous option.

IF he chose A earlier, the host opens B and reveals a raptor. Our candidate will keep his guess and be ghastly devoured.

IF he chose B, the host opens A. The candidate keeps his guess and is once again eaten.

IF he chose C, the host opens either A or B. Either way, the candidate keeps his guess and gets the prize.

Each of these three events has an equal probability - 1/3. In one case, the strategy "keep" wins the prize; in two cases, it loses. Thus, keeping your door has a success rate of exactly 1/3. Choosing the other door, accordingly, has a chance of 2/3.
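That case analysis can be checked line by line in code; a minimal sketch (Python, with the same door names and the prize behind C):

```python
# Exhaustively check the three cases: prize behind C, raptors behind A and B.
doors = ["A", "B", "C"]
prize = "C"

keep_wins = switch_wins = 0
for first_pick in doors:                 # each first pick is equally likely
    # The host opens an unpicked raptor door (first one, if both qualify).
    opened = next(d for d in doors if d != first_pick and d != prize)
    # Switching means taking the door that is neither picked nor opened.
    switched = next(d for d in doors if d != first_pick and d != opened)
    keep_wins += first_pick == prize
    switch_wins += switched == prize

print(keep_wins, "of 3 cases win by keeping")      # 1
print(switch_wins, "of 3 cases win by switching")  # 2
```

One keep-win against two switch-wins, exactly the 1/3 vs 2/3 split above.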

The Great Skenardo

2007-07-09, 06:34 PM

Mmm....cookie ^_^

Gygaxphobia

2007-07-09, 06:49 PM

Ahhh...

OK. I'll write something. It's 1.53 in the morning here, but OK.

Riddle:

You have three vases: one vase containing two white pearls, one vase containing one white and one black pearl, and one vase containing two black pearls. From one of these vases, a pearl is taken. This pearl turns out to be white. What is the probability that the other pearl in the same vase is also white?

Surely not! If you take the white pearl out and do not replace it, you have to have a 50:50 chance.

Pyrian

2007-07-09, 06:59 PM

Most of this "math" is so chock full of undeclared assumptions that no real correct answer can be given.

Let's take the gameshow example. In the scenario as presented, on no account should you switch your choice! You guessed right the first time. You know this is true because if you had guessed a raptor, they'd've simply let it out and you'd be pet food by now. The host is only trying to convince you to switch because he knows you picked right on the first try.

Again with the pearls: if the pearl were picked randomly and turned out to be white, or if a random white pearl were picked, then yes, there's a 2/3 chance the other pearl in the same vase is white. On the other hand, if a white pearl was selected from a random white-pearl-containing vase, then the odds are 50%.
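The hidden-assumption point is easy to make concrete for the doors: change what the host knows and the answer changes. A sketch (Python; names are mine) comparing an informed host with an ignorant one who opens a random unpicked door, throwing out the games where the prize gets revealed:

```python
import random

def switch_win_rate(host_knows, trials=200_000):
    """Win rate for switching, conditioned on a raptor having been revealed."""
    wins = games = 0
    for _ in range(trials):
        prize = random.randrange(3)
        pick = random.randrange(3)
        if host_knows:
            # Informed host: always opens an unpicked raptor door.
            opened = next(d for d in range(3) if d != pick and d != prize)
        else:
            # Ignorant host: opens a random unpicked door, maybe the prize.
            opened = random.choice([d for d in range(3) if d != pick])
            if opened == prize:
                continue                 # discard games where the prize showed
        games += 1
        final = next(d for d in range(3) if d != pick and d != opened)
        wins += final == prize
    return wins / games

print(f"informed host: {switch_win_rate(True):.3f}")   # close to 2/3
print(f"ignorant host: {switch_win_rate(False):.3f}")  # close to 1/2
```

Same final picture (two doors, one raptor revealed), different probabilities, purely because of what the host was assumed to know.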

Starla

2007-07-09, 07:00 PM

I am a little confused by the original post. You are saying that the gameshow host had you pick a door first. Then opened one of the other doors? Then he gives you a chance to switch?

Okay, the vase one is more clear. 3 vases, and one white pearl was removed. Okay, that excludes the vase with 2 black pearls, since it would have 0 white to remove. That leaves the other 2 vases: one had 2 white, and the other had a white and a black. So it is a 50% chance that the second pearl is also white... That is my final answer. If I am wrong, give me a high school teacher's explanation, because the others were hard to follow.

I think High school trig was the last time I remember looking at ratios.

This is fun. Do another one.

The Great Skenardo

2007-07-09, 07:05 PM

@Starla:

This is the famous "Monty Haul" problem, in which one door hides a fabulous prize, and the other two hide lesser prizes. The contestant picks a door. Then, the host reveals one of the unpicked doors to have the lesser prize. You then have the option of switching your choice to the other unopened door.

The Host always reveals a lesser prize.

MeklorIlavator

2007-07-09, 07:08 PM

Most of this "math" is so chock full of undeclared assumptions that no real correct answer can be given.

Let's take the gameshow example. In the scenario as presented, on no account should you switch your choice! You guessed right the first time. You know this is true because if you had guessed a raptor, they'd've simply let it out and you'd be pet food by now. The host is only trying to convince you to switch because he knows you picked right on the first try.

The question actually has a real-world birthplace. It's from the game show Let's Make a Deal, which had rules that prevented killing the contestants, so the problem actually works.

Silkenfist

2007-07-09, 07:38 PM

I am a little confused by the original post. You are saying that the gameshow host had you pick a door first. Then opened one of the other doors? Then he gives you a chance to switch?

Okay, the vase one is more clear. 3 vases, and one white pearl was removed. Okay, that excludes the vase with 2 black pearls, since it would have 0 white to remove. That leaves the other 2 vases: one had 2 white, and the other had a white and a black. So it is a 50% chance that the second pearl is also white... That is my final answer. If I am wrong, give me a high school teacher's explanation, because the others were hard to follow.

I think High school trig was the last time I remember looking at ratios.

This is fun. Do another one.

Oh Thor, please let me be not simu'd on this one....

I'll contradict once again. The chance is 2/3. How so?

Let's alter the situation slightly to make it more transparent. I am writing letters on the pearls, naming the white ones A, B and C and the black ones X, Y and Z. Then I put on a blindfold, shuffle them around, and place them in the vases in the pattern written above.

Now I shuffle the vases around and draw one pearl from any of them. I remove the blindfold and see: It is the white pearl A. Let's have a look at the possible options now. There are three possibilities for me to distribute two white pearls in one vase:

1/3 chance of A&B being in one vase, C in the other one.

1/3 chance of A&C being in one vase, B in the other one.

1/3 chance of B&C being in one vase, A in the other one.

I have picked A, so in the first two cases I will draw another white pearl. Only in the third case does A share a vase with a black pearl. Thus, the probability of drawing a white pearl again is 2/3.
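The labelling argument also enumerates neatly; a minimal sketch (Python; white pearls A, B, C and black pearls X, Y, Z, with the list layout mine):

```python
# Silkenfist's labelling, run exhaustively: every one of the six pearls
# is an equally likely first draw.
vases = [["A", "B"], ["C", "X"], ["Y", "Z"]]
white = {"A", "B", "C"}

white_draws = white_pairs = 0
for pearls in vases:
    for i, drawn in enumerate(pearls):
        if drawn in white:               # condition on drawing a white pearl
            white_draws += 1
            white_pairs += pearls[1 - i] in white

print(white_pairs, "out of", white_draws)   # 2 out of 3
```

Three equally likely white first-draws, and in two of them the vase-mate is also white.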

Pyrian

2007-07-09, 08:31 PM

Let's alter the situation slightly to make it more transparent.

And, y'know, get the answer you're looking for. :smallcool:

waffletaco

2007-07-09, 11:29 PM

My friends who took statistics gave me a similar problem. They told me that by switching doors, your chances are 66% to get it right. I probably wouldn't bet money on that, though. Sure, you have an advantage if you switch, but a 33% chance to fail is too big for me.

MeklorIlavator

2007-07-09, 11:35 PM

My friends that were in statistics gave me a similar problem. They told me that by switching doors, your chances are 66% to get it right. I probably wouldn't bet money on that though. Sure you have an advantage to switch, but 33% to fail is too big for me.

If you stay, you have a 66% chance to fail, which is greater than 33% last time I checked.

Sir_Norbert

2007-07-10, 07:01 AM

And, y'know, get the answer you're looking for. :smallcool:

No, the point is that labelling the pearls can't change anything relevant to the original question. Therefore the probability in the original question is also 2/3. Don't try to be a smart-ass when you don't know what you're talking about.

Starla, perhaps this explanation will help.

It's all about conditional probability -- you didn't know beforehand that you would pick a white pearl, but given that you have, what is the probability that the other pearl in the same vase is white? Well, to begin with there are six equally probable outcomes (the six pearls). Three of these involve you picking a white pearl. Two of those involve you picking a white pearl from the vase with two white pearls. Therefore the probability of this happening, given that you picked a white pearl, is two out of three.

It doesn't increase anything by re-guessing. After the first door with the raptors is opened, your chances have increased to 50%. Changing will keep it at 50%. No improvement.

It's like flipping a coin twice. You flip a coin, it lands on heads. Now you flip another, what are the chances of it being tails? Would it not improve since you flipped heads last time?

No, it would not. It would still be 50%.

The probability of guessing right is equal to the number of right choices available, divided by the number of choices, times 100%.

1/2 x 100%= 50%

That's how it works.

I can't present you any numbers, but you are wrong. This is one of the few things I remembered from high school statistics, because it struck me as weird.

KoDT69

2007-07-10, 08:07 AM

Actually, it's right. I can sort of put together an intuitive proof for you to follow along, if you like.

There are only three configurations the doors can be in:

(:smallsmile: = Quesadillas n' Root Beer

:smalleek: = Raptors)

You can have

1__2__3

:smallsmile: :smalleek: :smalleek:

:smalleek: :smallsmile: :smalleek:

:smalleek: :smalleek: :smallsmile:

and no others, right?

Let's say you pick door #3, initially. The host reveals that Door #1 has a raptor behind it. Now, this tells you that there are two possible configurations, and it's this that gives the impression that you have 50/50 chances. Two options, so an equal chance, right?

Well, not quite. If you pick a door at random at the beginning, you have a 2/3 chance of picking a raptor, right? If you select a raptor, then the host must open the door revealing the last raptor, meaning the other door has the quesadillas and root beer behind it. If you switch doors in these two cases, you get the food!

If, however, you pick the correct door right off (a 1/3 chance), then the host reveals a raptor, but the other one is still hidden. If you switch, you will get it and die.

Make sense?

This only works when you factor the starting number into your statistics. The problem is that either door now has a 66% chance to be the winner in that respect. Sure, say all you want "at the beginning you picked 1 of 3 doors, so that's 33%," but door #3 gets eliminated and now you're down to 2 doors. It doesn't matter now, because the divisor of the problem has dropped from 3 to 2, giving a 50/50 chance. But if you count the eliminated door #3, then each has a 66% chance in that sense, but that's not realistic. Why is an eliminated door still being calculated? The problem has changed. Here, look at this from Skenardo's example:

33% chance

:smallsmile: :smalleek: :smalleek:

:smalleek: :smallsmile: :smalleek:

:smalleek: :smalleek: :smallsmile:

By door #3 being eliminated, it removes the 3rd row down AND the 3rd column to the right leaving a 50% chance regardless if you switch:

:smallsmile: :smalleek:

:smalleek: :smallsmile:

If you still leave in a configuration other than those, it's not realistic. Configuration 3 was clearly not right altogether, having been eliminated, as was door #3 from the first 2 configurations. Can anybody really justify the chance to pick a door that's already been revealed a loser?

Azrael

2007-07-10, 08:12 AM

This is the famous "Monty Haul" problem.

No, the Monty Haul problem involves tearing through dungeons and not having sufficient carrying capacity for all the loot, raising the issue of which items to leave behind before you've properly identified them.

This is the Monty Hall Problem.

KoDT69

2007-07-10, 08:13 AM

Ahhh...

Riddle:

You have three vases: one vase containing two white pearls, one vase containing one white and one black pearl, and one vase containing two black pearls. From one of these vases, a pearl is taken. This pearl turns out to be white. What is the probability that the other pearl in the same vase is also white?

The answer:There are three pearls that can be the white pearl that was taken from the chosen vase:

* The first pearl from the vase with two white pearls: in this case, the other pearl is also white.

* The second pearl from the vase with two white pearls: in this case, the other pearl is also white.

* The white pearl from the vase with one white and one black pearl: in this case, the other pearl is black.

The probability that the other pearl in the same vase is also white, is therefore 2/3.

This quirky answer is also incorrect for a good reason. Now that vase #3 is gone, so are 2 of the pearls. That leaves 4 total in which one white has been pulled so there are 2 more white and a black pearl between the 2 remaining vases. The chance would be 2/3 ONLY IF you chose another pearl from either vase. If you chose from the same vase you have a 50% chance to get the black one. Remember, you are pulling the second pearl from the SAME vase, which also eliminates another vase with its 2 pearls.

The Great Skenardo

2007-07-10, 08:17 AM

@KODT

I think it's still a relevant part of the problem, because there's a 2/3 chance (taken altogether) that the door revealed will be the other raptor door, meaning that the unopened door has the :smallsmile: behind it.

You don't have any way of knowing what will be behind the door you choose at the outset. If you only knew that each door may have either :smalleek: or :smallsmile: behind it, without knowing how many of each exist, then yes, the problem comes down to a 50/50 chance. But since you know that there are two raptor doors and only one :smallsmile: door, the revelation of one raptor door has a very large impact on the problem. It means that you've either chosen a raptor door to begin with, or that you've chosen the reward door.

But the chances of these two events happening are not equal. You're twice as likely to pick a raptor door at the outset as you are to pick the reward door. Therefore, your best chance comes from switching doors.

No, the Monty Haul problem involves tearing through dungeons and not having sufficient carrying capacity for all the loot, raising the issue of which items to leave behind before you've properly identified them.

This is the Monty Hall Problem.

Indeed. I exist corrected.

Attilargh

2007-07-10, 08:21 AM

You have a one in three chance to pick the right one from the get-go. Then one of the wrong doors is opened.

Now, you've either picked the right door and should not switch, OR picked the wrong door #1 and should switch, OR picked the wrong door #2 and should switch.

Yes, you either have the wrong door or you don't, but you're more likely to have the wrong door to begin with.

Ikkitosen

2007-07-10, 08:28 AM

I'll just add my voice to the "yet another way to explain why you should swap doors" thing...

You pick one door from three. There is a 1/3 chance you have the right door. So there is a 2/3 chance that the correct door is in the other doors, which we'll group into a set of 2.

Now, if the quizmaster offered you your door or both of the other 2 - and all you have to do is find the prize, not avoid any "raptors" - then you'd take the other 2 since there's a 2/3 chance the prize is in there. And with the quizmaster automatically eliminating one wrong door from the set of 2, they effectively become one choice with a 2/3 chance of being correct.

The acid test: Do it. Get 3 pieces of paper and mark one on the back. Do the quiz: pick one, have a friend look and eliminate one blank one from the other 2, and then look to see where the winner is. Within 10 tries you'll notice a 2/3 to 1/3 win ratio of swap to don't swap. I have done it, I know :smallsmile:

KoDT69

2007-07-10, 08:28 AM

But it still comes down to a 50/50 chance because door #3 is a loser, so if you pick #1 to start with it HAD a 33% chance before door #3 was eliminated. If you count that eliminated door your win chance becomes 66% as well. All doors have an equal chance to be a winner. Why would it matter if you changed the choice, there are only 2 doors left. I just really don't understand why the 3rd door is calculated with a changed door choice, but not the option to stay?

Let's scale it up double to make a dice reference. Instead of 1-2-3 let's make it 1-2, 3-4, and 5-6 and start with a d6 dice. You roll and get a 1 or 2. The game host now says since you rolled low, he will decrease the dice size to a d4. You have to roll it again. He gives you the choice of rolling for a target of 1-2 or 3-4 on the d4. You have a 50% chance either way regardless if you pick the 1-2 that you got on the first round.

The real issue is that regardless of what you picked to start, the division factor changes the equation when one is eliminated. Your 33% increases to 50%. If you cut a pie into 3 slices and Skenardo eats slice #3, you indeed have no chance of getting the other 2 slices when you only chose 1 of the 3 to begin with.

Ikkitosen

2007-07-10, 08:34 AM

But it still comes down to a 50/50 chance because door #3 is a loser, so if you pick #1 to start with it HAD a 33% chance before door #3 was eliminated. If you count that eliminated door your win chance becomes 66% as well. All doors have an equal chance to be a winner. Why would it matter if you changed the choice, there are only 2 doors left. I just really don't understand why the 3rd door is calculated with a changed door choice, but not the option to stay?

Let's scale it up double to make a dice reference. Instead of 1-2-3 let's make it 1-2, 3-4, and 5-6 and start with a d6 dice. You roll and get a 1 or 2. The game host now says since you rolled low, he will decrease the dice size to a d4. You have to roll it again. He gives you the choice of rolling for a target of 1-2 or 3-4 on the d4. You have a 50% chance either way regardless if you pick the 1-2 that you got on the first round.

Sorry mate, not the same problem at all. I have met people before that couldn't or wouldn't believe/understand the answer of 1/3:2/3, so I suggest you go and try it, accept that it is true, and then try to understand it without the niggling worry that it's the wrong answer.

EDIT2: Note that the chance would be 50:50 if the quizmaster took away one random box from the two you didn't choose, but since they take away a guaranteed loser, you have the situation where they're effectively saying "if the winner is in these 2, I guarantee you'll get it if you swap, since I'll remove the losing one," and that's where the 2/3 chance comes from.

Attilargh

2007-07-10, 08:37 AM

But it still comes down to a 50/50 chance because door #3 is a loser, so if you pick #1 to start with it HAD a 33% chance before door #3 was eliminated.

But the probabilities don't change! They don't randomize the position of the raptor between choosing the door and choosing to switch; they are still where they originally were before you chose anything.

Let's put it this way: You choose one box out of three, with something inside every one of them. Ol' Monty tells you that two of those three boxes contain a booby trap, and to demonstrate, blows up one of the booby traps you didn't pick. He now gives you a chance to switch.

Do you still have only 50% chance of sitting there with a bomb on your lap?

KoDT69

2007-07-10, 08:48 AM

Yes, you still have a 50% chance, increased from the original 33%. You started with 3, but now, with the removal of one bomb, you have a 50% chance.

Ikk - I effectively showed that with Skenardo's smiley matrix, removing a guaranteed loser from the mix. The proof is in the pudding, my friend: 2 :smalleek: and 2 :smallsmile: left in the reduced matrix.

The Great Skenardo

2007-07-10, 08:49 AM

But it still comes down to a 50/50 chance because door #3 is a loser, so if you pick #1 to start with it HAD a 33% chance before door #3 was eliminated. If you count that eliminated door your win chance becomes 66% as well. All doors have an equal chance to be a winner. Why would it matter if you changed the choice, there are only 2 doors left. I just really don't understand why the 3rd door is calculated with a changed door choice, but not the option to stay?

Simply because the presence of the third door at all weighs the odds that the door you initially picked has a raptor behind it. If there were an equal chance that the first door you picked had :smalleek: or :smallsmile: behind it, then the chance would simply be 50/50, and so it wouldn't matter whether you switched or not. The act of the host revealing a raptor door doesn't eliminate the fact that you chose when there were three choices, with the raptor doors being twice as likely to be chosen.

In other words, you made your choice when it was 2/3 likely that you would choose a raptor door to begin with. Revealing the third door doesn't tell you anything about your own door's contents, save that it's still possible for either outcome to happen.

Here's how you can prove it to yourself; Imagine three contestants play this game. Let's say the set-up is identical in each case, but that they choose different doors.

1__2__3

:smallsmile: :smalleek: :smalleek:

So, player one chooses door 1. Player 2 chooses door 2, etc.

See how many of them win when the swapping strategy is used, and how many win when they stay with the door they've chosen.

Ikkitosen

2007-07-10, 08:53 AM

Let me explain again, perhaps slightly differently:

3 boxes, one is "win". You pick one.

Chance of success at this stage 1/3.

Chance of win being in one of the other 2 boxes: 2/3.

Now, different fluff, same crunch.

By removing one incorrect box from the pair of boxes, the quizmaster is effectively guaranteeing that, if you swap and the win is in one of those two boxes (a 2/3 chance), you'll get the win, since he'll throw away the non-win. If you had picked the win in the first place (1/3 chance), then unlucky, you lose.

Get it?

If you're still unconvinced, go try it.

Attilargh

2007-07-10, 08:56 AM

Yes you still have a 50% chance increased from the original 33%. You started with 3 but now with the removal of one bomb, you now have 50% chance.

You do agree, however, that originally there was a 66% chance of picking up a bomb, no? Then tell me: how does it make sense that, with the removal of that one bomb, you are now 16% safer than with the bomb on the table? How does the destruction of that one bomb somehow make it less likely that you're holding the bomb?

KoDT69

2007-07-10, 09:08 AM

How does the 3rd one exploding increase my chances of holding a bomb either?

:smallsmile: :smalleek: :smalleek: = 66% chance of an eek bomb

:smallsmile: :smalleek: = 50% chance of a smile because one eek bomb blew up

The fact is, choosing 1 of 3 means there is in fact ALWAYS a loser to expose. When it is out of the picture, there are only 2 choices left. I understand that you guys try to justify each of the initial non-chosen items having a 66% chance, but that is a combined chance. Each one HAD a 33% chance by itself to be a winner. Since you only pick 1 of the 3, there is a 66% chance that you have a loser. That's only counting both because you only get 1 choice to start. If you were to pause the game show after the 3rd door was revealed as a loser, then let a new contestant continue, he has a 50/50 chance regardless of whether he picks the same door as the first contestant. 3-1=2, and that's a fact. So you would say contestant #2 has a better chance of winning if he chooses the opposite door from the first contestant? 2 doors left, 1 winner; it seems simple enough.

Kalai_Eljahn

2007-07-10, 09:11 AM

Well, I've seen enough proofs to know it's not 50/50, but I thought, based on these arguments, that you're *more* likely to be holding a bomb?

I should link my friend here, he's ahead of me in math. (Shouldn'ta failed Calc II then skipped a year.)

KoDT69

2007-07-10, 09:16 AM

Enough proofs? So the visual I posted isn't enough of a quick check? When you reduce the number of options, the odds change. I guarantee if you had a math problem representative of this situation and guessed 66% it would be wrong. Here is how it looks:

Choices / Doors to choose from:

1 / (3-1) = 50%

You only get one door to choose. When one of the 3 options is removed, that leaves 2 choices. What's so hard to understand about that? :smallsigh:

When you change the number of choices, the results will be different.

Attilargh

2007-07-10, 09:21 AM

How does the 3rd one exploding increase my chances of holding a bomb either?

It doesn't; that's what I'm trying to argue here. No matter what, you have a 66% chance of having a bomb on your lap, whether the other bomb has been detonated or not.

It seems to me you're looking at this as three distinct outcomes, whereas the Monty Hall problem only has two. It doesn't matter if the raptor/bomb/goat is blue or black, it's still a goat and you lose if you pick it.

When you make your pick the first time, you are more likely to pick the losing option. Now, because you've most likely picked the losing option, the other remaining option is most likely the winning option. Ergo, you should switch.

Ædit: Marilyn vos Savant (http://en.wikipedia.org/wiki/Marilyn_vos_Savant) disagrees (http://en.wikipedia.org/wiki/Monty_hall_problem) with you.

Kalai_Eljahn

2007-07-10, 09:25 AM

My dear friend, I trust the detailed and sound proofs of those with doctorate degrees, *and* my science magazines, over a misleading visual. Unfortunately I could not replicate the proof any better than those here have done.

<off that particular topic>

According to my calculator, e^pi*i = e^0 = 1. Pi*i is most certainly, however, not equal to 0. Yet another example of not being able to trust '^0'.
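For what it's worth, Euler's identity pins this one down: e^(iπ) is exactly −1, so a calculator reporting 1 has almost certainly parsed the input differently than intended. A quick check, sketched here in Python for illustration:

```python
import cmath
import math

# e^(i*pi), evaluated as a genuine complex exponential.
z = cmath.exp(1j * math.pi)

print(z)  # -1, plus a tiny floating-point imaginary residue
```

The tiny imaginary part is floating-point noise from representing pi inexactly; the mathematical value is exactly −1.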

The Great Skenardo

2007-07-10, 09:31 AM

disagrees (http://en.wikipedia.org/wiki/Monty_hall_problem) with you.

D'oh. I should have pointed you there to begin with; the page has a very good explanation.

EDIT: You may appreciate this image in particular: It lays out quite methodically all possible ways the game could go:

http://upload.wikimedia.org/wikipedia/commons/thumb/9/9e/Monty_tree.svg/350px-Monty_tree.svg.png

Ikkitosen

2007-07-10, 09:34 AM

Enough proofs? So the visual I posted isn't enough of a quick check? When you reduce the number of options, the odds change. I guarantee if you had a math problem representative of this situation and guessed 66% it would be wrong. Here is how it looks:

Choices / Doors to choose from:

1 / (3-1) = 50%

You only get one door to choose. When one of the 3 options is removed, that leaves 2 choices. What's so hard to understand about that? :smallsigh:

When you change the number of choices, the results will be different.

Speaking of options, I see 2:

1. You have a fundamental misunderstanding of mathematics; you seem to think that a 2/3 chance of holding a bomb becomes a 1/2 chance when you affect something you're not holding.

2. You are messing with us; you know the answer is as we have presented and are just having us all on :smallamused:

Seriously, I shall place my reputation as a PhD on the line when I tell you that the solution I have presented is correct. Go back, re-read the explanations, act the situation out a few times. It will become obvious, and then you just need to have the "ping" moment where you get it. Seriously, I promise.

To keep things moving, can you refute the concise explanation I posted in my last post?

EDIT: Lol, Wiki ftw.

MeklorIlavator

2007-07-10, 09:37 AM

Enough proofs? So the visual I posted isn't enough of a quick check? When you reduce the number of options, the odds change. I guarantee if you had a math problem representative of this situation and guessed 66% it would be wrong. Here is how it looks:

Choices / Doors to choose from:

1 / (3-1) = 50%

You only get one door to choose. When one of the 3 options is removed, that leaves 2 choices. What's so hard to understand about that? :smallsigh:

When you change the number of choices, the results will be different.

Think of it this way. If you change, you are picking 2 doors, but if you don't change you are only picking one. This means that the odds never change because the number of choices never do.

The Great Skenardo

2007-07-10, 09:45 AM

Here's a fun one for you: Galileo's classic experiment showed empirically that two objects of dissimilar mass fall to Earth at the same rate, and indeed, in physics classes the world over, the mass of an object falling on Earth is taken to be irrelevant when considering its acceleration.

But consider this thought experiment:

A soda can weighing just a few grams is allowed to fall to Earth from a distance of 1 meter. Using the classic representation of 9.81 m/s^2 as the acceleration, we can determine how quickly the can falls to Earth.

Now consider a can of soda which (somehow) has been imbued with a mass equal to that of Mercury. This can is allowed to drop from a height of 1 meter. Will it fall at the same rate? (i.e. will it experience an acceleration of 9.81 m/s^2?)

douglas

2007-07-10, 09:46 AM

When you reduce the number of options, the odds change.

Incorrect. When you reduce the number of options and rerandomize them, the odds change. In this situation, however, nothing changes places after the original placement.

Try this logic: You go in planning from the beginning that you're going to switch. So, you pick a door at random and say "I bet that's a raptor. I'll take whichever door you do not open." The host then reveals a raptor and you pick the other door. The host's knowledge of where the raptors are has effectively allowed you to pick two doors at once.

Or, how about this: Instead of three doors there are a million doors. You pick one, the host reveals 999,998 raptors and then asks whether you want to switch to the one unopened door out of the 999,999 you didn't originally pick. If you do not switch, you are betting that your original one-in-a-million chance came up with the right door. Switching is equivalent to saying "I think my first pick (with its 1/1,000,000 odds) was wrong." A pretty safe bet, I'd say.
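douglas's million-door intuition is easy to test numerically. The following is an illustrative sketch (not from the thread) of an n-door simulation; with n doors, always-switching should win about (n−1)/n of the time:

```python
import random

def play(n_doors, switch):
    """One round of the n-door Monty Hall game. Returns True on a win."""
    prize = random.randrange(n_doors)
    pick = random.randrange(n_doors)
    # The host opens every other door except one, always avoiding the prize.
    # Staying wins only if the first pick was right; after the host's reveals,
    # the single remaining closed door holds the prize whenever it was wrong.
    if switch:
        return pick != prize
    return pick == prize

def win_rate(n_doors, switch, trials=100_000):
    return sum(play(n_doors, switch) for _ in range(trials)) / trials

print(win_rate(3, switch=True))   # close to 2/3
print(win_rate(3, switch=False))  # close to 1/3
print(win_rate(1_000_000, switch=True))  # very close to 1
```

The key modelling step is the host's constraint: he never opens the prize door, which is why "switch" collapses to "win whenever the first pick was wrong."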

The Great Skenardo

2007-07-10, 09:49 AM

*Wisdom*

Very well put! :smallsmile:

Ikkitosen

2007-07-10, 09:53 AM

Here's a fun one for you: Galileo's classic experiment showed empirically that two objects of dissimilar mass fall to Earth at the same rate, and indeed, in physics classes the world over, the mass of an object falling on Earth is taken to be irrelevant when considering its acceleration.

But consider this thought experiment:

A soda can weighing just a few grams is allowed to fall to Earth from a distance of 1 meter. Using the classic representation of 9.81 m/s^2 as the acceleration, we can determine how quickly the can falls to Earth.

Now consider a can of soda which (somehow) has been imbued with a mass equal to that of Mercury. This can is allowed to drop from a height of 1 meter. Will it fall at the same rate? (i.e. will it experience an acceleration of 9.81 m/s^2?)

Since you spelled Mercury with a capital, can we assume you mean the planet and not the metal that's liquid at STP? Plus, since this is physics, you need to tell us what assumptions we can make (like in a vacuum, neglecting air resistance, etc.)

My first instinct is that the Earth will move too due to the huge mass of the can, making things a bit more complicated.

EDIT: Great work douglas :smallsmile:

douglas

2007-07-10, 09:54 AM

But consider this thought experiment:

A soda can weighing just a few grams is allowed to fall to Earth from a distance of 1 meter. Using the classic representation of 9.81 m/s^2 as the acceleration, we can determine how quickly the can falls to Earth.

Now consider a can of soda which (somehow) has been imbued with a mass equal to that of Mercury. This can is allowed to drop from a height of 1 meter. Will it fall at the same rate? (i.e. will it experience an acceleration of 9.81 m/s^2?)

Depends. Are the two cans in a vacuum when they fall? Otherwise, air resistance will slow down the normal can a lot more than the Mercury can.

Of course, there's also the matter of relativistic effects from such a great mass, but to talk meaningfully about those you need to specify where your observer is and how fast he's moving and/or accelerating, and you need to use some equations I'm not familiar with.

Oh yeah, there's also the matter of Earth falling towards the can when it has the mass of Mercury and the gravity that comes with it.

The Great Skenardo

2007-07-10, 10:00 AM

Clarifications: we assume air resistance to be irrelevant here. If you like, the cans can be in large vacuums, as Galileo ignored air resistance as well.

And yes, I did mean the planet. If you like, you can tone back the mass of the second can to have the same mass as 1,000,000 Marlon Brandos.

But if the Earth falls towards the can as well, then isn't it only a question of scale? That is, if the Earth falls to some degree towards the Mercu-Can, then does it not also fall towards the Diet-Can? That would seem to indicate that Galileo's experiment is, strictly speaking, incorrect.

Indon

2007-07-10, 10:02 AM

Now consider a can of soda which (somehow) has been imbued with a mass equal to that of Mercury. This can is allowed to drop from a height of 1 meter. Will it fall at the same rate? (i.e. will it experience an acceleration of 9.81 m/s^2?)

It should fall slightly faster, I would think, because the mutual gravity of these two masses is much greater than that between the Earth and an object with an imperceptible gravitational field, like an ordinary soda can.

(It seems someone's already pointed out that both objects fall towards each other. Might I point out that technically, the earth still falls towards all objects falling towards it, as all mass has gravity; it's just that most things we see falling don't have significant gravity of their own.)

Am I right?

MeklorIlavator

2007-07-10, 10:02 AM

Here's a fun one for you: Galileo's classic experiment showed empirically that two objects of dissimilar mass fall to Earth at the same rate, and indeed, in physics classes the world over, the mass of an object falling on Earth is taken to be irrelevant when considering its acceleration.

But consider this thought experiment:

A soda can weighing just a few grams is allowed to fall to Earth from a distance of 1 meter. Using the classic representation of 9.81 m/s^2 as the acceleration, we can determine how quickly the can falls to Earth.

Now consider a can of soda which (somehow) has been imbued with a mass equal to that of Mercury. This can is allowed to drop from a height of 1 meter. Will it fall at the same rate? (i.e. will it experience an acceleration of 9.81 m/s^2?)

No. The formula for acceleration due to gravity is mass one times mass two, divided by the square of the distance between the two, all multiplied by the gravitational constant. The reason that you have -9.8 m/s^2 as the acceleration due to gravity is that the equation produces the same general number when looking at small masses (read: non-planetary) if they are close to a planet. The acceleration due to gravity in the second example is 3.24233×10^30 m/s^2.

Zafuel

2007-07-10, 10:04 AM

I'm afraid he's right. It's a very famous problem, and caused a lot of respectable mathematicians to look like idiots. I'll show you by means of a flow-chart.

Choice 1----------If You Stick-------- If You Change

Prize door-------- Prize----------------- Nothing

Trick door-------- Nothing--------------- Prize

Trick door-------- Nothing-------------- Prize

Two times out of three you win if you change, when compared to only once if you do not.

The confusion arises because it's assumed that the second choice is independent from the first. The host has set up the second choice to ensure that switching will [i]always[/i] change the result. If the result inverts, then so does the probability. Hence, the result.

Attilargh

2007-07-10, 10:05 AM

Hey, good catch. The two planetary-scale masses would pull at each other, which would result in a "higher" acceleration on part of Mercury-in-a-can as observed from Earth.

But really, can-Mercury would still fall at the rate of 9.81 m/s². It's just that Earth would be simultaneously going up at the rate of 3.701 m/s².

Or something. I don't know, I'm still learning this stuff.

Ikkitosen

2007-07-10, 10:07 AM

MeklorIlavator has it right with F = -G M(1) M(2) divided by r squared. 9.81 is the acceleration due to gravity at some average definition of the earth's surface, and assumes the earth doesn't move.

EDIT: This has been tested (the variation in g with height, not a can the mass of Mercury). You take a pendulum, whose period is 2 pi root(length/g) and measure its period at sea level and up Everest - they're measurably different.
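As a rough illustration of that pendulum test (sketched by the editor; the g values below are approximate reference figures, not measurements from the thread), the period T = 2π√(L/g) grows measurably as g drops between sea level and Everest altitude:

```python
import math

def period(length_m, g):
    """Period of a simple pendulum: T = 2*pi*sqrt(L/g)."""
    return 2 * math.pi * math.sqrt(length_m / g)

g_sea = 9.81      # m/s^2, approximate sea-level value
g_everest = 9.77  # m/s^2, approximate value near Everest's summit

t_sea = period(1.0, g_sea)
t_everest = period(1.0, g_everest)
print(t_sea, t_everest)  # ~2.006 s vs ~2.010 s: a small but measurable gap
```

A few milliseconds per swing accumulates quickly over many swings, which is why pendulum timing was a practical way to map gravity variation.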

valadil

2007-07-10, 10:10 AM

Here's another stab at explaining the Monty Hall problem:

Let's look at it from each algorithm. Either, we always switch or we never switch.

Under never switch, you have a 1/3 chance of being right the first time. I don't think this is disputable. You're picking one of three things and the fact that an incorrect answer gets removed doesn't change your choice so it doesn't matter.

Under always switch, a bad answer gets removed after you pick. So if you get :smallmad: first, the other :smallmad: goes away and you switch to the correct answer. The only way to get the answer wrong is to guess :smallsmile: first. Then a bad answer remains after one of them goes away. You have a 1/3 chance of landing on the good prize on your first try and losing the game. Therefore with this method you have a 2/3 chance to win.

The trick is that this looks like a probability question that gets complex as time goes on. It's not. It's a logic puzzle. You just have to look at what happens if you guess right or wrong under never switch and always switch. Always switch lets you win if you guess one of the two bad doors, but never switch only wins if you guess the single good door.
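valadil's two algorithms can also be checked by exhaustive enumeration rather than probability estimates; a short illustrative sketch counting every equally likely case:

```python
from itertools import product

wins_stay = wins_switch = 0
cases = 0
# Every combination of prize door and first guess is equally likely.
for prize, guess in product(range(3), repeat=2):
    cases += 1
    if guess == prize:
        # Never switch: you win only if your first guess was the prize.
        wins_stay += 1
    else:
        # Always switch: the host removes a losing door, so switching
        # wins exactly when your first guess was NOT the prize.
        wins_switch += 1

print(wins_stay, "/", cases)    # 3 / 9  -> 1/3
print(wins_switch, "/", cases)  # 6 / 9  -> 2/3
```

Nine cases, no randomness: never-switch wins 3 of them, always-switch wins the other 6.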

douglas

2007-07-10, 10:14 AM

The formula for acceleration due to gravity is mass one times mass two, divided by the square of the distance between the two, all multiplied by the gravitational constant.

No, that's the formula for force. To get acceleration you have to divide out the mass of the object you're calculating for, which leaves an expression that does not involve that mass at all, because it appears on both the top and bottom of the quotient and cancels out.

That would seem to indicate that Galileo's experiment is, strictly speaking, incorrect.

If you count the speed of falling as the rate the distance between the object and the Earth changes, then yes Galileo's experiment is technically incorrect. The difference due to the Earth falling towards the object is, however, negligible for practically any object that would reasonably be used in such an experiment on Earth and disappears entirely if you somehow hold the Earth motionless.
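The force-versus-acceleration distinction can be made concrete with Newton's law of gravitation: each body accelerates at G·m_other/r², so the closing acceleration between the two is G(M + m)/r². The falling object's mass cancels out of its own acceleration but still shows up in how fast the gap shrinks. An illustrative sketch (masses and radius are standard approximate values):

```python
G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24    # kg
M_MERCURY = 3.301e23  # kg (the planet)
R_EARTH = 6.371e6     # m; a 1 m drop height is negligible next to this

def closing_acceleration(m_falling):
    """Rate at which the gap between Earth and the falling object shrinks."""
    r = R_EARTH
    a_object = G * M_EARTH / r**2   # ~9.8 m/s^2, independent of m_falling
    a_earth = G * m_falling / r**2  # Earth's own fall toward the object
    return a_object + a_earth

print(closing_acceleration(0.015))      # ordinary can: ~9.82 m/s^2
print(closing_acceleration(M_MERCURY))  # Mercury-mass can: ~10.36 m/s^2
```

For the 15-gram can, Earth's contribution is about 10^-26 m/s², far below anything measurable, which is why Galileo's result holds in practice.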

Azrael

2007-07-10, 10:24 AM

If anyone has ANY question about the validity of the switch in the Monty Hall Problem, go read the associated Wikipedia entry. (http://en.wikipedia.org/wiki/Monty_hall_problem)

Then, if you wish, follow the cited links, Google "Monty Hall Problem" and bask in the great quantity of formal, informal and visual proofs showing the counter intuitive switch does increase your odds of winning.

Then, finally, realize this: You are not smarter than the entire mathematics & statistics community. You have not found some great societal misunderstanding. You are wrong.

The odds switch is FACT. Not subjective interpretation.

Indon

2007-07-10, 10:31 AM

In the event that you've been reading this thread and your mind has not been sufficiently blown yet, I give you Four-dimensional cubes (http://en.wikipedia.org/wiki/Tesseract).

Have fun.

Khantalas

2007-07-10, 10:33 AM

Of course, I have difficulty thinking in three dimensions. Four just blows my mind.

Damn you, hyperobjects!

KoDT69

2007-07-10, 10:35 AM

Speaking of options, I see 2:

2. You are messing with us; you know the answer is as we have presented and are just having us all on :smallamused:

Yep, you got me. I'm an Electronic Engineer, so math is a strong suit. This is an example of why a pictorial diagram is not always the best option. Math rules! :smallwink:

Let's look at it from each algorithm. Either, we always switch or we never switch.

This is the mathematical principle I was waiting for. :smallbiggrin:

valadil

2007-07-10, 10:41 AM

This is the mathematical principle I was waiting for. :smallbiggrin:

*shrug* I'm a programmer.

Attilargh

2007-07-10, 10:47 AM

Yep, you got me. I'm an Electronic Engineer making math a strong suit. This is an example of why a pictorial diagram is not always the best option. Math rules! :smallwink:

And this is why I generally dislike people: They make me look foolish. :smalltongue:

Ikkitosen

2007-07-10, 10:54 AM

Trust an engineer to know how to make friends.

*ducks thrown objects* :smallamused:

KoDT69

2007-07-10, 11:29 AM

Well, that banter was really an exemplification of how the school systems teach math to students. Sometimes they either over-simplify a situation, or try to teach one method that relies on another that the students have not been taught yet.

Attilargh - I wasn't trying to make you look foolish, I was merely playing devil's advocate in the matter.

As a practical exercise in this topic I went and quizzed some of my co-workers on the 2 views, and most were just confused :smallconfused: You can work out the always/never switch methods on paper real quick to verify. I had to show them my drawing :smallyuk: Sad thing is some of them have higher degrees than I do but are so long out of school they just forget the math stuff for some reason.

EDIT - Yes I'm at work. When it's not busy here, I hit the playground! Sad as it is I spend a lot of time on here lately whilst on the job :smallwink:

Attilargh

2007-07-10, 12:08 PM

Attilargh - I wasn't trying to make you look foolish, I was merely playing devil's advocate in the matter.

Well, that was probably just in my own eyes. But it's alright, it happens to me all the time. :smallcool:

Pyrian

2007-07-10, 12:58 PM

No, the point is that labelling the pearls can't change anything relevant to the original question.

That is correct, but it has absolutely nothing to do with my point (which BTW was referring to specifying how the first white pebble was chosen, as I commented on in my previous post). I find your incorrect assumption about my criticism of unstated assumptions rather amusing.

Don't try to be a smart-ass when you don't know what you're talking about.

Don't try to be a smart-ass when you're not paying attention to what other people are saying:

...if the pebble were picked randomly and turned out to be white, or if a random white pebble were picked, then yes, there's a 2/3 chance the other pebble in same vase is white. On the other hand, if a white pebble was selected from a random white-pebble-containing vase, then the odds are 50%.
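The distinction being drawn here — exactly how the white pearl was selected — can be made concrete with a quick simulation. An illustrative sketch (the procedure names are the editor's, not the thread's): drawing a uniformly random pearl and conditioning on white gives 2/3, while picking a random white-containing vase first gives 1/2.

```python
import random

# The three vases from the riddle.
vases = [("white", "white"), ("white", "black"), ("black", "black")]

def draw_random_pearl():
    """Uniform over all six pearls; return the OTHER pearl in that vase,
    or None if the drawn pearl wasn't white (trial discarded)."""
    vase = random.choice(vases)
    i = random.randrange(2)
    if vase[i] != "white":
        return None
    return vase[1 - i]

def draw_from_white_vase():
    """Pick a random vase that contains a white pearl, take a white pearl
    from it, and return the other pearl in that vase."""
    vase = random.choice(vases[:2])  # only the vases holding white pearls
    return "white" if vase == ("white", "white") else "black"

def rate_other_white(procedure, trials=100_000):
    results = [procedure() for _ in range(trials)]
    kept = [r for r in results if r is not None]
    return sum(r == "white" for r in kept) / len(kept)

print(rate_other_white(draw_random_pearl))     # close to 2/3
print(rate_other_white(draw_from_white_vase))  # close to 1/2
```

Both procedures end with "a white pearl in hand," yet the conditional probabilities differ — the unstated sampling assumption really does change the answer.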

Telonius

2007-07-10, 01:15 PM

Simple explanation for the door-switching: making the switch is the same thing as betting that you were wrong in the first guess. You only had a 1/3 chance of picking the right door the first time.

The additional information you get (the location of one of the raptors) isn't really useful information. If you'd known which door the raptor was behind before you picked, it really would be a 50/50 shot. But since you didn't know until after you'd picked, it has no bearing on your current situation.

Atreyu the Masked LLama

2007-07-10, 01:26 PM

awwww....i thought this thread was going to be about interesting mathematical word play equations.

Aphrodite is love

love is blind

Ray Charles is blind

Ray Charles is Aphrodite?

Things of that nature.

Ah well, I suppose I'll have to settle for actual mathematics.

Hades' Watchdog

2007-07-10, 02:11 PM

Here's a problem: we have a coin. The coin is biased either towards heads or towards tails (we don't know which). I flip the coin, and based on how that flip turns out, I guess which way the coin is biased.

Then you flip the coin, and based on the results of both your flip and my flip, you guess which way the coin is biased. If our guesses are the same, we forget everything and repeat the game. If they're different, a genie appears or something and tells us which way the coin is biased.

And another: we were playing catch with an asbestos gerbil ball, and the ball ended up sinking into a lake of lava. To determine which of us has to swim down to get it, I offer this game. I hold all four ends of two pieces of string in my hands. You pick two ends and pull. If you get both ends of the same piece of string, I have to retrieve the gerbil. If, on the other hand, you end up holding both pieces of string, you have to go swimming.

Are these games fair? If they aren't, who's more likely to win, and by how much?

Silkenfist

2007-07-10, 02:58 PM

Game 1 is perfectly fair. You go first and guess that the coin is biased towards the side that came up. When I flip the coin and get the second result, there are two possibilities:

A: Each side came up once. I can contradict, but the probability of either side being the biased one is equal. So there is no advantage for me.

B: One side came up twice. Obviously it would be the better bet to choose that side. However, you already chose it, and agreeing with your guess will just make the game continue.

There is no possible way for me to give myself an advantage, and I can continue the game endlessly without allowing myself to be put at a disadvantage. The game is fair.

Game 2 is not fair. Let's assume I pick two ends sequentially (it doesn't alter the probabilities, but it makes the game more transparent). I pick the first end, then look at the ends that remain in your hand. Three remain, but only one of them leads to me winning the game. Two options lead to me losing. The game is not fair at all.
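Silkenfist's count can be confirmed by enumerating every pair of ends; an illustrative sketch, labelling the four string ends 0-3 so that 0 and 1 belong to one string and 2 and 3 to the other:

```python
from itertools import combinations

# Ends 0,1 are one string; ends 2,3 are the other.
same_string = {frozenset((0, 1)), frozenset((2, 3))}

picks = list(combinations(range(4), 2))  # all 6 equally likely pairs of ends
wins = sum(frozenset(p) in same_string for p in picks)

print(wins, "/", len(picks))  # 2 / 6: only a 1/3 chance of a matching string
```

Two winning pairs out of six, so the picker loses the swim-in-lava bet 2/3 of the time.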

Reinboom

2007-07-10, 03:48 PM

mmmn, doing the "test" thing, though, not by hand -- too slow and too narrow results. Instead, using php to do it for me 10,000 times over.

<html><head><title>Monty Hall</title></head><body><?php

$ss=array(0,0);//Switch, Stay

for($a=0;$a<10000;$a++){

if(rand(0,2)==2){// Chose correct door right off. Stay and you're right, switch and you're wrong.

$ss[1]+=1;

}

else{// Choose wrong door right off, switching means you choose one of the other two doors, either a correct or a wrong one. Staying means you're wrong.

$ss[0]+=rand(0,1);

}

}

echo "Switch: ".$ss[0]." out of 10000 (".($ss[0]/100)."%)<br>Stay: ".$ss[1]." out of 10000 (".($ss[1]/100)."%)";

?></body></html>

http://pifro.com/tempmove/mh.php

it consistently comes up ~33% for both. (The rest is the times you lose.)

I see no benefit in choosing switching over staying. Perhaps my logic is off for the program? Could someone write a program to do the test that shows it logically producing higher results for switching?

Hades' Watchdog

2007-07-10, 04:20 PM

I'm unfamiliar with php, so I wrote a program in Matlab (which, in my opinion, has much better formatting; I also included plenty of comments).

function Threedoors

wins = 0;

rounds = 0;

% Because we haven't played yet.

for x = 1:100000

%"ceil" rounds up. "rand" is a random number between 0 and 1, generated anew

% each call.

PrizeDoor = ceil(rand*3);

% Now we have a prize behind one of the doors.

FirstGuess = ceil(rand*3);

% That's our initial guess. Now Monty opens one of the doors with a

% velociraptor.

for m = 1:3

if not( m==PrizeDoor ) & not( m==FirstGuess )

OpenDoor = m;

end

end

% All right. So what that did is pick the highest number that isn't the one

% we guessed OR the one with the prize and set the variable "OpenDoor" to

% it.

for n = 1:3

if not( n==FirstGuess ) & not( n==OpenDoor )

if n==PrizeDoor

wins = wins + 1;

rounds = rounds + 1;

else

rounds = rounds + 1;

end

end

end

% That loop found the door that isn't open AND isn't the one we initially

% picked (because, since we're switching, we switch to it). Then, if it has

% the prize behind it, it increments "wins" by one and "rounds" by one.

% Otherwise, it only increases "rounds".

end

display(wins/rounds)

It keeps giving me ~.666.

Oh, and in case it's unclear, "==" means "compare," while "=" means "set this variable." So you use "==" in "if" statements, and "=" if you want to change a value.

adanedhel9

2007-07-10, 05:37 PM

mmmn, doing the "test" thing, though, not by hand -- too slow and too narrow results. Instead, using php to do it for me 10,000 times over.

it consistently comes up ~33% for both. (The rest is the times you lose.)

I see no benefit in choosing switching over staying. Perhaps my logic is off for the program? Could someone write a program to do the test that shows it logically producing higher results for switching?

I don't know php, but as I understand it the basic logic of your program runs something like this:

If the contestant chose correctly to begin with, then switching is never advantageous.

If the contestant chose incorrectly, then switching will be advantageous 50% of the time.

Am I right? In that case, I see the problem. If the contestant chose incorrectly, then switching is always advantageous. I think you missed the assumption that the emcee always opens a door containing a raptor which the contestant did not choose. This ensures that if the contestant chose wrongly to begin with, then the other unopened door must have the prize.

The Great Skenardo

2007-07-10, 05:44 PM

awwww....i thought this thread was going to be about interesting mathematical word play equations.

Aphrodite is love

love is blind

Ray Charles is blind

Ray Charles is Aphrodite?

Things of that nature.

Ah well, I suppose I'll have to settle for actual mathematics.

Hm. A bit of reduction:

A = Aphrodite

B = Blind

L = Love

R = Ray Charles

xiy = x is y

We want to prove: RiA

1. AiL (Given)

2. LiB (Given)

3. RiB (Given)

We see that this only works if "is" is symmetric and transitive. That is,

AiB = BiA

and

AiB + BiC = AiC

The question becomes,

if an apple is red,

are all red things apples?

So, to quote a well-known politician, "It depends on what your definition of 'is' is." :smallamused:

Silkenfist

2007-07-10, 06:06 PM

I have another easy one for you: You just have to calculate stuff.

I will offer you a game. I have five tokens. Two yellow ones and three red ones. I put them in a darkened box and let you draw a token. If the token is yellow, I give you one dollar. If the token is red, you give me a dollar. Afterwards the token is thrown away and you are given the option to draw again. This continues until there is no token left or you choose to stop.

Is the game fair? If not, to whom is it unfair?

douglas

2007-07-10, 07:12 PM

Surprisingly, the game is in fact fair. I calculated the entire game tree: at no point is stopping superior to continuing if a yellow token remains; as long as you stop after drawing the second yellow token, the average outcome is no gain or loss.

mikoto

2007-07-10, 07:22 PM

awwww....i thought this thread was going to be about interesting mathematical word play equations.

Aphrodite is love

love is blind

Ray Charles is blind

Ray Charles is Aphrodite?

Things of that nature.

Ah well, I suppose I'll have to settle for actual mathematics.

you mean something like this

http://www.randomjoke.com/topic/nerd.php?29025

adanedhel9

2007-07-10, 07:30 PM

Surprisingly, the game is in fact fair. I calculated the entire game tree: at no point is stopping superior to continuing if a yellow token remains; as long as you stop after drawing the second yellow token, the average outcome is no gain or loss.

Not true; the odds run against you if you continue past certain points. By stopping at the right times, you gain the advantage:

40% of the time, I draw yellow (2/5), and stop (I earn a dollar).

30% of the time, I draw red (3/5), then yellow (2/4), and stop (I break even).

10% of the time, I draw red (3/5), red (2/4), yellow (2/3), yellow (1/2), and stop (I break even).

10% of the time, I draw red (3/5), red (2/4), yellow (2/3), red (1/2), yellow (1/1) (I lose a dollar).

10% of the time, I draw red (3/5), red (2/4), red (1/3), yellow (2/2), yellow (1/1) (I lose a dollar).

Which gives me an average of $0.20 per game.
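adanedhel9's case-by-case figure matches a small optimal-stopping recursion; an illustrative sketch, where the value of a state is the larger of stopping (worth nothing further) and the expected value of drawing:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def value(yellow, red):
    """Expected profit under optimal stopping, given the tokens remaining."""
    if yellow == 0:
        return 0.0  # only losing tokens left: stop
    total = yellow + red
    draw = (yellow / total) * (1 + value(yellow - 1, red))
    if red:
        draw += (red / total) * (-1 + value(yellow, red - 1))
    return max(0.0, draw)  # stopping is always worth exactly 0 more

print(value(2, 3))  # ~0.2 -- the game favors the player by about $0.20
```

The recursion confirms the hand enumeration: with two yellow and three red tokens, the player nets an expected twenty cents per game, so the game is unfair to the dealer.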

Pyrian

2007-07-10, 07:50 PM

20% of the time, I draw red (3/5), red (2/4), yellow (2/3), and stop (I lose a dollar).

That's a bad stop. At that point there's one red and one yellow left, so the worst you can do by continuing is break even. So, instead you make another draw at 50/50: if it's yellow, you erase your loss and then stop (breaking even), and if it's red you draw the last yellow putting you back at losing a dollar.

adanedhel9

2007-07-10, 08:39 PM

That's a bad stop. At that point there's one red and one yellow left, so the worst you can do by continuing is break even. So, instead you make another draw at 50/50: if it's yellow, you erase your loss and then stop (breaking even), and if it's red you draw the last yellow putting you back at losing a dollar.

You're right. I'm not sure why I didn't expand on that branch. I'll edit that in.

Starla

2007-07-10, 11:34 PM

Ahhh...

OK. I'll write something. It's 1.53 in the morning here, but OK.

Riddle:

You have three vases: one vase containing two white pearls, one vase containing one white and one black pearl, and one vase containing two black pearls. From one of these vases, a pearl is taken. This pearl turns out to be white. What is the probability that the other pearl in the same vase is also white?

The answer:There are three pearls that can be the white pearl that was taken from the chosen vase:

* The first pearl from the vase with two white pearls: in this case, the other pearl is also white.

* The second pearl from the vase with two white pearls: in this case, the other pearl is also white.

* The white pearl from the vase with one white and one black pearl: in this case, the other pearl is black.

The probability that the other pearl in the same vase is also white, is therefore 2/3.

Oh Thor, please let me be not simu'd on this one....

I'll contradict once again. The chance is 2/3. How so?

Let's alter the situation slightly to make it more transparent. I am writing letters on the pearls, naming the white ones A, B and C and the black ones X, Y and Z. Then, I put on a blindfold, shuffle them around and place them in the vases as in the pattern written above.

Now I shuffle the vases around and draw one pearl from any of them. I remove the blindfold and see: It is the white pearl A. Let's have a look at the possible options now. There are three possibilities for me to distribute two white pearls in one vase:

1/3 chance of A&B being in one vase, C in the other one.

1/3 chance of A&C being in one vase, B in the other one.

1/3 chance of B&C being in one vase, A in the other one.

I have picked A, so in the first two cases I will draw another white pearl. Only in the third case does A share a vase with a black pearl. Thus, the probability of drawing a white pearl again is 2/3.

The original poster's explanation in the spoiler makes more sense, but you both agree, and that makes you right. I was thinking in terms of 2 vases instead of 3 stones.

Here is an easy one for you all:

There are 12 brown socks and 12 black socks in your drawer. They are not folded together and it is dark and you don't want to turn on the light and wake up your spouse to get a pair. How many socks will you have to remove to make a pair?

dungeon_munky

2007-07-11, 12:52 AM

This means you need to make assumptions. You need to remove two socks to make a pair. Unless you want them to match, in which case the answer is three.

Unless you don't mean a pair of socks, and mean a pair of something else, say, you and your spouse, in which case you don't need to remove any.

Or alternatively, you don't need to remove any to make a pair because you already have 12 in your drawer. Or you need to remove 22 socks to leave a pair in the drawer.

I should stop over thinking this.

The Great Skenardo

2007-07-11, 06:45 AM

The original poster's explanation in the spoiler makes more sense, but you both agree, so you're both right. I was thinking in terms of 2 vases instead of 3 stones.

Here is an easy one for you all:

There are 12 brown socks and 12 black socks in your drawer. They are not folded together, it is dark, and you don't want to turn on the light and wake your spouse. How many socks will you have to remove to make a pair?

Three.

Course, until you turn the light on, you won't know which one's the odd sock out.
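The pigeonhole argument behind "three" can be brute-forced in a few lines of Python (a sketch of mine, not anything from the thread): find the smallest number of draws such that every possible color sequence is forced to contain a matching pair.

```python
from itertools import product

def worst_case_draws(colors=("brown", "black")):
    # Smallest n such that EVERY length-n sequence of sock colors
    # contains a repeat - the pigeonhole principle in action.
    n = 1
    while any(len(set(draw)) == len(draw) for draw in product(colors, repeat=n)):
        n += 1
    return n

print(worst_case_draws())  # 3
```

With two colors you can get one brown and one black on the first two draws, but the third draw must match one of them; with k colors the same function returns k + 1.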

Reinboom

2007-07-11, 07:04 AM

I don't know php, but as I understand it the basic logic of your programs runs something like this:

If the contestant chose correctly to begin with, then switching is never advantageous.

If the contestant chose incorrectly, then switching will be advantageous 50% of the time.

Am I right? In that case, I see the problem. If the contestant chose incorrectly, then switching is always advantageous. I think you missed the assumption that the emcee always opens a door containing a raptor which the contestant did not choose. This ensures that if the contestant chose wrongly to begin with, then the other unopened door must have the prize.

Actually, yes, I see the mistake in that code.

<html><head><title>Monty Hall</title></head><body><?php
$ss = array(0, 0); // [switch wins, stay wins]
for ($a = 0; $a < 10000; $a++) {
    if (rand(0, 2) == 2) {
        // Chose the correct door right off: staying wins, switching loses.
        $ss[1] += 1;
    } else {
        // Chose a wrong door right off: the remaining wrong door is open and
        // would never be chosen, so switching is automatically correct.
        // Staying means you're wrong.
        $ss[0] += 1;
    }
}
echo "Switch: ".$ss[0]." out of 10000 (".($ss[0]/100)."%)<br>Stay: ".$ss[1]." out of 10000 (".($ss[1]/100)."%)";
?></body></html>

I now understand it. There is only a 1/3 chance of the original choice landing on the correct door, which is the only case where switching is wrong. So the first branch (chose the correct door right off: stay and you're right, switch and you're wrong) only occurs 1/3 of the time, and in that branch switching or staying is definitely right or wrong.

The second branch (chose a wrong door right off: the host has opened the other wrong door, so switching is automatically correct and staying is wrong) occurs the other 2/3 of the time.

modified:

http://pifro.com/tempmove/mh.php

There's the test; you can keep refreshing the page and it consistently comes up around 66.6%. (Each run tests it 10,000 times.)
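The PHP above uses the shortcut "wrong first pick means switching wins". For anyone who wants to see the host's reveal modeled explicitly, here is a rough Python equivalent (my own sketch, not Reinboom's test page):

```python
import random

def monty_trial(rng, switch):
    doors = [0, 1, 2]
    prize = rng.choice(doors)
    pick = rng.choice(doors)
    # The host always opens a raptor door the contestant did not pick.
    opened = rng.choice([d for d in doors if d != pick and d != prize])
    if switch:
        # Switch to the one remaining closed door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == prize

def win_rate(switch, trials=10_000, seed=0):
    rng = random.Random(seed)
    return sum(monty_trial(rng, switch) for _ in range(trials)) / trials

print(win_rate(switch=True), win_rate(switch=False))  # roughly 0.66 vs 0.33
```

Modeling the reveal step by step gives the same ~66.6% for switching, which confirms the shortcut logic is equivalent.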

Telonius

2007-07-11, 02:32 PM

Another math puzzle for you all. You go to your local butcher, wanting to buy some beef. He has a sign up: $7.00/lb if he chooses the cut, $10.50/lb if you choose it. There are 20 pounds available. You know that the butcher will give you a poorer cut if you let him choose, but you also want to get the best deal for your money. What is the optimal solution in this situation?

Indon

2007-07-11, 02:38 PM

Another math puzzle for you all. You go to your local butcher, wanting to buy some beef. He has a sign up: $7.00/lb if he chooses the cut, $10.50/lb if you choose it. There are 20 pounds available. You know that the butcher will give you a poorer cut if you let him choose, but you also want to get the best deal for your money. What is the optimal solution in this situation?

Buy 20 lbs of meat, he can choose the cuts! :P

But seriously, I imagine it'd depend on how many pounds of meat you wanted. If you had no upper limit, getting all 20 lbs at his cut choice would be best.

Hades' Watchdog

2007-07-11, 02:52 PM

If there's one pound of rotting, practically liquid meat and 19 pounds of delicious-looking meat, then an extra $3.50/lb would probably be better. But since I don't know what cost you're associating with meat quality, I believe that I need more information.

BugFix

2007-07-11, 03:21 PM

Another math puzzle for you all. You go to your local butcher, wanting to buy some beef. He has a sign up: $7.00/lb if he chooses the cut, $10.50/lb if you choose it. There are 20 pounds available. You know that the butcher will give you a poorer cut if you let him choose, but you also want to get the best deal for your money. What is the optimal solution in this situation?

This problem is underspecified. As Indon points out, handing him $140 will get you the whole 20 pounds at the cheap rate, so clearly anything more than 13 1/3 pounds isn't worth buying at the higher rate.

But more to the point, what's the definition of "optimal" here? How do you quantify quality differences (i.e., is twice as much meat at half the quality the same value)? How is the quality distributed through the lot? If all the meat is of identical quality, then obviously it doesn't matter who chooses. If there are 19 pounds of scrap and 1 pound of "Über-Kobe", then clearly you want to pick that one pound yourself.

Basically, it seems like you're missing a piece here.
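The $140 / 13 1/3 lb breakeven is just two lines of arithmetic; here's the sketch for anyone who wants to check it:

```python
CHEAP, PREMIUM, TOTAL_LBS = 7.00, 10.50, 20

whole_lot_cost = CHEAP * TOTAL_LBS        # $140 buys all 20 lbs, butcher's choice
breakeven_lbs = whole_lot_cost / PREMIUM  # what the same $140 buys at the premium rate

print(whole_lot_cost, round(breakeven_lbs, 2))  # 140.0 13.33
```

So any plan that involves buying more than 13 1/3 lbs at $10.50/lb is strictly worse than buying the entire lot at $7.00/lb, regardless of how quality is weighted.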

Jimmy Discordia

2007-07-11, 03:47 PM

Ooooh, I love the Monty Hall problem. I don't have anything meaningful to add to this, except that it's so much fun to pull on people who've never heard it before. We argued this for one full hour and part of a second one in my Stat class (I've only ever taken one), until we finally ran a simulation to show that you're better off switching.

For the record, I was firmly in the "50/50" camp until I saw the simulation. What can I say, I'm a literary genius, not a mathematical one.

I have a whole host of questions that people nearly always get wrong when they rely on intuition. Since most of them don't actually rely on probability (or do so, but would take too long to explain), I'll leave them out of this thread... but training in psychology with a strong focus on decision theory provides one with an impressive list of trick questions to ask the unwary. Maybe we need another thread for heuristics and biases. I'm pretty sure hilarity would ensue.

Fun fact: any time an experimental psychologist asks you a question (especially an either/or question), the right answer is most likely to be the one that seems intuitively wrong to you. I dubbed this the "never trust a psychologist" heuristic.

Silkenfist

2007-07-11, 04:07 PM

Another nice one:

Your doctor runs a screening for a terminal illness. He analyses your blood sample and tells you that you have tested positive for a rare lethal disease that only 0.5% of the population has - one which would be incurable and lethal within a short time. You ask him whether the test is reliable, and he tells you it is fairly safe: healthy subjects are wrongly tested positive in only 5% of tests. You are not sure what to make of it.

Question 1: Which figure do you still need to calculate your chances of not being infected?

Figure in the Spoilers to continue the calculation.

5% error rate

Question 2: What ARE your chances of being healthy?

Indon

2007-07-11, 04:25 PM

Without actually doing the math (math is hard!), I'd stab at about 91% chance of not being sick despite testing positive.

dungeon_munky

2007-07-11, 04:25 PM

There is a 91.28% chance that you are healthy but tested positive. Done by dividing the probability of being healthy and testing positive (0.995 × 0.05) by the total probability of testing positive, (0.995 × 0.05) + (0.005 × 0.95).

EDIT=Good guess there!

Indon

2007-07-11, 04:49 PM

EDIT=Good guess there!

I approximated it as 1/11th: 5% false positives + 0.5% sick, ignoring the overlap, gives about a 9% chance of being sick.

Starla

2007-07-12, 12:56 AM

This means you need to make assumptions. You need to remove two socks to make a pair. Unless you want them to match, in which case the answer is three.

Unless you don't mean a pair of socks, and mean a pair of something else, say, you and your spouse, in which case you don't need to remove any.

Or alternatively, you don't need to remove any to make a pair because you already have 12 pairs in your drawer. Or you need to remove 22 socks to leave a pair in the drawer.

I should stop overthinking this.

Chuckles. Okay, I was in a hurry when I wrote it. But really the case is you take 3 and go into the other room to see which ones match. :smalltongue:

By the way, I had another fun thing - a math problem I drew in Corel Painter 3, but I could not save it to Photobucket. Any suggestions as to what I should do? I wanted to show you all, but it would be hard to put in HTML format, hence the drawing.

mikoto

2007-07-12, 08:12 AM

What is preventing it from being on Photobucket?

If you have a scanner, you can print it off and scan it; I believe you can put scanned images on Photobucket.

Hades' Watchdog

2007-07-12, 09:58 AM

This may ruin Silkenfist's problem if you haven't done it yet.

Question 1: Which figure do you still need to calculate your chances of not being infected?

Figure in the Spoilers to continue the calculation.

5% error rate

...healthy subjects being tested positive wrongfully only in 5% of the tests.

I don't get it; the error rate was already given outside of the spoiler.

Telonius

2007-07-12, 10:21 AM

Buy 20 lbs of meat, he can choose the cuts! :P

But seriously, I imagine it'd depend on how many pounds of meat you wanted. If you had no upper limit, getting all 20 lbs at his cut choice would be best.

Aww, somebody heard this riddle before. :smallbiggrin:

Hades' Watchdog

2007-07-18, 10:15 PM

Vaguely mathy and definitely quirky. I learned something the other day.

So, you take a cone (a full cone, not a half-cone; if you don't know the difference, sorry, but you're not my target audience. For that matter, skip this paragraph and the next two if you know what a hyperboloid of one sheet is.). Let's say that it opens upwards and downwards. Now we cut it with a vertical plane. The intersection is a shape called a hyperbola. It has many neat properties, but none of them are important for this. It looks like this:

http://speeze.pearson.googlepages.com/HyperbolaGITPPic.gif

So, now we spin it around a horizontal line that goes through its center.

http://speeze.pearson.googlepages.com/Hyperboloid1.gif

Forgive the size; I was going to make it my avatar at one point. Anyway, you get a shape like that. Like a donut that never quite reaches a slope of 45 degrees and so goes on forever. This is called a hyperboloid of one sheet.

I'm almost done. So, you have this hourglass-shaped thing. "Now, what's cool about all this stupid topology stuff?" you're probably asking. Well, here's the punch line. There are straight lines on this hyperboloid. There are lines that, at every point, are on the hourglass.

Crazy, huh?

http://speeze.pearson.googlepages.com/HyperWithLineGITP.gif
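The "straight lines on an hourglass" fact can be verified numerically. The hyperboloid of one sheet x² + y² − z² = 1 is a doubly ruled surface, and one family of ruling lines has the parametrization below (standard, but the function names are mine):

```python
import math

def line_point(theta, t):
    # A point at parameter t on the ruling line through (cos θ, sin θ, 0).
    # The line is straight: its coordinates are linear in t.
    return (math.cos(theta) - t * math.sin(theta),
            math.sin(theta) + t * math.cos(theta),
            t)

def on_hyperboloid(p, tol=1e-9):
    x, y, z = p
    return abs(x * x + y * y - z * z - 1.0) < tol
```

The algebra works out because x² + y² = 1 + t² along the line, so x² + y² − z² = 1 for every t: each straight line stays on the surface forever, even though the surface itself is curved.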

Powered by vBulletin® Copyright © 2014 vBulletin Solutions, Inc. All rights reserved.