A Riddle from 538
March 25, 2016 7:41 AM

 
I've worked this problem out before. It is actually a lot simpler than it appears at first. The answer is "No, because gambling is a sin and if you do it then you will go to hell."
posted by ND¢ at 7:46 AM on March 25, 2016 [5 favorites]


Greg Nog YOU SON OF A BITCH!!!!!!!!!!!
posted by ND¢ at 7:46 AM on March 25, 2016 [58 favorites]


Heh, I just submitted my answer to this. The FiveThirtyEight Riddler is always a highlight of my Friday.
posted by Johnny Assay at 7:47 AM on March 25, 2016


ND¢, as you see, brevity is key.
posted by Pendragon at 7:50 AM on March 25, 2016 [2 favorites]


Since the game is in a casino, you will ultimately lose. You should not play it to win money, but spending a limited amount of money you can afford to lose might be worth it for entertainment and excitement, although you'll be supporting an industry that preys on the ignorant and the poor.

That's my answer, as your dad.
posted by bondcliff at 7:55 AM on March 25, 2016 [22 favorites]


No, because the expected value of a casino game is always less than the buy-in. Otherwise they wouldn't offer it.
posted by Holy Zarquon's Singing Fish at 7:55 AM on March 25, 2016 [9 favorites]


Well, I can just fire up Matlab and run a Monte Carlo simulation of -

Well, I’m going to pull rank and ask that you don’t bring laptops to this Riddler.

Ah. Well, I guess I can go grab a few reams of paper from the supply closet instead.
posted by backseatpilot at 7:59 AM on March 25, 2016 [2 favorites]


"Greater than 1."

So if the total is exactly 1, do they draw again or not?
posted by Naberius at 8:03 AM on March 25, 2016 [3 favorites]


Betteridge's law of headlines works for the headline, but the actual question is:

Specifically, what is the expected value of your winnings?

I guess the fact that I went "exactly?" when they gave examples like 0.4 and 0.7 might reveal that I'm a software engineer and not much of a statistician.
posted by effbot at 8:04 AM on March 25, 2016 [4 favorites]


I haven't seen these 538 Riddler puzzles before... they're great! But migosh, are they ever hard!
posted by painquale at 8:21 AM on March 25, 2016


(and just to confirm, I did a simulation to verify that my initial analysis, "the answer is obviously X, eh, probably not since I'm not much of a statistician", was 100% correct. But now that I see the answer, it is kind of obvious, though, so I guess my simulation has a bug...)
posted by effbot at 8:21 AM on March 25, 2016


So if the total is exactly 1, do they draw again or not?

Since the values are uniformly distributed between 0 and 1, the probability of the total being exactly 1 after any given step is infinitesimal, so the answers to the questions posed are the same regardless of whether the house draws again after getting exactly 1.

(Is it kosher to post proposed answers here, or would that be considered a spoiler? I think I've worked it out.)
posted by DevilsAdvocate at 8:24 AM on March 25, 2016 [2 favorites]


I'd be interested to see proposed answers. I roughed out a solution that kind of makes sense to me, but I'd like to see other people's methodologies (before embarrassing myself by sharing mine).
posted by sparklemotion at 8:39 AM on March 25, 2016


Since the values are uniformly distributed between 0 and 1, the probability of the total being exactly 1 after any given step is infinitesimal

It depends. If you're using a 1-bit random number generator, it's guaranteed to happen :-)

(the article says "greater than 1" twice, btw)
posted by effbot at 8:41 AM on March 25, 2016


Is it bad form to say what I think the answer is? Too bad, I'm gonna think it through:

You won't get a number greater than 1 on the first draw, so the only way to lose is to go over 1 in two draws. Call the first number drawn x1, and the second x2. For a given x1, the probability of this happening is
P(x1 + x2 > 1) = P(x2 > 1 - x1)
= 1 - (1 - x1) (since x2 is uniformly distributed)
= x1
And since x1 is also uniformly distributed, the expected value of this is 1/2.

So, you're equally likely to win or lose. BUT you can only lose $50 (two drawings, minus the $250 entrance fee), while you can win $50, $150, $250... So the expected value is positive. Still not sure how to find what it is exactly, though. Cheating with Mathematica suggests about $22.
posted by zeptoweasel at 8:44 AM on March 25, 2016 [3 favorites]
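
(For anyone who wants to check that without Mathematica, here's a minimal Monte Carlo sketch — hypothetical code, assuming draws uniform on [0, 1), a $250 buy-in, and $100 per draw:)

    import random

    def play_once():
        total, draws = 0.0, 0
        while total <= 1.0:           # the house keeps drawing until the total exceeds 1
            total += random.random()  # uniform on [0, 1)
            draws += 1
        return 100 * draws - 250      # $100 per draw, minus the buy-in

    trials = 1_000_000
    print(sum(play_once() for _ in range(trials)) / trials)  # hovers around +$21.8

The average lands right around the ~$22 figure above.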


My quick, probably wrong back-of-the-envelope calculation says that you've got about a 37% chance of losing $50 and a 62-ish% chance of winning something. I don't dislike those odds.
posted by hanov3r at 8:49 AM on March 25, 2016


Don't overlook the basic fact that one cannot gamble without a bloody mary and a reuben, are those comped or what?
posted by Stonestock Relentless at 8:51 AM on March 25, 2016 [9 favorites]


Make some series of possible results. Notice that taken together their sum is definitely converging on something larger than $250. Assume the answer is going to be the obvious likely candidate number in that area. Enthusiastically play the game. Forget to take care of money management, risk too much. Lose everything. Find a way to borrow a few dollars. Start a casino. Take over the gambling world, because the competition obviously sucks.
posted by sfenders at 8:54 AM on March 25, 2016 [2 favorites]


A narrow solution, answering the question of whether the expected value is over $250 or not, without actually calculating the EV:

Gurer ner nyjnlf ng yrnfg gjb qenjvatf. Pnyy gur svefg gjb erfhygf k naq l. Fb gur cbffvoyr bhgpbzrf ner ercerfragrq ol n bar ol bar fdhner, jurer jurer k vf orgjrra mreb naq bar, naq l vf orgjrra mreb naq bar. Abj, gur tnzr fgbcf vs l vf terngre guna bar zvahf k. Fb vs lbh qenj n qvntbany yvar sebz (mreb, bar) gb (bar, mreb), gur hccre evtug gevnatyr ercerfragf tnzrf gung fgbc nsgre gjb qenjvatf, naq gur ybjre yrsg gevnatyr ercerfragf tnzrf gung xrrc tbvat. Gurfr gevnatyrf unir rdhny nern, fb gur cebonovyvgl gung gur tnzr fgbcf nsgre gjb qenjvatf vf bar unys. Lbh jva gjb uhaqerq qbyynef vs gur tnzr fgbcf nsgre gjb qenjvatf, fb gur rkcrpgrq inyhr sebz gubfr frg bs tnzrf vf bar uhaqerq qbyynef.

Gur cebonovyvgl gung gur tnzr fgbcf nsgre guerr be zber qenjvatf vf gurersber nyfb bar unys. Gur cevmr zbarl sbe gur terngre-guna-gjb-qenjvatf tnzrf vf ng yrnfg guerr uhaqerq qbyynef. Naq gurer vf fbzr cbffvovyvgl bs qenjvat zber guna guerr gvzrf. Gung zrnaf gung gur rkcrpgrq inyhr sbe gurfr tnzrf vf fgevpgyl obhaqrq orybj ol bar unys gvzrf guerr uhaqerq, be bar uhaqerq naq svsgl qbyynef.

Gung zrnaf gung jr xabj gur rkcrpgrq inyhr bs gur tnzr vf fgevpgyl obhaqrq orybj ol bar uhaqerq cyhf bar uhaqerq naq svsgl, be gjb uhaqerq naq svsgl qbyynef.

Fb gur rkcrpgrq inyhr vf pregnvayl uvture guna gung, naq lbh fubhyq cynl gur tnzr.
posted by officer_fred at 8:58 AM on March 25, 2016 [9 favorites]


but which of the numbers is secretly poison
posted by prize bull octorok at 9:04 AM on March 25, 2016 [1 favorite]


Twelve. I've never trusted twelve, it's shifty-looking.
posted by Holy Zarquon's Singing Fish at 9:07 AM on March 25, 2016


Well how long does it take to play?
posted by aubilenon at 9:11 AM on March 25, 2016


I'll favorite officer_fred twice for using ROT13. Avpr!
posted by JoeZydeco at 9:16 AM on March 25, 2016


I'm not sure I can prove this, but for the draw to continue indefinitely the sum of the sequence of drawn numbers must converge on a number less than one. Now, that looks to me a lot like sum(1/(2^n)). This is going to be more handwavey, but that implies the odds of the draw stopping on any given draw is also 1/(2^n), since the value must be under the current sum of draws, and in a uniform distribution between zero and one like the one given here, the odds of a value under a given number are always just the number. So, the odds of the draw continuing is going to be (1 - 1/(2^n)). Then the odds of the draw still being below 1 at any given point should be: Draw 1 (n = 0) = 1; Draw 2 (n = 1) = (1 - 1/(2^n)) = 1 - 0.5 = 0.5; Draw 3 = (1 - 1/(2^n)) * Draw 2 or 0.25 * 0.50 = 0.125; Draw 4 = (1 - 1/(2^n)) * Draw 3 ...

There's probably a nice formula for this; I just slapped it into Excel until the negative exponents got to two digits, multiplied by the amount you win if it continues past draw N, and then subtracted 250, since you are guaranteed to lose it. Gives an expected value of $158.42. I was really expecting it to be negative, but that could be an example of clever question design. Since he says 'casino', you're primed for the EV to be negative.

On preview it looks like zeptoweasel was following this logic to a different answer. I've probably done something fairly wrong somewhere along the line.
posted by Grimgrin at 9:16 AM on March 25, 2016




Spoiler with formulas (expected number of picks is equation 10)

Ah, to summarize, the expected number of picks is:
[begin rot13]
r
[end rot13]
posted by aubilenon at 9:27 AM on March 25, 2016 [13 favorites]


On preview, spoiled by bassooner!

OK, looks like you can figure out the expected value using the Irwin-Hall distribution.

The probability of attaining a value less than or equal to 1 after n draws is 1/n! (Plug in x=1 to the CDF formula.) Therefore, the probability of going over 1 exactly on the nth draw is
P(sum of n draws > 1) - P(sum of n-1 draws >1)
= (1 - 1/n!) - (1 - 1/(n-1)!)
= 1/(n-1)! - 1/n!

Therefore, the expected value of your winnings is the sum from 2 (minimum possible number of draws) to infinity of
(1/(n-1)! - 1/n!) * (100n - 250)
which works out to
$50*(2e - 5) ≈ $21.83.
posted by zeptoweasel at 9:29 AM on March 25, 2016 [3 favorites]
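
(A quick numeric check of that series — a sketch that simply truncates the sum once the factorials make further terms negligible:)

    import math

    ev = sum((1/math.factorial(n-1) - 1/math.factorial(n)) * (100*n - 250)
             for n in range(2, 40))
    print(ev)                   # 21.828..., i.e. 50*(2e - 5)
    print(50*(2*math.e - 5))    # the same thing in closed form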


Spoiler with formulas (expected number of picks is equation 10)

That constant just falls out of it? Math, you are wonderful.
posted by indubitable at 9:50 AM on March 25, 2016 [1 favorite]


No more than $20, same as in town.
posted by Halloween Jack at 10:56 AM on March 25, 2016


I think I probably did something wrong with my napkin math, because my calculation for EV comes out to A Really Very Large Number based on a surprise appearance by the harmonic series.
posted by cortex at 11:05 AM on March 25, 2016 [1 favorite]


What a great problem -- can't believe I've never seen this one before.
posted by escabeche at 11:09 AM on March 25, 2016 [1 favorite]


Based on nothing more than having seen a lot of questions like this (and not having read this thread yet so apologies if someone has already said this), $250/$100 is 2.5 and there's an infinite series in there, so I'm going to reflexively guess that the expected value is 100 * e = 271.828... and say "yes, play it". Now to go and actually think about it.
posted by benito.strauss at 11:09 AM on March 25, 2016 [5 favorites]


benito.strauss, I got as far as infinite series and also "a bit more than $250". Wish I'd made it that bit further.
posted by ambrosen at 11:21 AM on March 25, 2016 [1 favorite]


"Greater than 1."

So if the total is exactly 1, do they draw again or not?


Is 1 greater than 1? No. They'd draw again.
posted by w0mbat at 11:44 AM on March 25, 2016


After submitting my wrong answer I did a crude simulation in Excel. It was pretty easy to work backwards to what is probably the correct analytical formula even if I can't justify it. Oddly, my answer was only a few bucks off despite being apparently wrongly reasoned.

My valuable honor prevents me from submitting a revised answer.
posted by paper chromatographologist at 12:06 PM on March 25, 2016


So if the total is exactly 1, do they draw again or not?

The likelihood of the total ever being rational is basically zero, so it doesn't really matter.
posted by aubilenon at 12:41 PM on March 25, 2016


cortex, I did probably the same thing and got infinite expected value - I thought at first that the probability of getting a sum < 1 after n draws, i.e. the volume of the region of the unit hypercube (x1, x2, ..., xn) that has x1 + x2 + ... + xn < 1, was 1/n, but it's actually 1/n!.
posted by gold-in-green at 1:02 PM on March 25, 2016 [1 favorite]
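
(If you want to sanity-check the 1/n! without the general Irwin-Hall machinery, the n = 3 case is just a nested integral over the corner of the unit cube — a sketch using sympy:)

    import sympy as sp

    x1, x2, x3 = sp.symbols('x1 x2 x3')
    # Volume of the region {x1, x2, x3 >= 0, x1 + x2 + x3 <= 1} inside the unit cube
    vol = sp.integrate(1, (x3, 0, 1 - x1 - x2), (x2, 0, 1 - x1), (x1, 0, 1))
    print(vol)   # 1/6, i.e. 1/3!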


I'm a little confused by the set of possible numbers:

It seems to me that the examples imply that they are

{.1, .2, .3, .4, .5, .6, .7, .8, .9}

Because those are the numbers between 0 and 1 that match the examples he gave:

"0.4 and then 0.7"
"0.2, 0.3, 0.3, and then 0.6"

From a game/puzzle standpoint, it seems you have to either assume that the example numbers are showing you the format of the drawn numbers (decimal values less than one, rounded to one decimal place) or that they are deliberately misleading and the set can include the near-infinite number of decimal values between 0 and 1.

Or, am I totally misreading that?
posted by das_2099 at 1:06 PM on March 25, 2016 [2 favorites]


I wrote up a little diffeq-based solution here.
posted by pmdboi at 1:34 PM on March 25, 2016 [3 favorites]
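
(For anyone not clicking through, here's roughly how a differential-equation argument can go — a sketch, not necessarily pmdboi's exact write-up. Let m(t) be the expected number of draws still to come when the running total is t; the next draw either ends the game or moves the total to some s between t and 1:)

    m(t) = 1 + \int_t^1 m(s)\,ds
    \Rightarrow\; m'(t) = -m(t), \quad m(1) = 1
    \Rightarrow\; m(t) = e^{1-t}, \quad m(0) = e

So the expected payout is 100e ≈ $271.83, about $21.83 over the $250 buy-in, matching the sums above.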


I'm a little confused by the set of possible numbers

I'm not convinced that it would make a huge difference to the expected value whether the set is discrete tenths or the infinite set of real numbers between 0 and 1.

It *might* make a difference if every number could only be drawn once, but the problem explicitly shows an example with 0.2, 0.3, 0.3, and 0.6.

The biggest difference that discrete tenths could make is that you'd end up having a maximum of 11 draws (0.1*10+0.1, to get 1.1, which is greater than 1). But the probability of still being in the game after 11 draws is so small, even with a set of real numbers, that it might as well be 0, and not applicable.
posted by sparklemotion at 1:54 PM on March 25, 2016 [1 favorite]


it seems you have to either assume that the example numbers are showing you the format of the drawn numbers (decimal values less than one, rounded to one decimal place) or that they are deliberately misleading and the set can include the near-infinite number of decimal values between 0 and 1.

I'd say that you have to assume that the rules actually state the rules, and the examples will be chosen for simplicity and not to illustrate the whole space of possibilities. Their examples are all numbers you can easily add in your head.

The pithiness of the continuous solution confirms this pretty solidly.

The biggest difference that discrete tenths could make is that you'd end up having a maximum of 11 draws (0.1*10+0.1, to get 1.1 which is greater than 1).

This is more than offset by the fact that you're less likely to bust on your second roll. With tenths, you have a 4/9 chance of only getting $200. I dunno what the cumulative total will be, but I calculated a few small ones. With thirds (1/3, 2/3) the expected return is $287.5. With quarters, it's ~$282.72. I guess with halves (you always get .5) you're guaranteed $300. It seems pretty safe to say that with tenths you'd do a little better than with the continuous game, but not much.
posted by aubilenon at 2:39 PM on March 25, 2016
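
(A little dynamic-programming sketch for the tenths case — hypothetical code, assuming each draw is one of 0.1 through 0.9 with equal probability and the game still stops when the total first exceeds 1:)

    from fractions import Fraction

    # m[k] = expected number of remaining draws when the running total is k/10
    m = {10: Fraction(1)}             # at exactly 1.0, the next draw always ends it
    for k in range(9, -1, -1):
        still_in = [m[k + v] for v in range(1, 10) if k + v <= 10]
        m[k] = 1 + sum(still_in) / 9  # draws that push past 1.0 add nothing further
    print(float(100 * m[0]))          # ~275.7, vs. 100*e ~ 271.8 for the continuous game

So the tenths version pays out a few dollars more on average than the continuous one, in line with the guess above.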


I independently confirmed zeptoweasel and benito.strauss's answers (which match) by doing the hardest and dumbest possible thing: calculating the probability density functions for the sum of 1, 2 and 3 independent random uniform variables on [0, 1] by convolving their pdfs and extrapolating (read: guessing) from there. The pdfs, being piecewise, got pretty gnarly pretty quickly, as you'd expect, but the pattern was pretty obvious between 0 and 1, which was all I cared about.
posted by valrus at 3:08 PM on March 25, 2016
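
(And a rough numerical version of the convolution idea — a sketch, not valrus's actual calculation — just to watch the 1/n! pattern fall out of repeated convolutions of the uniform density:)

    import math
    import numpy as np

    dx = 1e-3
    u = np.ones(round(1 / dx))        # Uniform(0,1) density sampled on a grid
    f = u.copy()
    for n in range(2, 6):
        f = np.convolve(f, u) * dx    # density of the sum of n uniform draws
        x = np.arange(1, f.size + 1) * dx
        print(n, f[x <= 1].sum() * dx, 1 / math.factorial(n))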


Isn't the chance of going bust on the first two rolls 75%? Each roll has about 50% chance of being above or below .5 (call it heads or tails). The only way to get a third roll is if the first two are under .5 (HH) whereas HT, TH, and TT are bust.

So you'd have a chance of making a big profit, but 75% of the time you'd lose $50. I don't like those odds.

(I guess the question is different if you're asking whether to play the game a bunch of times or just once.)
posted by straight at 3:47 PM on March 25, 2016 [1 favorite]


Eh. Any casino that ostensibly offers a bet so transparently favourable to me is almost certainly planning to mug me in the parking lot. The correct answer is to report them to the authorities.
posted by langtonsant at 3:59 PM on March 25, 2016 [2 favorites]


Isn't the chance of going bust on the first two rolls 75%? Each roll has about 50% chance of being above or below .5 (call it heads or tails). The only way to get a third roll is if the first two are under .5 (HH) whereas HT, TH, and TT are bust.

.2 + .6 is less than one. .4 + .9 is greater than one. Both cases are equivalent in your model. Which means it's an oversimplification.
posted by aubilenon at 4:48 PM on March 25, 2016 [2 favorites]


straight: "Isn't the chance of going bust on the first two rolls 75%? Each roll has about 50% chance of being above or below .5 (call it heads or tails). The only way to get a third roll is if the first two are under .5 (HH) whereas HT, TH, and TT are bust."

No. 0.6 and 0.2 would give you one head and one tail, but they only add up to 0.8.
posted by RobotHero at 4:49 PM on March 25, 2016 [1 favorite]


zawa zawa zawa zawa zawa...
posted by BiggerJ at 7:32 PM on March 25, 2016


It’s weird to find a puzzle like this where the answer is what seems intuitive.

Also, when did they stop putting ROT-13 translation in text editors?
posted by bongo_x at 10:54 PM on March 25, 2016
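
(If your text editor really won't do it anymore, Python's standard library still ships a rot_13 codec — a one-liner sketch, here re-encoding aubilenon's summary from upthread:)

    import codecs
    print(codecs.decode("Gur rkcrpgrq ahzore bs cvpxf vf r.", "rot_13"))
    # -> The expected number of picks is e.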


This was a lot of fun to think about. Thanks!

Here's my contribution to the discussion for what it's worth. The puzzle is about a gamble but then asks about expected value, which assumes an implausible utility function. I wondered what happens when we use some alternative utility functions. If I've done things right, it looks like with a logarithmic utility function, the game is barely worth it:

u(250) = ln(250) = 5.52

u(game) = Sum [n from 2 to inf] ln(100*n) / (n*(n-2)!) = 5.56

Lots of other reasonable-looking utility functions that I tried out (e.g. functions of the form w(x) = x^a for 0 < a < 1) also made the game worth playing. But with something a bit weirder, like v(x) = 1 - e^(-x/100), the game is no longer a good idea at a $250 buy-in:

v(250) = 1 - e^(-250/100) = 0.918

v(game) = 0.913

So, I guess, as with all of these kinds of questions: [Shrug] Depends on your utility function?
posted by Jonathan Livengood at 1:52 AM on March 26, 2016 [1 favorite]
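
(A little helper for trying other utility functions, using the P(N = n) = 1/(n-1)! - 1/n! distribution from upthread — a sketch; the names are made up:)

    import math

    def expected_utility(u, max_draws=60):
        # P(the game stops on draw n) = 1/(n-1)! - 1/n!, and the payout is $100*n
        return sum(u(100 * n) * (1 / math.factorial(n - 1) - 1 / math.factorial(n))
                   for n in range(2, max_draws))

    print(math.log(250), expected_utility(math.log))                               # ~5.52 vs ~5.56
    print(1 - math.exp(-2.5), expected_utility(lambda x: 1 - math.exp(-x / 100)))  # ~0.918 vs ~0.913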


JL, I don't understand why you're talking about a utility function? I know stat, but no economics, and I took expected value to refer to the basic stats concept.

I guess in your framework that corresponds to a utility function of the identity, which would make the whole problem trivial. For me the riddle was figuring out the equivalent of equation (9) in basooner's page without looking anything up. Which I was finally able to do by waving my hands around and pointing at the (hyper-)volume under the standard simplex, which probably means I found the same solution as gold-in-green.
posted by benito.strauss at 6:40 AM on March 26, 2016


Alright, so rather than doing smart math, I thought I'd just play some games out (in Python) to get a feel: Code for winnings after playing ten hands. Running it several times, you're rarely in the negative. I'm assuming 'between 0 and 1' can include 0 and 1. ("Hey, I'm available between Monday and Friday" would indicate Monday and Friday as being options to me.)

So that's looking pretty good. If I then iterate the whole game, so we're comparing total wins and losses over several games (e.g. you come in one night and sit down and play 10 games, then come back the next day, etc.), you're still looking pretty good: over 365 days of playing 10 games a day, you generally walk away a winner on about 300 of those days, with a wallet full of $105k by the end. Play over a year.

But let's say you don't have a bottomless wallet and you get bored easily. So you sit down each day with $250 in your wallet; you'll keep playing for up to ten games if your money lasts, but as soon as it drops below $250 you can't buy into the next game and you go home. In that case, you actually walk away a loser much more often, about 215 days of the year, but by the end of the year you're still up in winnings by about $60k. Daily budget over a year.

So, I think I'd take the bet. If you're easily bored, increase your daily budget and you'll raise your winnings by another $50k over the year, but the real money is in just sitting at the table plugging away at more hands; you essentially can't lose when playing 100 hands with a $1000 budget (>$1 million in winnings, 1 day down).

That is, assuming I've not made a big dumb error!
posted by Static Vagabond at 11:57 AM on March 26, 2016
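
(Since the linked code isn't quoted in the thread, here's a minimal sketch of the 'daily budget' variant described above — hypothetical code, assuming you start each day with $250, play at most ten games, and go home as soon as you can't cover the next buy-in:)

    import random

    def one_game():
        total, draws = 0.0, 0
        while total <= 1.0:            # the house draws until the total exceeds 1
            total += random.random()
            draws += 1
        return 100 * draws             # gross payout; the buy-in is handled below

    def one_day(budget=250, max_games=10, buy_in=250):
        bankroll = budget
        for _ in range(max_games):
            if bankroll < buy_in:      # can't afford the next game, go home
                break
            bankroll += one_game() - buy_in
        return bankroll - budget       # net result for the day

    days = [one_day() for _ in range(365)]
    print(sum(d > 0 for d in days), sum(days))   # winning days out of 365, and the yearly net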


Static Vagabond, what's your average winnings per game? If it's near $21-$22 dollars then I'd guess you didn't make a big dumb error.
posted by benito.strauss at 12:09 PM on March 26, 2016


round(random(), 1) won't give you a uniform distribution. Try e.g. randint(0, 10) / 10.0 instead.

(but the quantization will screw things up for you; as noted above, it's not really part of the problem formulation, and the result you get if you don't do that is too elegant for them to have asked for anything else...)
posted by effbot at 3:47 PM on March 26, 2016
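
(A quick way to see the skew effbot is pointing at — a sketch; the counts come from random trials, so the exact numbers will wobble:)

    from collections import Counter
    from random import random, randint

    trials = 100_000
    rounded = Counter(round(random(), 1) for _ in range(trials))
    stepped = Counter(randint(0, 10) / 10.0 for _ in range(trials))
    print(rounded[0.0], rounded[0.5])   # ~5,000 vs ~10,000: the endpoints get half the mass
    print(stepped[0.0], stepped[0.5])   # both ~9,100: every value equally likely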


benito.strauss,

We mean the same thing by "expected value." Expected utility is (from one perspective, anyway) just a generalization of the idea of expected value. But with expected utility, instead of taking a sum or integral over the product of the nominal value of an outcome and the probability of that outcome, you take the sum or integral over the product of a function of the nominal value (the utility) of an outcome and the probability of that outcome.

The reason to consider expected utility is that if you want to know whether you should play the game for a stake of $250, you need to know how much you actually value $250 versus how much you actually value the game. You could simply use the expected value. But if you have decreasing marginal utility for dollars, if you are risk averse, or some such, then you might think that staying pat with $250 is better than a risky expected value of $272. To illustrate how this might work, suppose you are offered the following choice. You may either have $990,000 for sure or you may have a 1% chance at $100,000,000 and a 99% chance at $0. The expected value of the gamble is an even million dollars, so its expected value is $10,000 greater than the certain option. But almost everyone will choose the certain option in this case and be rational to do so because for most people (basically everyone outside the top one or two percent), the real value of that first near-million is much greater than the real value of the next 99 million. So taking a risk at an enormous payout when there is a very, very large payout in hand is irrational.

When I first looked at the problem, I immediately thought about the St. Petersburg Paradox, and that immediately cues up concerns about the difference between (nominal) value and utility. Really, though, I was just curious about what sorts of utility function would make the game a bad one to invest in for $250. And then having looked a bit, I thought I would share with the group, since the really hard part -- identifying (9) -- was already solved.
posted by Jonathan Livengood at 5:27 PM on March 26, 2016 [2 favorites]


I didn't look carefully, but is a Sicilian involved?
posted by boilermonster at 7:12 PM on March 26, 2016


It's trivial to prove that the game is worth playing: half the time you lose 50 bucks, half the time you gain 50 bucks or more. This is because the average after two draws is 1.0, so that puts you squarely at even odds for losing 50 bucks. The upside, however, is unbounded, since you could draw a string of small numbers of any length; it's just that longer lengths get less and less probable.
posted by w0mbat at 11:14 AM on March 29, 2016 [1 favorite]



