
# Seeing random

May 30, 2009 7:44 AM

What does randomness look like? Random Walk asks this question and presents experiments in mathematics and physics, showing the mysterious interaction of chaos and order in randomness. via Information Aesthetics, obviously.

Beautiful visualizations of foundational mathematical concepts. Thanks!

posted by escabeche at 8:08 AM on May 30, 2009


My favorite thing about random walks was that a one-dimensional random walk will eventually take you to your origin point. So will a two-dimensional random walk. But not in three dimensions.

Really useful stuff in percolation theory.

posted by adipocere at 8:14 AM on May 30, 2009


Oh, that's gorgeous work.

Note to non-math-dorks: scroll past the enticing thumbnails to the conceptual summaries if you want to know what the hell these pictures are doing. The text is kind of brief for some of the stuff they're discussing, but it at least gives a rough picture of the idea behind each visualization.

posted by cortex at 8:14 AM on May 30, 2009


cool visualizations -

that's being sent out to my research group -

particularly because of the MC & BD visualizations and the comparison of the pseudo-random number generators

really good looking stuff -

but I'm not a huge fan of the interface - I wish each module could be loaded on its own to dedicate more screen space to each image.

I do appreciate the embedded Wikipedia links :)

posted by sloe at 8:58 AM on May 30, 2009


The picture for Benford's Law is fantastic. (Although I think they could have included a better explanation for why it works out that way.)

Gosh, I wish you could just look at a big .jpg of these posters. That zoom interface is awful.

posted by straight at 9:56 AM on May 30, 2009


This is fucking great. I tend to be a visual person, so I always appreciate when abstract concepts are illustrated so clearly and beautifully. I am a non-math-dork, so I will be studying this site all afternoon, then I will discuss randomness with false authority at the surprise birthday party I am attending tonight. Hope there are no real math people there. Nice post, signal.

posted by barrett caulk at 10:13 AM on May 30, 2009


*The picture for Benford's Law is fantastic. (Although I think they could have included a better explanation for why it works out that way.)*

Yeah, all the descriptions are pretty slight. Benford's Law is such a wonderful mindfuck that it's a shame it doesn't really cover it in the kind of depth that someone encountering it for the first time should get, but this stuff all feels targeted mostly at folks who know the basics but will appreciate the specific visualization decisions, anyway.

posted by cortex at 10:44 AM on May 30, 2009

Bravo.

posted by kid ichorous at 10:44 AM on May 30, 2009 [1 favorite]


*Everyday numbers obey a law so unexpected it is hard to believe it's true.*

NOTE : there is a total of approximately 350 gigabytes of data in there, beware.

posted by shoesfullofdust at 11:29 AM on May 30, 2009 [3 favorites]

*Gosh, I wish you could just look at a big .jpg of these posters. That zoom interface is awful.*

The images' URLs are of the form:

*http://www.random-walk.com/benford/TileGroup0/4-7-2.jpg*, where the first number is the zoom level, the second and third x and y, I think. So that, plus Python and PIL, and you're set.

posted by signal at 2:05 PM on May 30, 2009
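A minimal sketch of that scraping idea, using only the standard library to enumerate the tile URLs (the grid extent at each zoom level is a guess on my part -- Zoomify-style tilers vary -- and the actual downloading and PIL stitching are left out):

```python
def tile_urls(base, zoom, cols, rows):
    """URLs of the form {base}/TileGroup0/{zoom}-{x}-{y}.jpg, row by row."""
    return [f"{base}/TileGroup0/{zoom}-{x}-{y}.jpg"
            for y in range(rows) for x in range(cols)]

# Hypothetical 7x3 grid at zoom level 4, matching the example URL above
urls = tile_urls("http://www.random-walk.com/benford", 4, 7, 3)
print(urls[0])   # http://www.random-walk.com/benford/TileGroup0/4-0-0.jpg
```

From there, each tile can be fetched and pasted onto one large canvas at offset (x * tile_width, y * tile_height).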

*My favorite thing about random walks was that a one-dimensional random walk will eventually take you to your origin point. So will a two-dimensional random walk. But not in three dimensions.*

what

I verified this incredible claim on Wikipedia, but it doesn't really explain what's going on. Reference for someone who can follow some math, but isn't an expert?

posted by DU at 2:25 PM on May 30, 2009

*I verified this incredible claim on Wikipedia, but it doesn't really explain what's going on. Reference for someone who can follow some math, but isn't an expert?*

Sure, I'll give this a go. To keep this short, let's take the following fact as a black box:

(*) If you add up n random numbers, each one of which is +1 or -1 with probability 1/2 each, the probability that the sum is 0 is about 1/sqrt(n).

(If you really want to know why this is true, you can check it via the binomial theorem + Stirling's formula, or, if you know how to compute standard deviation, you can argue that the standard deviation of the sum is about sqrt(n) and the mean is 0, so it's pretty safe to think of it as something that's almost certainly between -2sqrt(n) and 2sqrt(n) with a roughly comparable probability of landing on any point in that interval. Or you can just leave it as a black box.)

Equivalent to (*) is:

(**) The probability that a 1-dimensional random walk returns to the origin after exactly n steps is about 1/sqrt(n).

In the k-dimensional random walk, you return to the origin after n steps if and only if each coordinate is 0 after n steps; according to (**), the chance that the first coordinate is 0 is about 1/sqrt(n), that the second coordinate is 0 is also about 1/sqrt(n), and so on. Since the different coordinates are independent of each other, we conclude

(***) The probability that a k-dimensional random walk returns to the origin after exactly n steps is about (1/sqrt(n))^k, or n^(-k/2).

Thus:

(*4) The expected number of returns to the origin of an N-step k-dimensional random walk is about [sum from n = 1 to N] n^{-k/2}.

When k = 1, the sum from n = 1 to N of n^{-1/2} is about sqrt(N); this number grows with N, which is to say that as the walk goes on and on, you expect to return to the origin again and again, and reasonably often. When k = 2, the sum from n = 1 to N of n^{-1} is about log N; this too grows with N, so we expect to keep coming back to the origin, but it is going to be a LOT longer between returns. If k > 2, the sum of n^{-k/2}

*converges*; which is to say that the total number of returns to the origin doesn't get larger and larger as N grows. Which is to say that you expect that the walk at some point leaves the origin and never, ever comes back.

So the key point is the distinction between convergent and divergent infinite series, which I'm not sure you're comfortable with; the keyword here is "integral test," but you will need a tiny bit of calculus.

posted by escabeche at 3:11 PM on May 30, 2009 [13 favorites]
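Both of escabeche's estimates can be checked numerically. A small sketch (the function names are mine, not from the thread): the exact return probability C(n, n/2) / 2^n next to the Stirling approximation sqrt(2 / (pi n)) for fact (*), and the partial sums of n^(-k/2) from (*4), which grow without bound for k = 1 and 2 but stay bounded for k = 3:

```python
import math

def p_return(n):
    """Exact probability a 1-D +/-1 walk is back at 0 after n steps."""
    return math.comb(n, n // 2) / 2 ** n if n % 2 == 0 else 0.0

def expected_returns(k, N):
    """Partial sum of n^(-k/2) for n = 1..N: the (*4) estimate of returns."""
    return sum(n ** (-k / 2) for n in range(1, N + 1))

# (*): the exact value tracks sqrt(2 / (pi * n)) -- the 1/sqrt(n) decay
print(p_return(100), math.sqrt(2 / (math.pi * 100)))

# (*4): k = 1 grows like sqrt(N), k = 2 like log N, k = 3 converges
for k in (1, 2, 3):
    print(k, expected_returns(k, 10_000))
```

For k = 3 the sum creeps up toward a finite limit (about 2.61, the value of the zeta function at 3/2), which is the "leaves and never comes back" case.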

Probability that you will return to your origin point on a random walk, for a given number of dimensions:

| Dimensions | Probability |
|---|---|
| 0 | 1 (where do you think you're going?) |
| 1 | 1 |
| 2 | 1 |
| 3 | 0.340537 |
| 4 | 0.193206 |
| 5 | 0.135178 |
| 6 | 0.104715 |
| 7 | 0.085844 |
| 8 | 0.072913 |

Kids, don't get Lost In Space. But most especially hyperspace.

posted by adipocere at 8:36 PM on May 30, 2009
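The 3-D figure of 0.340537 (Pólya's random-walk constant) can be estimated by Monte Carlo. A sketch using only the standard library; the function name and the 1000-step cutoff are my own choices, and truncating the walks biases the estimate slightly low, since a few walks first return after the cap:

```python
import random

def returns_to_origin(dim, max_steps, rng):
    """One simple random walk on the dim-D integer lattice; True if it
    revisits the origin within max_steps."""
    pos = [0] * dim
    for _ in range(max_steps):
        axis = rng.randrange(dim)          # pick a coordinate...
        pos[axis] += rng.choice((-1, 1))   # ...and step +/-1 along it
        if not any(pos):                   # back at the origin?
            return True
    return False

rng = random.Random(0)                     # fixed seed for reproducibility
trials = 1000
est = sum(returns_to_origin(3, 1000, rng) for _ in range(trials)) / trials
print(est)                                 # lands in the neighborhood of 0.34
```

The 1-D and 2-D cases run the same way, except the estimate keeps creeping toward 1 as you raise max_steps instead of leveling off.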

Some of the images such as this one or this one totally describe the page layout of Japanese newspapers! I always wondered about that, but seeing this I reckon the paginators are striving every day to achieve randomness without alienating the reader.

posted by l'esprit d'escalier at 2:34 AM on May 31, 2009


*Benford's Law is such a wonderful mindfuck that it's a shame it doesn't really cover it in the kind of depth that someone encountering it for the first time should get, but this stuff all feels targeted mostly at folks who know the basics but will appreciate the specific visualization decisions, anyway.*

For those who are looking for one, there's a cool explanation of Benford's Law (as it pertains to tax fraud) on Ask Doctor Math.

posted by albrecht at 2:15 PM on May 31, 2009
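A quick way to see Benford's Law in action (my own sketch, not from the Doctor Math page): the leading digits of the powers of 2 match the predicted log10(1 + 1/d) frequencies almost exactly, so digit 1 shows up about 30% of the time and digit 9 under 5%:

```python
import math
from collections import Counter

# Tally the leading digit of 2^n for n = 0..999
leading = Counter(str(2 ** n)[0] for n in range(1000))

for d in range(1, 10):
    observed = leading[str(d)] / 1000
    predicted = math.log10(1 + 1 / d)    # Benford's Law
    print(d, observed, round(predicted, 4))
```

The same skew shows up in street addresses, populations, and (per the Doctor Math link) honest tax returns, which is why fabricated figures with uniform first digits stand out.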

*Probability that you will return to your origin point on a random walk, for a given number of dimensions:*

First reaction: Wow.

Second: Wait, what does that .340537 really mean? 34 times out of every hundred... what? Infinitely long random walks?

Third: Oh, he just left out N. Probably 100 or something.

Fourth: No, because then how are the 1D/2D cases 1?

Fifth: Must be an infinite integral? I'm still having a bit of trouble visualizing what probability even means in an infinitely long experiment.

posted by DU at 6:20 PM on May 31, 2009

DU:

First reaction: Yeah, that always messed me up.

Second: No, it means 34 times out of every hundred infinitely long random walks.

Third-Fourth-Fifth: The "1" is a shorthand which says, "As time goes to infinity, the probability approaches unity." "Unity," as it was used where I was educated, was represented as "1," which is the same thing as 100%, but as a concept, "unity" seems to be used in the context of summing up (uniting) a number of individual, subcategorized probabilities in such a fashion that you got a "1" out of the whole mess.

Some things approach unity rapidly, such as the one-dimensional case. A drunk leaving home is two-dimensional, but he eventually makes it back, just not as fast. Ah, but only a little more than a third of drunken birds make it back to their nests floating in space.

And then if you were in the World's Least Useful Time Machine, where you were locked in, then it randomly kicked you one second in the future or past, or one light-second in any one of the three dimensions, over and over again, and if this somehow sparked off some kind of ghastly parallel worlds thing on top of it, a little less than one-fifth of the versions of you launched from the original point would come back to your original spacetime.

Some of those versions would be dead due to suffocation, dehydration, starvation, suicide, and old age by the time they got there. This sounds like grounds for a particularly cruel simulation.

posted by adipocere at 1:49 PM on June 1, 2009



posted by gcbv at 7:45 AM on May 30, 2009