Comments on: Seeing random
http://www.metafilter.com/82053/Seeing-random/
Comments on MetaFilter post Seeing random
Sat, 30 May 2009 07:45:37 -0800

Seeing random
http://www.metafilter.com/82053/Seeing-random
What does randomness look like? <a href="http://www.random-walk.com/">Random Walk</a> asks this question and presents experiments in mathematics and physics, showing the mysterious interaction of chaos and order in randomness. <small>via <a href="http://infosthetics.com/archives/2009/05/random_walk_the_visualization_of_randomness.html">Information Aesthetics</a>, obviously.</small>
posted by signal at Sat, 30 May 2009 07:44:20 -0800 (tags: infoporn, visualization, randomness, processing)

http://www.metafilter.com/82053/Seeing-random#2584175
This is written in that "Das Boot" language.
posted by gcbv at Sat, 30 May 2009 07:45:37 -0800

http://www.metafilter.com/82053/Seeing-random#2584180
<a href="http://www.random-walk.com/index_en.htm">English version.</a>comment:www.metafilter.com,2009:site.82053-2584180Sat, 30 May 2009 07:49:34 -0800signalBy: escabeche
http://www.metafilter.com/82053/Seeing-random#2584195
Beautiful visualizations of foundational mathematical concepts. Thanks!
posted by escabeche at Sat, 30 May 2009 08:08:23 -0800

http://www.metafilter.com/82053/Seeing-random#2584203
My favorite thing about random walks is that a one-dimensional random walk will eventually return you to your origin point. So will a two-dimensional random walk. But not in three dimensions.
Really useful stuff in percolation theory.
posted by adipocere at Sat, 30 May 2009 08:14:46 -0800

http://www.metafilter.com/82053/Seeing-random#2584204
Oh, that's gorgeous work.
Note to non-math-dorks: scroll past the enticing thumbnails to the conceptual summaries if you want to know what the hell these pictures are doing. The text is kind of brief for some of the stuff they're discussing, but it at least gives a rough picture of the idea behind each visualization.
posted by cortex at Sat, 30 May 2009 08:14:50 -0800

http://www.metafilter.com/82053/Seeing-random#2584251
cool visualizations -
that's being sent out to my research group -
particularly because of the MC & BD visualizations and the comparison of the pseudo random number generators
really good looking stuff -
but I'm not a huge fan of the interface - I wish each module could be loaded on its own to dedicate more screen space to each image.
I do appreciate the embedded wikipedia links :)
posted by sloe at Sat, 30 May 2009 08:58:46 -0800

http://www.metafilter.com/82053/Seeing-random#2584325
The picture for Benford's Law is fantastic. (Although I think they could have included a better explanation for why it works out that way.)
Gosh, I wish you could just look at a big .jpg of these posters. That zoom interface is awful.
posted by straight at Sat, 30 May 2009 09:56:33 -0800

http://www.metafilter.com/82053/Seeing-random#2584347
This is fucking great. I tend to be a visual person, so I always appreciate when abstract concepts are illustrated so clearly and beautifully. I am a non-math-dork, so I will be studying this site all afternoon, then I will discuss randomness with false authority at the surprise birthday party I am attending tonight. Hope there are no real math people there. Nice post, signal.
posted by barrett caulk at Sat, 30 May 2009 10:13:40 -0800

http://www.metafilter.com/82053/Seeing-random#2584373
<i>The picture for Benford's Law is fantastic. (Although I think they could have included a better explanation for why it works out that way.)</i>
Yeah, all the descriptions are pretty slight. Benford's Law is such a wonderful mindfuck that it's a shame the site doesn't really cover it in the kind of depth that someone encountering it for the first time should get, but this stuff all feels targeted mostly at folks who know the basics but will appreciate the specific visualization decisions, anyway.
posted by cortex at Sat, 30 May 2009 10:44:37 -0800

http://www.metafilter.com/82053/Seeing-random#2584374
Bravo.
posted by kid ichorous at Sat, 30 May 2009 10:44:37 -0800

http://www.metafilter.com/82053/Seeing-random#2584411
<em>Everyday numbers <a href="http://www.mathpages.com/home/kmath302/kmath302.htm" title="Benford's Law">obey</a> a <a href="http://mathworld.wolfram.com/BenfordsLaw.html" title="Benford's Law -- from Wolfram MathWorld">law</a> so <a href="http://www.lacim.uqam.ca/~plouffe/statistics.html" title="Distribution graph of entries in Plouffe's Inverter">unexpected</a> it is hard to believe <a href="http://www.fortunecity.com/emachines/e11/86/one.html" title="The Power of One">it's true</a>.</em>
<small>NOTE: there is a total of approximately 350 gigabytes of data in <a href="http://sci.tech-archive.net/Archive/sci.math.research/2009-03/msg00072.html" title="plouffe inverter tables (complete)">there</a>, beware.</small>
posted by shoesfullofdust at Sat, 30 May 2009 11:29:29 -0800

http://www.metafilter.com/82053/Seeing-random#2584527
<em>Gosh, I wish you could just look at a big .jpg of these posters. That zoom interface is awful.</em>
The images' URLs are of the form <em>http://www.random-walk.com/benford/TileGroup0/4-7-2.jpg</em>, where the first number is the zoom level and the second and third are x and y, I think. So that, plus Python and PIL, and you're set.
posted by signal at Sat, 30 May 2009 14:05:11 -0800

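That Python-and-PIL route might look like the sketch below. The TileGroup0 naming comes straight from the observed URL; the 256-pixel tile size and the grid extent at each zoom level are assumptions that would have to be confirmed by probing the server.

```python
# Sketch of reassembling one poster from its zoom tiles.
# Assumptions (not confirmed by the site): tiles live under TileGroup0,
# are named {zoom}-{x}-{y}.jpg, and are 256x256 pixels; the grid size
# (cols x rows) at a given zoom level would have to be found by probing.

BASE = "http://www.random-walk.com/benford/TileGroup0"
TILE = 256  # assumed tile edge in pixels

def tile_urls(zoom, cols, rows):
    """Yield (x, y, url) for every tile in a cols-by-rows grid."""
    for y in range(rows):
        for x in range(cols):
            yield x, y, f"{BASE}/{zoom}-{x}-{y}.jpg"

# Fetching and pasting with Pillow would then go roughly like:
#   import io, urllib.request
#   from PIL import Image
#   sheet = Image.new("RGB", (cols * TILE, rows * TILE))
#   for x, y, url in tile_urls(4, cols, rows):
#       with urllib.request.urlopen(url) as resp:
#           sheet.paste(Image.open(io.BytesIO(resp.read())), (x * TILE, y * TILE))
#   sheet.save("benford_full.jpg")
```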
http://www.metafilter.com/82053/Seeing-random#2584543
<i>My favorite thing about random walks was that a one-dimensional random walk will eventually take you to your origin point. So will a two-dimensional random walk. But not in three dimensions.</i>
what
I verified this incredible claim on Wikipedia, but it doesn't really explain what's going on. Reference for someone who can follow some math, but isn't an expert?
posted by DU at Sat, 30 May 2009 14:25:42 -0800

http://www.metafilter.com/82053/Seeing-random#2584606
<i>I verified this incredible claim on Wikipedia, but it doesn't really explain what's going on. Reference for someone who can follow some math, but isn't an expert?</i>
Sure, I'll give this a go. To keep this short, let's take the following fact as a black box:
(*) If you add up n random numbers, each of which is +1 or -1 with probability 1/2, the probability that the sum is 0 is about 1/sqrt(n).
(If you really want to know why this is true, you can prove it via the binomial theorem plus Stirling's formula, or, if you know how to compute standard deviation, you can argue that the standard deviation of the sum is about sqrt(n) and the mean is 0, so it's pretty safe to think of it as something that's almost certainly between -2sqrt(n) and 2sqrt(n) with an approximately equal probability of landing on any point in that interval. Or you can just leave it as a black box.)
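Fact (*) can also be checked exactly with a few lines of Python, since the sum is 0 precisely when exactly n/2 of the steps are +1, which has probability C(n, n/2)/2^n. (The true asymptotic constant is sqrt(2/pi), about 0.8, so "about 1/sqrt(n)" is right up to that constant.)

```python
from math import comb, sqrt

def p_sum_zero(n):
    """Exact probability that n fair +/-1 steps sum to 0:
    zero for odd n, else C(n, n/2) / 2^n."""
    if n % 2:
        return 0.0
    return comb(n, n // 2) / 2 ** n

# p_sum_zero(n) * sqrt(n) settles near sqrt(2/pi) ~ 0.798,
# confirming the "about 1/sqrt(n)" black box.
for n in (10, 100, 1000):
    print(n, p_sum_zero(n), p_sum_zero(n) * sqrt(n))
```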
Equivalent to (*) is:
(**) The probability that a 1-dimensional random walk returns to the origin after exactly n steps is about 1/sqrt(n).
In the k-dimensional random walk, you return to the origin after n steps if and only if each coordinate is 0 after n steps; according to (**), the chance that the first coordinate is 0 is about 1/sqrt(n), that the second coordinate is 0 is also about 1/sqrt(n), and so on. Since the different coordinates are independent from each other, we conclude
(***) The probability that a k-dimensional random walk returns to the origin after exactly n steps is about (1/sqrt(n))^k, or n^(-k/2).
Thus:
(*4) The expected number of returns to the origin of an N-step k-dimensional random walk is about [sum from n = 1 to N] n^{-k/2}.
When k = 1, the sum from n = 1 to N of n^{-1/2} is about sqrt(N); this number grows with N, which is to say that as the walk goes on and on, you expect to return to the origin again and again, and reasonably often. When k = 2, the sum from n = 1 to N of n^{-1} is about log N; this too grows with N, so we expect to keep coming back to the origin, but it is going to be a LOT longer between returns. If k > 2, the sum of n^{-k/2} <em>converges</em>, which is to say that the total number of returns to the origin doesn't get larger and larger as N grows. Which is to say that you expect that the walk at some point leaves the origin and never, ever comes back.
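A quick numerical check of the (*4) partial sum makes the three regimes visible; `expected_returns` below is just a name for that sum, taking "about" literally:

```python
def expected_returns(k, N):
    """Partial sum of n^(-k/2) for n = 1..N: the (*4) estimate of the
    expected number of returns for an N-step k-dimensional walk."""
    return sum(n ** (-k / 2) for n in range(1, N + 1))

for k in (1, 2, 3):
    print(k, [round(expected_returns(k, N), 2) for N in (100, 10_000, 1_000_000)])
# k=1 keeps growing (like 2*sqrt(N)), k=2 creeps up (like log N),
# k=3 flattens out near zeta(3/2) ~ 2.61: only finitely many returns.
```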
So the key point is the distinction between convergent and divergent infinite series, which I'm not sure whether you're comfortable with; the keyword here is the "integral test," but you will need a tiny bit of calculus.
posted by escabeche at Sat, 30 May 2009 15:11:48 -0800

http://www.metafilter.com/82053/Seeing-random#2584735
That's awesome, escabeche.
posted by xorry at Sat, 30 May 2009 19:03:10 -0800

http://www.metafilter.com/82053/Seeing-random#2584774
Probability that you will return to your origin point on a random walk, for a given number of dimensions:
<pre>
dim   P(return)
 0    1 (where do you think you're going?)
 1    1
 2    1
 3    0.340537
 4    0.193206
 5    0.135178
 6    0.104715
 7    0.085844
 8    0.072913
</pre>
Kids, don't get Lost In Space. But most especially hyperspace.
posted by adipocere at Sat, 30 May 2009 20:36:46 -0800

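The 3-D entry in that table (these are Pólya's random-walk constants) can be sanity-checked by simulation. A rough Monte Carlo sketch, with the caveat that any finite-step cutoff slightly undercounts returns, since a few walks come home very late:

```python
import random

def returns_within(steps, rng):
    """One simple 3-D lattice walk: True if it revisits (0, 0, 0)
    within `steps` moves."""
    x = y = z = 0
    for _ in range(steps):
        axis, delta = rng.randrange(3), rng.choice((-1, 1))
        if axis == 0:
            x += delta
        elif axis == 1:
            y += delta
        else:
            z += delta
        if x == y == z == 0:
            return True
    return False

rng = random.Random(1)  # fixed seed so the estimate is reproducible
trials = 2000
estimate = sum(returns_within(500, rng) for _ in range(trials)) / trials
print(estimate)  # a bit under the tabulated 0.340537: the 500-step cutoff misses late returns
```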
http://www.metafilter.com/82053/Seeing-random#2584976
Some of the images such as <a href="http://www.random-walk.com/pic/matt_02b.jpg">this one</a> or <a href="http://www.random-walk.com/pic/matt_06b.jpg">this one</a> totally describe the page layout of Japanese newspapers! I always wondered about that, but seeing this I reckon the paginators are striving every day to achieve randomness without alienating the reader.
posted by l'esprit d'escalier at Sun, 31 May 2009 02:34:41 -0800

http://www.metafilter.com/82053/Seeing-random#2584992
Math, that's some cool shit, right there.
posted by sfts2 at Sun, 31 May 2009 04:03:12 -0800

http://www.metafilter.com/82053/Seeing-random#2585551
<i>Benford's Law is such a wonderful mindfuck that it's a shame it doesn't really cover it in the kind of depth that someone encountering it for the first time should get, but this stuff all feels targeted mostly at folks who know the basics but will appreciate the specific visualization decisions, anyway.</i>
For those who are looking for one, there's a cool explanation of Benford's Law (as it pertains to tax fraud) on <a href="http://doctormath.blogspot.com/">Ask Doctor Math</a>.
posted by albrecht at Sun, 31 May 2009 14:15:51 -0800

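One way to see Benford's Law in action without any tax data: the leading digits of the powers of 2 follow the predicted log10(1 + 1/d) frequencies. A small Python check (powers of 2 are my own example here, not the site's):

```python
from collections import Counter
from math import log10

N = 5_000
# Leading digit of 2^0 .. 2^(N-1)
counts = Counter(int(str(2 ** n)[0]) for n in range(N))

for d in range(1, 10):
    print(d, round(counts[d] / N, 4), round(log10(1 + 1 / d), 4))
# Digit 1 shows up about 30% of the time, digit 9 under 5%,
# matching the Benford frequencies log10(1 + 1/d) closely.
```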
http://www.metafilter.com/82053/Seeing-random#2585812
<i>Probability that you will return to your origin point on a random walk, for a given number of dimensions:</i>
First reaction: Wow.
Second: Wait, what does that 0.340537 really mean? 34 times out of every hundred... what? Infinitely long random walks?
Third: Oh, he just left out N. Probably 100 or something.
Fourth: No, because then how are the 1D/2D cases 1?
Fifth: Must be an infinite integral? I'm still having a bit of trouble visualizing what probability even means in an infinitely long experiment.
posted by DU at Sun, 31 May 2009 18:20:05 -0800

http://www.metafilter.com/82053/Seeing-random#2587319
DU:
First reaction: Yeah, that always messed me up.
Second: No, it means 34 times out of every hundred infinitely long random walks.
Third-Fourth-Fifth: The "1" is a shorthand which says, "As time goes to infinity, the probability approaches unity." "Unity," as it was used where I was educated, was represented as "1," which is the same thing as 100%, but as a concept, "unity" seems to be used in the context of summing up (uniting) a number of individual, subcategorized probabilities in such a fashion that you got a "1" out of the whole mess.
Some things approach unity rapidly, such as the one-dimensional case. A drunk leaving home is two-dimensional, but he eventually makes it back, just not as fast. Ah, but only a little more than a third of drunken birds make it back to their nests floating in space.
And then if you were in the World's Least Useful Time Machine, where you were locked in, then it randomly kicked you one second in the future or past, or one light-second in any one of the three dimensions, over and over again, and if this somehow sparked off some kind of ghastly parallel worlds thing on top of it, a little less than one-fifth of the versions of you launched from the original point would come back to your original spacetime.
Some of those versions would be dead due to suffocation, dehydration, starvation, suicide, and old age by the time they got there. This sounds like grounds for a particularly cruel simulation.
posted by adipocere at Mon, 01 Jun 2009 13:49:49 -0800