01010010 01001001 01010000
May 6, 2008 5:57 AM   Subscribe

If you are reading this post on a computer attached to the Internet, you can thank Claude Shannon (1916-2001). It was his work, starting with A Mathematical Theory of Communication, that first enabled humans to extract digital perfection from the analog world by creating the field of Information Theory. Like most computer nerds of his day, who often had to program their computers by moving wires around or even adjusting mechanical linkages, he was also an electronics and mechanical whiz who could create a juggling robot and The Ultimate Machine.
posted by DU (35 comments total) 22 users marked this as a favorite
 
Shannon's work gave rise to the idea of a perfect graph in graph theory, which led to the Strong Perfect Graph Conjecture, now Theorem - the upgrade from conjecture to theorem is one of the most impressive but unremarked feats in recent mathematical research. An account by Paul Seymour (pdf).
posted by Wolfdog at 6:04 AM on May 6, 2008


Pre-read-through... this is reminding me of Neal Stephenson's talk about pipe organs in Cryptonomicon.
posted by MNDZ at 6:05 AM on May 6, 2008


Shannon's work gave rise to the idea...

Shannon's work gave rise to a lot of ideas, from computers to games to black holes. I tried not to get too deeply into that in the post, since I was actually inspired to write this because of the robot. But additional links are certainly not discouraged.
posted by DU at 6:13 AM on May 6, 2008


Glad you don't disapprove too strongly.
posted by Wolfdog at 6:18 AM on May 6, 2008


I want that box now. It would be awesome for parties. Some nosy person would try to look into it and instantly be freaked out by a hand closing it. That would teach them to look in boxes that don't belong to them!
posted by Mastercheddaar at 6:28 AM on May 6, 2008 [1 favorite]


Amazing stuff. Thanks for this, DU!
posted by Dizzy at 6:28 AM on May 6, 2008


Sorry, Wolfdog, a little residue of the crabbiness I always get from reading Mathworld seeped through. Working with practical mathematicians (i.e. computer programmers, engineers and scientists) makes me forget how much "real" mathematicians value elegance over clarity.
posted by DU at 6:38 AM on May 6, 2008


Wolfdog, is there a description of a perfect graph where, oh, the average member of Mensa could understand it despite their quaint notion that a graph is that thing you make in Excel that shows how variable one correlates to variable two? Right now I feel like I've had a guy who looks just like Douglas Hofstadter, only with a black goatee, jump out of the bushes, beat me into a stupor and then explain chaos theory to me.
posted by Kid Charlemagne at 6:40 AM on May 6, 2008 [1 favorite]


Damn clever title, DU.
posted by JaredSeth at 6:53 AM on May 6, 2008


The graphs in question here are just sets of points (vertices), some of which are connected to one another by lines (edges). Being "perfect" is a coloring property, and it's a little bit subtle, but here's a quick rundown:

1. Graph theorists (and their friends, the mapmakers) like to look at a graph and figure out the smallest number of different colors they need so that they can give each vertex a color and no vertex is next to another one of the same color. Same way mapmakers don't want two adjacent regions to have the same color. Of course, if you just used a different color for every vertex then you'd have the "no two adjacent with the same color" problem licked - but that's usually far more colors than are strictly needed to do the job.

2. If somewhere in your graph there's a set of, oh, say, k many vertices which are all connected to one another by edges, then that's called a clique of size k, and it's gonna take at least k many different colors to color your graph. So in fact look for the biggest clique in your graph, and it's going to take at least that many colors.

3. It might take more than that. Draw 5 vertices in a circular arrangement like 5 people at dinner holding hands. There's no clique bigger than 2 vertices/people, but you'll find very quickly it takes 3 colors.

4. So if the largest clique in the graph actually tells you exactly how many colors are needed to do the coloring legitimately, your graph is kind of special.

5. A perfect graph is one that is not only special (in jargon, "its chromatic number equals its clique number"), but every subgraph of it has that special property too. It is this "inheritance" aspect which makes perfectness a subtle and very, very, very frustrating property to deal with.

6. But the Strong Perfect Graph Theorem gives a very, very, very simple test to determine if your graph is perfect: remember that 5-cycle arrangement from (3) above? Well, actually any odd cycle of 5 or more vertices has the same defect: the largest clique has size 2 but 3 colors are required. (A triangle is fine - its clique number and chromatic number are both 3.) And SPGT says that odd cycles of length 5 or more (and their complements) are the only barrier to being perfect. If your graph contains no induced subgraphs that look like that, then it's perfect, and if it isn't it ain't. (There's a little brute-force check of the 5-cycle after this list.)

7. Why this is related to information theory is a story for another day which is gonna have more beer in it.
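
And here's that brute-force check of the 5-cycle from (3) - a quick Python sketch of my own, nothing canonical about it, and only sane for tiny graphs:

```python
from itertools import product, combinations

def chromatic_number(n, edges):
    """Smallest k for which vertices 0..n-1 admit a proper k-coloring
    (no edge joins two same-colored vertices). Brute force."""
    for k in range(1, n + 1):
        if any(all(c[u] != c[v] for u, v in edges)
               for c in product(range(k), repeat=n)):
            return k

def clique_number(n, edges):
    """Size of the largest set of pairwise-adjacent vertices. Brute force."""
    edge_set = {frozenset(e) for e in edges}
    for k in range(n, 0, -1):
        if any(all(frozenset(p) in edge_set for p in combinations(s, 2))
               for s in combinations(range(n), k)):
            return k

c5 = [(i, (i + 1) % 5) for i in range(5)]  # 5 people at dinner holding hands
print(clique_number(5, c5), chromatic_number(5, c5))  # 2 3 -- so C5 is not perfect
```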
posted by Wolfdog at 6:56 AM on May 6, 2008 [4 favorites]


"the juggler's skills cannot yet be engineered. To build a device that could imitate him, Dr. Shannon had to simplify the problem."

Odd way of thinking about engineering. I'd have gone with "have not yet been engineered." Seems like the video was taking the perspective that there are tasks which we can engineer, and tasks that we cannot yet engineer, but give it time and we will.

I don't know that I have ever encountered that attitude before. It sounds like the world of science is new and fascinating and, someday soon, all things will be solvable by technology, but right now there are certain things that simply cannot be engineered.
posted by lostburner at 7:06 AM on May 6, 2008


I feel like I've had a guy who looks just like Douglas Hofstadter, only with a black goatee, jump out of the bushes, beat me into a stupor and then explain chaos theory to me.
Incidentally, I personally would describe that as "my best day ever", but I recognize that's not the default position.
posted by Wolfdog at 7:12 AM on May 6, 2008 [1 favorite]


Working with practical mathematicians (i.e. computer programmers, engineers and scientists) makes me forget how much "real" mathematicians value elegance over clarity.

Just because you don't understand it, doesn't mean it's not clear.
posted by TypographicalError at 7:16 AM on May 6, 2008


...tasks that we cannot yet engineer, but give it time and we will.

Given that evolution has engineered such a system in this case, I think this is reasonable.

Just because you don't understand it, doesn't mean it's not clear.

I would say that if the reader has a basic understanding of a topic, but that a few sentences purporting to explain something on that topic leave them confused and uninformed, those few sentences could use some work. Or: Just because you do understand it, doesn't mean it is clear.
posted by DU at 7:44 AM on May 6, 2008



"the juggler's skills cannot yet be engineered. To build a device that could imitate him, Dr. Shannon had to simplify the problem."

Odd way of thinking about engineering. I'd have gone with "have not yet been engineered."


I think the two are distinguishable. "Cannot yet be engineered" implies that some component of the engineering process has not been developed yet or does not yet exist. E.g., the correct materials don't exist yet, or an automated process to monitor the flight of the juggled balls does not yet exist.

"Have not yet been engineered" implies that everything to engineer it exists, but no one has bothered to build the thing yet. A robot squid has not yet been engineered, but it could easily be done. An intelligent robot cannot yet be engineered.
posted by Pastabagel at 7:50 AM on May 6, 2008


I have been considering a Shannon post for awhile, and I wanted to incorporate two things:
1) His machines. (A whole collection is at the MIT Museum, viewable online here)
2) Shannon's solution to the problem of Maxwell's Demon
Also, I love the fact that his master's thesis is considered the most influential thesis ever written. Next to mine.

Great post, DU!
posted by blahblahblah at 8:32 AM on May 6, 2008


Whoa, I wanted to link to his machines too, but I figured there wouldn't be anything but text mentions and didn't even search. Are these just on the webmuseum or are they at the actual physical museum too? I took my kids to see the Ganson exhibit, which rules, but didn't notice any Shannon stuff there.
posted by DU at 8:45 AM on May 6, 2008


I respect people in the field of communication/information theory. I can think of few things that make me want to jam a screwdriver into my eyesocket more than that.
posted by spiderskull at 8:46 AM on May 6, 2008


(Oh holy crap, this is the same Shannon as the Shannon Expansion? What hasn't he done?!)
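
(For anyone who hasn't run into it: the Shannon expansion says any boolean function splits on one variable as f = (x AND f with x=1) OR (NOT x AND f with x=0). A quick brute-force check on an arbitrary function - my own example, not anything from Shannon:

```python
from itertools import product

def f(x, y, z):                      # an arbitrary 3-input boolean function
    return (x and not z) or (y and z)

for x, y, z in product((False, True), repeat=3):
    # Expand f around its first variable x:
    expanded = (x and f(True, y, z)) or (not x and f(False, y, z))
    assert expanded == f(x, y, z)
print("Shannon expansion holds for all inputs")
```
)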
posted by spiderskull at 8:48 AM on May 6, 2008


OIC, the Shannon stuff is part of this Mark Epstein guy's exhibit. I'll definitely have to go back. Thanks for the tip!
posted by DU at 8:49 AM on May 6, 2008


(Oh holy crap, this is the same Shannon as the Shannon Expansion? What hasn't he done?!)

Shannon was also the co-creator of the first wearable computer, and proved that the analysis and design of electrical switching circuits can be carried out entirely in Boolean algebra.

(When I took my basic electrical engineering courses, the equivalence of basic circuits to symbolic logic was so ingrained in the process, I had to pause for thought when I read that someone had to discover it. And then, I had to pause again when I learned that he discovered it in writing his master's thesis.)
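
The core of that discovery is almost embarrassingly simple to state in code. A minimal sketch (my own toy model, not anything from the thesis itself): switches in series compute AND, switches in parallel compute OR, and a normally-closed contact computes NOT.

```python
def series(a, b):        # current flows only if both switches are closed: AND
    return a and b

def parallel(a, b):      # current flows if either switch is closed: OR
    return a or b

def normally_closed(a):  # contact that opens when its relay is energized: NOT
    return not a

# The classic two-switch hallway light is an XOR built from those pieces:
def hallway_light(s1, s2):
    return parallel(series(s1, normally_closed(s2)),
                    series(normally_closed(s1), s2))

for s1 in (False, True):
    for s2 in (False, True):
        print(s1, s2, "->", "on" if hallway_light(s1, s2) else "off")
```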
posted by CrunchyFrog at 9:04 AM on May 6, 2008 [1 favorite]


There are few scientific theories which are more important and less well known than Information Theory.

For me what's most amazing about it is that before Shannon's paper, Information Theory didn't exist. After Shannon's paper, it was complete and mature, and he had already reduced it to engineering practice. For many theories that process takes decades and the work of hundreds.
posted by Class Goat at 9:12 AM on May 6, 2008 [1 favorite]


There are few scientific theories which are more important and less well known than Information Theory.

It's also one of those theories, like natural selection, that once you know about it, you start seeing it everywhere.
posted by DU at 9:22 AM on May 6, 2008


In giving credit where credit is due, it was my great-uncle (last link there is a self-link) who proposed (and implemented) applying such theories to interconnect the installations of his SAGE project by sending digital transmissions over an analog communications medium, like the telephone.
posted by thanotopsis at 9:54 AM on May 6, 2008


The set of items linked by blahblahblah are only those on display on the first floor of the MIT Museum (the new Mark Epstein gallery); here's the list of the full collection of Shannon items held by the museum. The Shannon-Thorp wearable computer is currently on display at the Heinz Nixdorf Museum in Paderborn, as part of their exhibit on numbers and gambling.
posted by nonane at 10:10 AM on May 6, 2008


Shannon's work was incredibly brilliant in that it took something as mindblowingly abstract as the idea of information ("how complicated does this song sound?") and linked it to something PHYSICAL ("how many joules do I need to apply to the wire?").

I had a communications systems prof give the following simple argument for why digital storage (not even considering digital computation) trumps analog storage. Suppose you would like to copy a song. If you copy the song to a CD, you now have a digital representation of the song (in bits). You can replicate those bits *exactly*. No matter how many times you copy the song, its quality will *never* degrade, assuming you can copy the bits exactly. If you have the song on tape, every time you copy the tape, you add white noise to the song. This noise is additive, and the amount of noise depends on the quality of your hardware. Nevertheless, each copy's quality (SNR, or signal-to-noise ratio) decreases.

When we store a song in digital form, we incur a one-time quality loss due to quantization (and in fact, sometimes not even that: consider new lossless audio compression schemes). After this initial step, assuming your hardware is noise robust and you can correct for small amounts of analog error (any CD player made in the last 20 years can), you never suffer any quality loss on repeat copies.

All of the techniques used to store the song in digital form, to read it off the CD without error, and to store a copy (again making it robust to deficiencies in the analog hardware) are outgrowths of the theory that Shannon invented. Shannon said, "Here's how well you can do," and most of signal processing, both theory and application, since 1949, has basically been trying to answer the question "how am I going to do that well in this particular case?" Some particular cases:

- Reading/writing an optical or magnetic medium like a CD or hard drive.
- Sending bits via modem over the telephone cable.
- Sending audio and video via satellite.
- Sending a digital representation of your voice between your cell phone and a nearby cell tower.
- Sending the result of a computation on your computer's processor to the cache.
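
To make the tape-vs-CD point concrete, here's a toy simulation (my own, with made-up noise numbers): ten generations of copies, analog versus digital.

```python
import math
import random

random.seed(0)
song = [random.uniform(-1, 1) for _ in range(1000)]   # the "analog" master

def analog_copy(signal, noise_std=0.05):
    """Each tape-to-tape copy adds fresh white noise."""
    return [s + random.gauss(0, noise_std) for s in signal]

def quantize(signal, levels=256):
    """The one-time quantization loss when the song goes digital."""
    step = 2.0 / (levels - 1)
    return [round((s + 1.0) / step) * step - 1.0 for s in signal]

def snr_db(original, copy):
    signal_power = sum(s * s for s in original)
    error_power = sum((s - c) ** 2 for s, c in zip(original, copy))
    return 10 * math.log10(signal_power / error_power)

analog, digital = song, quantize(song)
for gen in range(1, 11):
    analog = analog_copy(analog)    # noise accumulates every generation
    digital = list(digital)         # bits copied exactly: no new error
    print(f"gen {gen}: tape {snr_db(song, analog):5.1f} dB, "
          f"CD {snr_db(song, digital):5.1f} dB")
```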

One particularly interesting area of study is in how nature answered this question in sending the information about what your eyes see to your brain. (For those more mathematically inclined, take a look at the paper by Bill Bialek linked to from here).

As an aside:

Shannon's master's thesis, though less well known among information theorists, may have been just as important. In fact, every device you use that requires calculation (read: computer, cell phone, etc.) works the way it does thanks to the following brilliant piece of work:

A Symbolic Analysis of Relay and Switching Circuits

in which Shannon formalized the relationship between relay circuits and boolean algebra. Since all calculation (integer arithmetic, testing of conditions, etc.) can be represented using boolean logic, this particular piece of work basically formalized and streamlined the step that takes a logical problem (e.g., calculate 3+5, or if/else/endif) and turns it into a digital computer circuit - thus paving the way for the digital computer.
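
As a concrete example of "turning a logical problem into a circuit" (my own illustration, not from the thesis): the 3+5 above, computed with nothing but boolean gates.

```python
def XOR(a, b): return (a and not b) or (not a and b)
def AND(a, b): return a and b
def OR(a, b):  return a or b

def full_adder(a, b, carry_in):
    """One column of binary addition, built purely from gates."""
    s = XOR(XOR(a, b), carry_in)
    carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))
    return s, carry_out

def add_bits(x, y):
    """Add two equal-length bit tuples, least significant bit first."""
    result, carry = [], False
    for a, b in zip(x, y):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    result.append(carry)
    return result

# 3 = 011 and 5 = 101, written least significant bit first:
print(add_bits((True, True, False), (True, False, True)))
# [False, False, False, True] -> binary 1000 = 8
```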
posted by zonem at 10:13 AM on May 6, 2008 [1 favorite]


My first day of class in Information Theory the professor asks:
"How many of you know Michael Jordan?" Overwhelming majority affirms
"How many of you know Walter Kronkite?" Overwhelming majority again
"How many of you know Neil Armstrong?" overwhelming again
"How many of you know Claude Shannon?" Barely anybody affirms
He pauses and says soberly "Claude Shannon is to his field what those other luminaries are to theirs, and yet you in this room who aspire to his field know the significant contemporary leaders in Sports, News, and Exploration, but not the most profound influencer of your chosen field." He pauses ... "Your homework is to ponder on that."*

Personally, it was Shannon's Limit that left me awestruck ... what do you mean I can't get more information in that channel? I'll just go to umpty-umpteen-ary QAM and .... oh, what's that about the channel noise preventing me from accurately decoding? Damn that Shannon guy ... killing my dreams of HDTV over 28.8k modem lines. (The back-of-envelope arithmetic is sketched below.)


*I've paraphrased a bit
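
For the curious, here's that arithmetic (my numbers for bandwidth and SNR are typical guesses for a voice-grade line, not forforf's):

```python
import math

B = 3000       # ~3 kHz of usable bandwidth on a voice-grade phone line
snr_db = 30    # a decent line: about 30 dB signal-to-noise ratio

snr = 10 ** (snr_db / 10)
capacity = B * math.log2(1 + snr)   # Shannon-Hartley: C = B * log2(1 + SNR)
print(f"capacity ~ {capacity / 1000:.1f} kbit/s")   # ~29.9 kbit/s
# No amount of umpty-umpteen-ary QAM gets past that; HDTV needs megabits.
```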
posted by forforf at 10:50 AM on May 6, 2008


He was kind of like Feynman / Quincy Jones / Brian Eno in that his reach is far, far wider than is at all obvious. I read a really good book about him a couple years ago, Fortune's Formula; the title is nowhere near as good or interesting as the book.

Sweet post.
posted by From Bklyn at 10:59 AM on May 6, 2008


Absolutely fantastic. Thanks for the post!
posted by tickingclock at 12:03 PM on May 6, 2008


My home storage server is called THROBAC.

Excellent post, thank you.
posted by Skorgu at 12:26 PM on May 6, 2008


Excellent post - I just wrote a term paper in my Coding Theory class on Algorithmic Information Theory, which builds off of Shannon's work to express information content as the length of the shortest program that can produce a given output. It's fascinating stuff - I'd recommend Greg Chaitin's Meta Math! The Quest for Omega if you're at all interested. It's accessible even to motivated high schoolers.
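
A crude way to get a feel for the idea (my own toy, and only an upper bound - Kolmogorov complexity itself is uncomputable): use a general-purpose compressor as a stand-in for "length of the shortest description".

```python
import random
import zlib

random.seed(42)
structured = b"0101" * 2500                                  # 10,000 very regular bytes
noisy = bytes(random.randrange(256) for _ in range(10000))   # 10,000 random-ish bytes

print(len(zlib.compress(structured)))  # tiny: a short program could print this
print(len(zlib.compress(noisy)))       # ~10,000+: no shorter description found
```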
posted by Earl the Polliwog at 2:07 PM on May 6, 2008


Personally, it was Shannon's Limit that awestruck me ... what do you mean I can't get more information in that channel? I'll just goto umpty-umpteen-ary QAM and .... oh, what's that about the channel noise preventing me from accurately decoding? Damn that Shannon guy ... killing my dreams of HDTV over 28.8k modem lines.

What I found interesting about Shannon's Limit is that you can approach it with arbitrarily small probability of error, even in a channel that introduces errors randomly. That somehow, at the cost of slower communication, the errors don't matter.
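
A toy demonstration of that trade (a repetition code over a binary symmetric channel - rate-wasteful, and nothing like the capacity-achieving codes Shannon proved exist, but it shows the error probability falling as you slow down):

```python
import random

random.seed(1)
p = 0.1  # the channel flips each transmitted bit with probability 0.1

def send_zero(repeats):
    """Send the bit 0 'repeats' times; decode by majority vote.
    Returns True if the decoder gets it wrong."""
    flips = sum(random.random() < p for _ in range(repeats))
    return flips * 2 > repeats

for repeats in (1, 3, 9, 27):
    errors = sum(send_zero(repeats) for _ in range(100_000))
    print(f"{repeats:2d} repeats: error rate {errors / 100_000:.5f}")
```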
posted by Earl the Polliwog at 2:11 PM on May 6, 2008


What I found interesting about Shannon's Limit is that you can approach it with arbitrarily small probability of error, even in a channel that introduces errors randomly. That somehow, at the cost of slower communication, the errors don't matter.

Information theory is all about thermodynamics. This was Shannon's real insight. Bit error can never be zero because random thermal noise is always present.
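
The quantity carrying that analogy is Shannon's entropy, which has the same form as Gibbs entropy in statistical mechanics (up to Boltzmann's constant). A quick illustration:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))    # 1.0 bit: a fair coin is maximally uncertain
print(entropy([0.99, 0.01]))  # ~0.081 bits: a nearly deterministic source
```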
posted by three blind mice at 4:45 PM on May 6, 2008


Chaitin should be right up my alley, but he reads too kooky for me.
posted by DU at 6:58 PM on May 6, 2008


There are a number of statues of Claude Shannon, dedicated starting shortly before his death in 2001. A co-worker mentioned that there was a statue in Shannon's home town, and I stopped by on my way through that part of the state... the statue was missing. A few months later, it had returned...

Pointless story? *shrug*

I like the look of these statues for some reason. Seems to convey that he was a likeable guy. :)
posted by ffej at 9:01 PM on May 6, 2008



