The Algorithm: Idiom of Modern Science
January 19, 2008 9:19 PM

The Algorithm: Idiom of Modern Science - an allegory told with iPods as Universal Machines.
posted by loquacious (42 comments total) 6 users marked this as a favorite
 
Uh oh. Me broke Princeton's website?
posted by uaudio at 9:25 PM on January 19, 2008


"Not to stretch the metaphor past its snapping point, the temptation is there for the Algorithmistas (my tribe) to fancy themselves as the Knights of the Round Table and look down on Moore's Law as the Killer Rabbit, viciously elbowing King Arthur's intrepid algorithmic warriors. Just as an abundance of cheap oil has delayed the emergence of smart energy alternatives, Moore's Law has kept algorithms off center stage. Paradoxically, it has also been their enabler: the killer bunny turned sacrificial rabbit who sets the track champion on a world record pace, only to fade into oblivion once the trophy has been handed out."

I'm having a hard time even reading this essay. Tell me not to be ashamed, Metafilter.
posted by jinjo at 9:39 PM on January 19, 2008


I really don't think Bohr said that it's hard to predict the future. I've always heard it attributed to Storm P.
posted by AwkwardPause at 9:40 PM on January 19, 2008


Not to stretch the metaphor past its snapping point, the temptation is there for the Algorithmistas (my tribe) to fancy themselves as the Knights of the Round Table and look down on Moore's Law as the Killer Rabbit, viciously elbowing King Arthur's intrepid algorithmic warriors. Just as an abundance of cheap oil has delayed the emergence of smart energy alternatives, Moore's Law has kept algorithms off center stage.

This part means that since processing power (oil) has been so cheap in recent times (Moore's Law), there has been less interest in algorithms (alternative energy sources).

Paradoxically, it has also been their enabler: the killer bunny turned sacrificial rabbit who sets the track champion on a world record pace, only to fade into oblivion once the trophy has been handed out.

This part I would probably have to read the article to understand, but I find this paragraph so off-putting in its obtuse over-use of analogies, that I don't think I'll be doing that. Also, the website won't load for me.
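To put the cheap-oil reading in concrete terms (my own toy numbers, nothing from the essay): one hardware doubling buys you 2x, but swapping a quadratic algorithm for an n log n one at a realistically big input buys you orders of magnitude more. A rough Python sketch:

```python
import math

# Toy numbers (mine, not the essay's): what one hardware doubling buys you
# versus what a better algorithm buys you at a realistically large input.
n = 10_000_000                       # say, ten million items to sort
quadratic_steps = n ** 2             # an O(n^2) method, e.g. insertion sort
nlogn_steps = n * math.log2(n)       # an O(n log n) method, e.g. mergesort

print("one Moore's Law doubling: 2x")
print(f"better algorithm at n = {n:,}: ~{quadratic_steps / nlogn_steps:,.0f}x")
```

At ten million items the algorithmic switch is worth roughly 400,000x, which is the sense in which cheap hardware has let people ignore the "alternative energy source."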
posted by !Jim at 9:44 PM on January 19, 2008


This is an incredibly poorly written article - it's clear as mud to those who don't already know what he's talking about while garrulous and obvious to those who do.
posted by Ryvar at 9:55 PM on January 19, 2008 [3 favorites]


Coral cache link.
posted by loquacious at 9:57 PM on January 19, 2008


that was painful, i think he's trying to write in the style of Douglas Hofstadter's GEB
posted by bhnyc at 10:02 PM on January 19, 2008 [1 favorite]


Holy god, I couldn't even get past the 5th paragraph. Has Chazelle always written stuff like this? I've read a couple of his papers, and they were pretty good (and not overflowing with ego or anything), so this was a total surprise.

Man, I'm totally weirded out now. Feels like going over to some seemingly normal acquaintance's house and suddenly discovering his vanity press sci-fi epic on the shelf.
posted by equalpants at 10:09 PM on January 19, 2008 [2 favorites]


bhnyc, I agree. It would be one thing to parody Hofstadter, but attempting to write like him is a disaster for anyone but Hofstadter.
posted by CheeseDigestsAll at 10:14 PM on January 19, 2008


I got about 20% of the way through it, and started skipping. I didn't ever pick up on exactly what his thesis was. It felt to me like I was being fed double-talk.

I'm renowned for writing long, rambling essays, but that one's got even me topped.
posted by Steven C. Den Beste at 10:41 PM on January 19, 2008 [1 favorite]


That was a confused mess.
posted by empath at 10:45 PM on January 19, 2008


At first I thought I was weird because I'm enjoying the article (I'm about halfway). Then I realized it's probably because I have a CS degree. Or maybe I am weird and I just subscribe to this guy's brand. Either way, thanks loquacious.
posted by systematic at 10:55 PM on January 19, 2008


I just skimmed, but it has interesting thoughts, albeit scattered. Also, a more apt metaphor for "the universal computer" might be the iphone (but for how long). The author frequently digresses in order to articulate lay concepts that even the most feebly-endowed nerd could comprehend.
posted by sswiller at 11:01 PM on January 19, 2008


yeah, i have an EE/CS background... and even then, this was horrible. couldn't get through it.
posted by joeblough at 11:15 PM on January 19, 2008


Not horrible, but it sounds like someone restating much of Hofstadter's GEB.
posted by milestogo at 11:22 PM on January 19, 2008


That was a beautiful article. The secret to enjoying it is to slow down and let each sentence sink in.
posted by king walnut at 11:23 PM on January 19, 2008


I enjoyed it, but it's full of so many references that decoding those while learning new material is a bit too much to ask.

Also, how is this an allegory? He's saying the ipod is a universal machine.


Also, a more apt metaphor for "the universal computer" might be the iphone (but for how long).


Again, the ipod is used because it's surprising to people that what they think of as only a music player is as powerful (expressively) as the most powerful supercomputer in the world. The iphone is much more general purpose and the fact that it can do whatever you ask it to wouldn't be as surprising.
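For the curious, "universal" just means the thing can simulate any other machine if you give it enough time and scratch space. Here's a toy Turing machine simulator (my own sketch, nothing the article provides); the point is that anything capable of running this little loop is, in the expressive sense, on equal footing with the supercomputer:

```python
# A toy Turing machine simulator (my own sketch, not from the article).
# Anything that can run this loop can, in principle, compute whatever any
# other computer can -- that is all "universal" means here.

def run(rules, tape, state="start", blank="_"):
    tape = dict(enumerate(tape))   # sparse tape: position -> symbol
    pos = 0
    while state != "halt":
        symbol = tape.get(pos, blank)
        write, move, state = rules[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# Example machine: add one to a unary number (a string of 1s).
rules = {
    ("start", "1"): ("1", "R", "start"),   # scan right over the 1s
    ("start", "_"): ("1", "R", "halt"),    # hit the blank, write one more 1
}

print(run(rules, "111"))   # prints "1111"
```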
posted by null terminated at 11:54 PM on January 19, 2008


Phil Cubeta meets Richard Feynman.
posted by vapidave at 12:16 AM on January 20, 2008 [1 favorite]


As a fan of GEB and an avid reader of anything treating this area, I think it's terrible. Almost as bad as this book which I implore you NOT to read. Berlinski had a great idea, but wound up just writing to impress his ego.

If anyone knows of a book that has treated the subject of both the history and the philosophy of the algorithm in an approachable manner, please suggest it to me via memail.
posted by zap rowsdower at 12:20 AM on January 20, 2008


** Offer of Turing-equivalence only valid on ∞GB model
posted by Pyry at 12:27 AM on January 20, 2008 [2 favorites]


Just to make sure there's no hard feelings: I like you just fine, loquacious. It's this article you linked to that I hate.
posted by Ryvar at 12:56 AM on January 20, 2008


This style is hilariously obtuse! The sentence that everyone is up in arms about ("the temptation is there for the Algorithmistas [...]") mixes at least four metaphors. It doesn't just mix them, it purees them. That sentence is the worst offender I can find, but there's probably a worse one in there somewhere.

This one is pretty terrible too: "Only when the Algorithm becomes not just a body but a way of thinking, the young sciences of the new century will cease to be the hapless nails that the hammer of old math keeps hitting with maniacal glee."

The author reminds me of the Timecube guy (but educated), or maybe Thomas Friedman (but educated).
posted by painquale at 1:14 AM on January 20, 2008


I think the central failing of this piece (which I enjoyed) is that it's poorly scoped. He tries to cover most of the current state of computer science in, what, a half-dozen pages? He spends maybe a page on RSA fercryinoutloud. It's not like physics, where you can refer back to familiar metaphors like balls falling on trains to explain relativity; to most people computers are just magic.

In my experience it takes a smart, educated person with no prior experience in higher math or computer science a few hours of patient guided exploration to really grok something like prime factorization, and even then it is vanishingly unlikely that that knowledge will last beyond the confines of the explanation itself.
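If you want to try that guided exploration yourself, the asymmetry RSA leans on fits in a few lines (toy numbers, nothing like real key sizes): multiplying two primes is one step, and undoing it by trial division is the slow direction.

```python
# Toy illustration of the asymmetry RSA relies on (tiny numbers, nothing
# like real key sizes): multiplying two primes is easy, recovering them is slow.

def factor(n):
    """Find a prime factor of n by trial division, the naive slow direction."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1   # n itself is prime

p, q = 10_007, 10_009          # two smallish primes
n = p * q                      # the easy direction: one multiplication
print(factor(n))               # the hard direction: ~10,000 trial divisions here,
                               # and astronomically many at real key sizes
```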

If I were really motivated I'd rant about the state (or lack thereof) of education, but it's early and I haven't had tea.
posted by Skorgu at 4:33 AM on January 20, 2008 [1 favorite]


Hey, Skorgu, after you've had your tea, would you mind doing the rant? I'd like to read a sorry-state-of-education rant, if you have a good one.
posted by cgc373 at 4:51 AM on January 20, 2008


Yay! It was not just me! I really think I would like to see what is in this person's brain, but he is putting too many things in my way! To couch this in geek metaphors the way he likes it, it's like he's guiding us through the mines of Moria when we are just trying to grab a half at the Prancing Pony.
posted by Deathalicious at 6:11 AM on January 20, 2008


Oh man, and he "begs the question" too. Shame on you!

Oh, can I just say that Metafilter has permanently ruined me. I cannot hear or see "begs the question" without feeling a small pain in my synapses. Boo!
posted by Deathalicious at 6:13 AM on January 20, 2008


Apparently, Gordon Moore skipped that class. In 1965, the co-founder of semiconductor giant Intel announced his celebrated law: Computing power doubles every two years. Moore's Law has, if anything, erred on the conservative side. Every eighteen months, an enigmatic pagan ritual will see white-robed sorcerers silently shuffle into a temple dedicated to the god of cleanliness, and soon reemerge with, on their faces, a triumphant smile and, in their hands, a silicon wafer twice as densely packed as the day before. No commensurate growth in human mental powers has been observed: this has left us scratching our nonexpanding heads, wondering what it is we've done to deserve such luck.

You know, Moore's Law is kind of irritating me lately. I mean, why do we have to wait 18 months for faster computers? It's not like the laws of physics change; that's just what the economic pressure produces. When AMD really ramped up production on their K7/Athlon chips, the average x86 speed doubled quicker than that. But that's tapered off lately.

So in my view the reason semiconductors take 18 months to double in density is simply laziness on the part of the chip companies. They only need to double every 18 months to keep up with the competition, so that's what they do.

On the other hand, it seems like we are approaching the end of the ride, actually. I mean, like I said, chip speed has pretty much topped out, and my understanding is that even smaller feature sizes require lower speeds. The new quad-core AMD chips are clocked lower than their dual-core counterparts.
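(The compounding is easy to eyeball, by the way; quick back-of-the-envelope with my own round numbers:)

```python
# Back-of-the-envelope (my own round numbers): how much the doubling period
# matters once it compounds for a decade.
years = 10
for months in (24, 18):          # "every two years" vs "every eighteen months"
    doublings = years * 12 / months
    print(f"{months}-month doubling over {years} years: ~{2 ** doublings:,.0f}x")
# 24 months gives ~32x; 18 months gives ~102x over the same decade.
```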
posted by delmoi at 7:24 AM on January 20, 2008


The author reminds me of the Timecube guy (but educated), or maybe Thomas Friedman (but educated).

Educated stupid.
posted by grobstein at 8:53 AM on January 20, 2008


Twist my arm whydontya.
Full disclosure, I'm not professionally qualified in the realm of education, I'm just some guy. Some guy with tea. Mmmm tea.


One of my school-aged relatives (How's that for vague?) was doing her homework. It involved reading one of those insipid "magazines" and looking up unfamiliar words. Naturally it was a science "article," on the rotation of the earth and its revolution around the sun, appropriate grade school stuff.
[School Aged Relative finishes reading]
SAR: Done! [moves to leave]
Skorgu: Not so fast, what does it say?
SAR: Revolution, rotation, um...
Skorgu: No idea huh?
SAR: Not really
Skorgu: Are you going to go over it in class? Do they talk about it, is there a lesson on this stuff?
SAR: [blank stare]
Cue horribly clichéd training montage of simulating orbital dynamics via small children getting dizzy. Cut to vomit scene.
I fail to see the point of assigning (boring) reading in a subject when you're not going to cover that subject in class. Now the students will associate the subject with boring reading, never take an elective science class in high school, and grow up to be creationists and neocons. Or something.

And of course there are metric tons of obstacles between even the best teachers and actually imparting knowledge into the malleable lumps of grey matter and spaz placed at their mercies. You couldn't have kids running around spinning like that, think of the (lawsuits|parents|cleanup costs) no matter the value of the lesson or the power of a visceral understanding. Sure, occasionally a truly brilliant spark will find a way to actually sneak some learning past the bureaucracy --and thank god for all of them-- but that's hardly a ringing endorsement of the system.

More to the point, they're teaching the wrong things. Knowing that the earth both rotates and revolves is important, sure, but by itself it gives exactly the wrong impression about science. Science, to students, is facts. Boring facts to be memorized and recited at the appropriate moments. Bubbles with facts to be filled in, essays about facts to be reiterated to the appropriate length.

Science is about understanding. Science is condensing fact from the vapor of nuance1. Science is about being able to look at an arbitrary and capricious reality and penetrate to the core of rationality underneath it all, and that's a lesson that just isn't taught. Of course there is a whole world of "higher" math that's really a prerequisite for this kind of understanding, and that's too hard for the childruun to understand, they might get bad grades.

There is no easy, simplified way to get across the chasm between fact and understanding without climbing down to the streambed of math and wading across.

And here's where my actual point lies. Alton Brown says Food is Love; well, here's Skorgu's Corollary: Teaching is Love. If you don't truly care that your student sees the wonder and beauty of the world behind the world, that lesson will be lost. Great teachers can do this with whole classes for a while, until they burn out and are reduced to standardized tests. Good teachers can do this with any given kid one at a time; tutoring works.

But all the teachers in the world won't do half as much good as parents. Massively parallel education. Spin your kids around, take 'em digging, buy a telescope. Learn what they're learning and help teach it, better yet learn what they're interested in with them.

Or, you know, just keep giving them worksheets. That works good too.
posted by Skorgu at 8:59 AM on January 20, 2008 [9 favorites]


Thanks, Skorgu. Your tea is mighty! (Not my tea. [NOT MY-TEA-IST.])
posted by cgc373 at 10:15 AM on January 20, 2008


Gah, and it's so simple as well. Computer metaphors need to be about cars, turtles or hares. Fuck one King Arthur.
posted by bonaldi at 12:11 PM on January 20, 2008


Not to stretch the metaphor past its snapping point, the temptation is there for the Algorithmistas (my tribe) to fancy themselves as the Knights of the Round Table and look down on Moore's Law as the Killer Rabbit, viciously elbowing King Arthur's intrepid algorithmic warriors.

Here's an algorithm for you: ("Not to..." X) == ("I am about to seriously..." X).
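(Spelled out as actual code, it really is a one-liner; a hypothetical sketch:)

```python
# The joke algorithm, spelled out (a hypothetical one-liner, obviously):
def translate(disclaimer):
    return disclaimer.replace("Not to ", "I am about to seriously ")

print(translate("Not to stretch the metaphor past its snapping point..."))
# prints: I am about to seriously stretch the metaphor past its snapping point...
```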
posted by Laugh_track at 2:30 PM on January 20, 2008


That writing sucked, and that article had so many poorly-defined points I could lie down on it and not injure myself.
posted by davejay at 4:01 PM on January 20, 2008 [2 favorites]


Congratulations, davejay! You have authored the Metaphor of the Week!
posted by Ryvar at 5:53 PM on January 20, 2008


Ugh. This kept getting close to interesting, but it was all metaphor and no meat.

"Wow, that NZ algorithm is sitting down to a hearty dinner!"
"What?"
"The PCP algorithm solves your biscuit making."
"Wait, how?"
"Algorithms are a rabbit. They're all rabbits."

This is an incredibly poorly written article - it's clear as mud to those who don't already know what he's talking about while garrulous and obvious to those who do.

The example there, for me, would be when he references a "two-faced Janus."
posted by klangklangston at 6:43 PM on January 20, 2008


Not to stretch the metaphor past its snapping point, the temptation is there for the Algorithmistas (my tribe) to fancy themselves as the Knights of the Round Table and look down on Moore's Law as the Killer Rabbit, viciously elbowing King Arthur's intrepid algorithmic warriors.

I always thought metaphors didn't feature the terms 'like' or 'as'.
posted by JakeEXTREME at 6:48 PM on January 20, 2008


No. Similes are metaphors.
posted by klangklangston at 7:23 PM on January 20, 2008


Hmmm... metaphors explain something in a way which implies they are one and the same (not literally), whereas similes use comparison in order to explain something, so aren't they actually different? I mean, why would they bother teaching metaphor versus simile in grade school if they were the same?

Or maybe I'm not getting the joke?
posted by JakeEXTREME at 7:36 PM on January 20, 2008


No, similes are LIKE metaphors.
posted by mccarty.tim at 8:02 PM on January 20, 2008 [1 favorite]


"I mean, why would they bother teaching metaphor versus simile in grade school if they were the same?"

By way of metaphor, not all rectangles are squares.
posted by klangklangston at 9:42 AM on January 21, 2008


Actually, you're both right:

From Wikipedia: According to this definition, then, "You are my sunshine" is a metaphor whereas "Your eyes are like the sun" is a simile. However, some describe similes as simply a specific type of metaphor (see Joseph Kelly's The Seagull Reader (2005), pages 377-379). Most dictionary definitions of both metaphor and simile support the classification of similes as a type of metaphor, and historically it appears the two terms were used essentially as synonyms.
posted by Atom Eyes at 9:05 AM on January 23, 2008


Well, no, Jake started out by carping that the claim of stretching a metaphor was incorrect, as the author was stretching a simile. I pointed out that metaphors are similes. He wondered what the difference was, and I told him (obliquely).
posted by klangklangston at 9:52 AM on January 23, 2008



