transcendental numbers rumble in the technium
September 20, 2009 8:00 AM   Subscribe

Extropy
How did life arise? What is information? In his recent dispatches from The Technium, Kevin Kelly would say extropy (cf. negentropy & Prigogine). [previously 1|2]
posted by kliuless (70 comments total) 10 users marked this as a favorite
 
From the first link:

When entropy (disorder) increases, it produces "more information" as in more bits.

I stopped reading there. Does Kevin Kelly make a habit of being wrong, or is this a special case?
posted by logicpunk at 8:32 AM on September 20, 2009 [5 favorites]


One study estimated the earth harbored 10^30 single-cell microbes. A typical microbe, like a yeast, produces one one-bit mutation per generation, which means one bit of unique information for every organism alive.
Only if there are at least 10^30 bits of information in the typical organism's DNA (and even then only possibly), but the three billion base pairs of the human genome is essentially zero percent of 10^30.
posted by Flunkie at 8:51 AM on September 20, 2009 [1 favorite]


It would behoove this guy to have a basic understanding of information theory before he starts inventing terminology and making claims that grandiose. If he has even looked at Shannon and Weaver, he clearly did not understand it.
posted by idiopath at 9:03 AM on September 20, 2009 [4 favorites]


Wow -- how appropriate; we just had this metatalk and related fpp regarding when you can tell if someone is faking specialized knowledge, and here's an example of a giant plate of gibberish by someone who clearly has no idea what he's talking about. Well played!
posted by Frobenius Twist at 9:06 AM on September 20, 2009 [5 favorites]


He also has some really perverse ideas about complexity. Science does not have a hardon for computers because they are complex, but rather because they are simple.
posted by idiopath at 9:09 AM on September 20, 2009


OK, now I've finished it. That's a whole lotta handwavin'.

Besides, most of this ground has already been covered.
posted by Flunkie at 9:17 AM on September 20, 2009


I knew that Kevin Kelly was always a bit woo-woo handwavey, but at the same time he seemed to have his feet firmly planted insofar as he has a house and a family and a job and a regular pundit's gig.

That was like timecube with a coat of fresh paint, and the pure, distilled essence of all of Wired's magazine's techno-optimism coursing through it.
posted by fatbird at 9:19 AM on September 20, 2009


My favorite pseudo-meaningful ranting is Francis E. Dec.
posted by idiopath at 9:22 AM on September 20, 2009 [1 favorite]


I see that several people here are, to some degree at least, familiar with Kevin Kelly. I am not; this is the first I remember noticing him. I poked around on his Wikipedia page a little bit, and noticed it saying that he is responsible for the following "law", called by him "the Maes-Garreau Law":
Most favorable predictions about future technology will fall within the Maes-Garreau Point
With "the Maes-Garreau Point" defined as:
The latest possible date a prediction can come true and still remain in the lifetime of the person making it
Can someone please explain to me what this is supposed to mean?

Because it sure seems like it means "Over half of all predictions about technological improvements will come true within the lifetime of the person making the prediction".
posted by Flunkie at 9:28 AM on September 20, 2009


I suppose it would really ruin this guy's day for someone to remind him what a vanishingly small bit of fluff the Earth is compared to the Sun, much less the entire universe.
posted by localroger at 9:30 AM on September 20, 2009


Or, more specifically, "within the lifetime of the person making the prediction or before whatever point in time they arbitrarily specify while making the prediction, whichever comes first".
posted by Flunkie at 9:31 AM on September 20, 2009


Or, wait, maybe it's not saying the predictions will come true. It's saying the timeframe imposed by the predictor upon the prediction will come within the lifetime of the predictor?

As in, I might predict flying cars by the year 2030, but I'm not going to predict flying cars by the year 2230?
posted by Flunkie at 9:35 AM on September 20, 2009


It'll be interesting to see if something like this grows into a techno-libertarian religion in a hundred years or so.
posted by StrikeTheViol at 9:41 AM on September 20, 2009


Can someone please explain to me what this is supposed to mean?

It means transhumanism is a fantasy of wish-fulfillment.
posted by 0xdeadc0de at 9:46 AM on September 20, 2009 [4 favorites]


Oh, Transhumanism is definitely already a religion. It is better documented than most fringe startup cults because the believers tend to be the kind of people who know how to put up a web page. You would be amazed how many very logically minded programmers think that they will attain immortality by uploading their consciousness onto a computer. For certain values of "immortality", they may be correct, but they sure better have a better case than what this guy argues, because he is talking out his ass.
posted by idiopath at 9:50 AM on September 20, 2009


The entry where he explains the Technium project is strange.

He talks about how he lived for many years without using any technology --- there's some line about how everything he reached for was made of wood, stone, or fiber --- and he boasts about how he still doesn't own a TV and keeps his distance from technology despite building his whole career around reviewing tech gadgets.

That, to me, is like saying, "I write about sex for a living but I don't actually have sex. For many years of adulthood I never got laid."

His "distance" from the thing he writes about does not give him any special authority. It appears that he attempts to seem Serious by posing as an ascetic.
posted by jayder at 9:58 AM on September 20, 2009 [1 favorite]


because he is talking out his ass.

Well then, if he can upload his ass he should be all set.
posted by joe lisboa at 9:59 AM on September 20, 2009 [4 favorites]


I see that several people here are, to some degree at least, familiar with Kevin Kelly.

17555.
posted by You Should See the Other Guy at 10:00 AM on September 20, 2009 [1 favorite]


joe lisboa: "if he can upload his ass"

I think that topic pertains to another front page thread.
posted by idiopath at 10:04 AM on September 20, 2009 [1 favorite]


I think I'll just stick to his "cool tools" blog.
posted by qwip at 10:08 AM on September 20, 2009


How did life arise?

The front page of MeFi contains the answer, right below this post:

It started as a simple term project for an MIT class
posted by ricochet biscuit at 10:31 AM on September 20, 2009


I have been reading The Technium since the beginning.

Kevin Kelly has read more books and met more scientists than all of us combined.
His curiosity is insatiable.
He is playing with much more data than I would go through in three lifetimes.
Kevin Kelly is in a quest for order in chaos and he reports what he has found: his endeavor is daring, courageous and enlightening.

There are errors, there are pitfalls, but this is a tremendous enterprise.
He is trying to have (for himself) an overview of our state of knowledge encompassing all sciences. And he is sharing it.

I have no doubt that The Technium is one of the defining works of the beginning of the 21st century.

And he is a good storyteller.
I don't agree with everything.
Agreement is not a necessity for enjoying and learning.

(As a note: being awed by a process is not the same thing as religion or even optimism.)
posted by bru at 10:37 AM on September 20, 2009 [2 favorites]


OK, why has no one told me of Francis E Dec before? (Or, Wow, it sure is nice out here without that rock covering me!)

If a Markovian Dec rant generator produces text that is indistinguishable from real Dec rants, does that count as uploading Dec's consciousness to a computer? This may be a living transhumanist's best hope for immortality; low-complexity babble-stream in a particular writing style is easy by current standards. We even have the tools to do the "transfer" of consciousness, and the transfer process even leaves the original transhumanist intact. Double your blog output! The biological transhumanist can, if desired, give orders to his silicon twin to abuse a technical term of choice if the blog is going to have a single focus. It's probably even possible to create a module for creating nonsense images that pose as graphs.

These can't be new ideas, surely a WordPress plugin already exists? And perhaps a SiliconAudience plugin for generating comments as well?
posted by Llama-Lime at 10:38 AM on September 20, 2009
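
A Markov-chain babble generator of the sort Llama-Lime describes really is only a few lines. This is a minimal sketch (the function names and the toy training string are mine): each next word is drawn from the words that followed the current word in the source text.

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words that followed it in the source."""
    chain = defaultdict(list)
    words = text.split()
    for current, nxt in zip(words, words[1:]):
        chain[current].append(nxt)
    return chain

def babble(chain, length=12, seed=0):
    """Walk the chain, restarting at a random word on a dead end."""
    random.seed(seed)
    word = random.choice(list(chain))
    out = [word]
    for _ in range(length - 1):
        followers = chain.get(word)
        word = random.choice(followers) if followers else random.choice(list(chain))
        out.append(word)
    return " ".join(out)

chain = build_chain("the gangster computer god worldwide secret containment policy " * 3)
print(babble(chain))
```

Trained on a real corpus of rants instead of this toy string, the output is locally plausible and globally meaningless, which is rather the point being made above.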


Can someone please explain to me what this is supposed to mean?

I think it means that a prediction about something that might happen before you die is likely to be better than one about something hundreds of years off.
posted by InfidelZombie at 10:39 AM on September 20, 2009


bru: "Kevin Kelly has read more books and met more scientists than all of us combined"

And he talks about things beyond his level of understanding while apparently unaware of his level of ignorance. Many of the things he talks about are not things I know much about, but I know enough about information theory and cybernetics to know that when he talks about information and emergent behavior he has some severe misapprehensions about these subjects.

Which is fine. As we learn, we misunderstand things, and talking it out is part of learning.

Except for the fact that he seems to be making novel claims based on these misapprehensions, and these claims are the subject we are talking about. So the fact that the claims are on extremely shaky ground is directly apropos.
posted by idiopath at 10:58 AM on September 20, 2009 [2 favorites]


Kevin Kelly has read more books and met more scientists than all of us combined
Let's say the average MeFite is, I don't know, thirty years old, and make a very conservative guess of having read two books per year. Then Kevin Kelly has read, on average, about one book for every five minutes of his life, including while he was sleeping.

You should flesh this claim out to a 1500 word article, and get it published in Wired.
posted by Flunkie at 11:07 AM on September 20, 2009 [6 favorites]
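
Flunkie's back-of-the-envelope figure checks out if you assume MetaFilter had on the order of 100,000 accounts in 2009 and that Kelly was about 57 at the time (both figures are my assumptions, not the thread's):

```python
# Sanity-check of Flunkie's arithmetic.
# Assumed: ~100,000 MeFites; Kevin Kelly roughly 57 years old in 2009.
users = 100_000
books_per_user = 30 * 2            # 30 years old, 2 books per year each
total_books = users * books_per_user

kk_minutes_alive = 57 * 365.25 * 24 * 60
minutes_per_book = kk_minutes_alive / total_books
print(round(minutes_per_book, 1))  # about 5.0 minutes per book
```

So "more books than all of us combined" does indeed imply roughly one book every five minutes, waking or sleeping, for his entire life.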


The guy certainly seems to like coining gratingly ugly new terms, or nauseologisms if you will.
posted by TheophileEscargot at 11:16 AM on September 20, 2009 [5 favorites]


Kevin Kelly has read more books and met more scientists than all of us combined.

All of who? Metafilter? This thread? His naysayers? Either way, this statement is beyond ludicrous and had me stop reading your post after the statement the way others stopped reading his article when they came across something equally idiotic.
posted by You Should See the Other Guy at 11:29 AM on September 20, 2009


Agreement is not a necessity for enjoying and learning.

Okay, here's the thing. If he were writing a fictional piece, there would be truth to this: perhaps I could disagree with the scientific ideas he's putting forward, but still enjoy some of the creativity he's showing. But he's not writing a piece of fiction: he's putting forward scientific claims, and therefore he's subject to a much more rigorous set of rules than just whether or not we agree.

Okay, you know what? It's Sunday afternoon, I've got some time, and right off the bat the garbage he spews in the first link made me see red, so screw it: let's just focus on one -- one -- paragraph from that link. Here it is:

The rules behind the fundamental behavior of the elemental particles and energies that make up our reality are very spare, almost naked. It might take books and books to explain them in words, but the laws themselves can be compressed into a very small amount of information. If you were to take all the known laws of physics, formulas such as f=ma, E=mc^2, S= K log W, and more complicated ones that describe how liquids flow, or objects spin, or electrons jump, and write them all down in one file, they would fit onto a single gigabyte CD disk. Amazingly, one plastic plate could contain the operating code for the entire universe. Even if we currently know only 0.1% of the actual number of laws guiding universal processes, many of which we are undoubtedly still unaware of, and the ultimate file of physical laws was 1,000 times bigger, it would fit onto one high-density "disk" in a few years from now.

I'll take this point by point.

1) He appears to be claiming* that the laws of physics are equivalent to the universe -- indeed, he calls them the "operating code." Without even getting into the can o' worms regarding whether or not there will ever be a finite list of such laws, let me just say that this is an absurd position to take. Indeed, physicists only model the world; a physical model is only an approximation to reality. It is absolutely not the same as reality itself -- claiming that it is, is equivalent to claiming that you will always get perfect, completely error-free answers from any physical model you run. Which is, of course, absurd.

2) Continuing in this vein, he appears to be endorsing a billiard-ball model of the universe arising from Newtonian dynamics. But the rise of quantum theory showed that this is absolutely not how the universe behaves: there is randomness inherently built into the very structure of spacetime. Hence even a computer running a full set of the "laws of physics" would only predict one of many possible future universes, based on a set of initial parameters.

3) But wait! There's more! Physics is based on a series of physical models. An important aspect of physical models (such as the standard model of particle physics, Lagrangian mechanics, special relativity, etc.) is that they all require inputs. That is, a model by itself is like a computer with no programs to run: it says nothing until it has an input. Thus, what Kelly is saying is equivalent to the claim that since the information of your operating system can be held on a DVD or two, so can all of the information of all programs that can be run on your computer. This is obviously wrong.

I.e., even if I were to grant him that the "laws of physics" could accurately and perfectly model the universe, he would still be disastrously wrong on this point, because he's forgetting the importance of the input parameters. And precisely how complex is that set of input parameters? That's right -- the set of input parameters is exactly the whole universe itself. I would be quite surprised if that could fit on a CD.

*I wrote "appears to be" for a reason: Unlike someone who might actually be writing a scientific paper, Kelly doesn't define any of his terms, and it is extraordinarily difficult to actually parse what he's saying. Oh, wait: look at this --

We can not make an exact informational definition of extropy . . . .

. . . oh. OH. So you wrote all of that, and there isn't even a definition? I see.

posted by Frobenius Twist at 11:41 AM on September 20, 2009 [5 favorites]


And fuck it! One more thing:

Measured by the amount of digital storage in use, the technium today contains 487 exabytes (10^20) of information, many orders smaller than nature's total, but growing.

Astounding. Even though we don't know quite what the technium is, we can precisely measure how much information it contains! Wait, what's that you say, Kelly?

Until we clarify our language the term information is more metaphor than anything else.

Wow! The technium contains 487 exabytes of metaphor! That sure is a lot of metaphor!
posted by Frobenius Twist at 11:49 AM on September 20, 2009 [4 favorites]


"Even counting vast tracks of agriculture, the technium entails fewer than one percent of the atoms on the Earth's land surface."

Atoms on the earth's land surface? What exactly does that mean? The very top layer of single atoms? Plants, but only the parts above ground? Snow, or just the top layer of atoms in the water molecules that make up the snow? If a bull shits, when does the bullshit stop resting on the earth's surface and become a part of the surface?
posted by longsleeves at 12:00 PM on September 20, 2009


I think I'll just stick to his "cool tools" blog.

Kevin Kelly never really wrote it so much as edited submissions, and he hasn't even done that for at least three years — he has an ever-changing cast of other people doing that hard couple of hours a week of work for him.
posted by blasdelf at 12:13 PM on September 20, 2009


Kevin Kelly has read more books and met more scientists than all of us combined

Saying that he knows more about the subject than we can possibly imagine will probably not earn his embattled viewpoint much succor here.
posted by Blazecock Pileon at 12:24 PM on September 20, 2009 [2 favorites]


I was suspicious before I read the FPP, so I looked up the wikipedia entry on "extropy" first. Suspicions mounting, I read (part of) the main link. There my suspicions were confirmed by the author's steadfast refusal to define any terms whatsoever, immediately followed by claims already torn apart in this thread.

I now feel confident to say that there is no need whatsoever for a term that means "vaguely but not specifically the opposite of entropy". A whole lot of people would benefit from learning what entropy is.

I mostly agree with the nastier commenters above, but I wanted to make a slight disagreement <derail> with Frobenius Twist. Quantum mechanics doesn't automatically have randomness built in. The many-worlds interpretation does not have wave-form collapse and therefore there is no randomness. So if you're willing to think of quantum fields as the ultimate description of reality, computers simulating the appropriate wave equations could chug out every possible universe. Well, actually they couldn't, it's easy to prove that no possible computer could perform enough operations or house enough information. But still... </derail>
posted by Humanzee at 12:27 PM on September 20, 2009


Only if there are at least 10^30 bits of information in the typical organism's DNA (and even then only possibly), but the three billion base pairs of the human genome is essentially zero percent of 10^30.
posted by Flunkie at 8:51 AM on September 20 [1 favorite]


But, the calculation we need is not how many base pairs, but how many possible unique arrangements. This gives 4^(3 billion) possible sequences, which is a MUCH larger number than 10^30. You only need 50 base pairs to get 10^30.

However, his use of yeast as a typical microbe is pretty dubious - it has about twice the genome size of a typical bacterium, and bacteria make up the vast majority of life. The mutation rate of yeast is also about 0.003 per replication, not 1. Bacteria are even lower; E. coli is 0.0002.
posted by scodger at 12:40 PM on September 20, 2009
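
The two counts being argued over are easy to check: each base is one of four letters, i.e. log2(4) = 2 bits per base pair, so around 50 base pairs already give more than 10^30 distinct sequences. A quick sketch:

```python
import math

# Each DNA base is one of 4 letters, so log2(4) = 2 bits per base pair.
bits_per_bp = math.log2(4)

# 50 base pairs give 4^50 distinct sequences -- already more than 10^30.
assert 4 ** 50 > 10 ** 30

# Base pairs needed to reach 10^30 distinct sequences:
needed = math.log2(10 ** 30) / bits_per_bp
print(round(needed, 1))  # about 49.8
```

Note this computes the number of possible arrangements, which is exactly the distinction Flunkie presses later: capacity in arrangements is not the same thing as bits of information per organism.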


But, the calculation we need is not how many base pairs, but how many possible unique arrangements.
No. This is completely incorrect. That's not "bits".

And before you argue that he might be using "bit" to mean "piece", he was quite clearly using it in the one-or-zero sense, not just in this specifically, but throughout the whole article. In fact, he explicitly was trying to make points based on it.
posted by Flunkie at 2:15 PM on September 20, 2009


Quantum mechanics doesn't automatically have randomness built in. The many-worlds interpretation does not have wave-form collapse and therefore there is no randomness.

True, that's a good point. I think my main point still stands, though, because a computer (quantum or not) will never predict exactly the future of this universe, but rather all of the many possible futures.
posted by Frobenius Twist at 2:18 PM on September 20, 2009


My brief summary of this thread: MeFites own MeFi's own kk.
posted by grouse at 2:49 PM on September 20, 2009


Quantum mechanics doesn't automatically have randomness built in. The many-worlds interpretation does not have wave-form collapse and therefore there is no randomness.

Not strictly on topic, but I've always wondered: If many-worlds is true, how come I always end up in the shitty, depressing universe?
posted by Justinian at 3:12 PM on September 20, 2009 [1 favorite]


"Kevin Kelly has read more books and met more scientists than all of us combined.
His curiosity is insatiable.
He is playing with much more data that I would go through during three lifetimes.
Kevin Kelly is in a quest for order in chaos and he reports what he has found: his endeavor is daring, courageous and enlightening"
And that makes him right?
Seriously, why did he not listen when all those scientists told him he was spouting bollocks?
posted by edd at 3:16 PM on September 20, 2009


Kevin Kelly's tears can cure cancer. Except he has never cried.
posted by Justinian at 3:34 PM on September 20, 2009


"When entropy (disorder) increases, it produces "more information" as in more bits."

I stopped reading there. Does Kevin Kelly make a habit of being wrong, or is this a special case?

I think he meant that when a system has any order to it (as opposed to complete randomness) it takes fewer bits to describe, due to patterns which can be compressed. A completely random system would need to be described one-to-one, with no compression of data.

The question in the post What is information? got me excited and I clicked on the second link. Maybe I missed it, but where is the question answered?

The idea of information being a fundamental force like gravity caught my attention years ago and I haven't run across anything since. It was (if I remember correctly) proposed by Wojciech Zurek. I'm a complete layman and some of you seem to really know what you're talking about. I find the idea fascinating and if any of you know what I'm referring to and can point me in the right direction, I'd be grateful.
posted by Mike Buechel at 3:37 PM on September 20, 2009


Maybe I missed it, but where is the question answered?

Found it. Sorry, I missed that first link.
posted by Mike Buechel at 3:44 PM on September 20, 2009


if any of you know what I'm refering to and can point me in the right direction, I'd be grateful.

http://en.wikipedia.org/wiki/Information
http://en.wikipedia.org/wiki/Information_theory


is where I would start.
posted by Ndwright at 3:53 PM on September 20, 2009


I think he meant that when a system has any order to it (as opposed to complete randomness) it takes fewer bits to describe, due to patterns which can be compressed. A completely random system would need to be described one-to-one, with no compression of data.

I'll agree that might be what he meant, but the fact that you need more bits to describe a system as entropy increases doesn't mean that increasing entropy magically produces more information. When he goes on to say that decreasing entropy also increases information, it's pretty clear that he's either willfully equivocating on the point, or he's just out of his depth.
posted by logicpunk at 4:01 PM on September 20, 2009


Information is a meaningless concept unless you are talking about a particular message or signal. In DNA the message is "how to recreate parts of this organism, and how to start building a new one", the recipient is the organism carrying it, sometime in the future. In the context of a computer browsing the Internet, it is the stream of bits representing the data you are accessing from the Internet. If you randomly change pieces of the signal, you lose information.

If I am remembering this stuff correctly, "Increasing information", as a concept, is absurd in the context of information theory. You could get the wrong bits and accidentally get a pretty picture, or get the wrong DNA and accidentally get a beneficial mutation, but this very quickly leaves the domain of information theory and enters biology or aesthetics. Until you want to reproduce or convey that signal, then you are talking information theory again.
posted by idiopath at 4:12 PM on September 20, 2009


The idea of information being a fundamental force like gravity caught my attention years ago and I haven't run across anything since. It was (if I remember correctly) proposed by Wojciech Zurek.

Here's how I know I'm an idiot: this immediately reminded me, not of an actual scientific or philosophical concept or anything, but of the Hylaean Flow from Anathem.
posted by synaesthetichaze at 4:22 PM on September 20, 2009


A little more clarification: you can increase the capacity for information of a particular mechanism of transmission, which gets a signal through faster (through some combination of increased bandwidth and improved clarity / detection of the signal). But information is not a thing, it is a quality of the medium of storage or transmission (that is you can't point to a bit on a hard drive or in a CPU, it is not an object but a state of that object), and information theory measures the predictability, speed, or efficacy of that behavior of transmitting information. But it has nothing to do with the content. Something is by definition information if you feed it into the input of the transmitter.
posted by idiopath at 4:23 PM on September 20, 2009 [1 favorite]


OK, I dug it up:

"Most of us are used to thinking of information as secondary, not fundamental, something that is made from matter and energy. Whether we are thinking of petroglyphs carved in a cliff or the electromagnetic waves beaming from the transmitters on Sandia Crest, information seems like an artifact, a human invention. We impose pattern on matter and energy and use it to signal our fellow humans. Though information is used to describe the universe, it is not commonly thought of as being part of the universe itself. But to many of those at the Santa Fe conference, the world just didn't make sense unless information was admitted into the pantheon, on an equal footing with mass and energy. A few went so far as to argue that information may be the most fundamental of all; that mass and energy could somehow be derived from information."

This is from George Johnson's Fire in the Mind. He was writing about Zurek's presentation in 1989 at a conference at the Santa Fe Institute.

I haven't read what was presented at the conference, only Johnson's take on it. I think the way most people use the word "information" isn't what (I'm guessing) Zurek means. Does anyone know if the presentations at the conference were published?
posted by Mike Buechel at 4:55 PM on September 20, 2009


One thing that I read several years back that struck me:
Information is something that only exists in context. If I presented you with a short piece of writing in an unknown language with an unknown alphabet, it would effectively contain zero information. You would never be able to understand it without connecting it to something else.

The information in DNA exists in the context of DNA's role in guiding the construction and operation of cells. Specifically, the cell itself provides the "interpretation" via transcription. Thus outside of its host cell (or one sufficiently similar), DNA contains no information, just like a book written in an unknown language. Of course, the host cell itself lives within some ecological niche, which prevents the accumulation of arbitrary genetic "instructions", and through this connection, the DNA/transcription machinery describes the host cell's ecological niche. For this reason, absent major ecological changes, yeast cells 1 million generations from now will be the same as yeast cells today.

The point being, we have in DNA at least two perfectly good descriptions of its information content (although the transcription can in principle be quantified, whereas I suspect the "ecological description" cannot). Talking about bits without specifying context is pointless.

On preview:
My feeling is that the Santa Fe Institute houses some rather woolly types (I'm a former complex systems guy myself, if that means anything). But, I like to be helpful occasionally:
Maybe you want this.
posted by Humanzee at 5:05 PM on September 20, 2009


Talking about information without talking about the systems which transmit, convey, and retrieve it, seems to me to be about as meaningful as talking about energy without talking about the masses that store or release the energy.

Fire in the Mind is not a scientific work, it is a metaphysical one.

I don't know enough about what Zurek is saying to weigh in one way or another, but he talks about the kinds of stuff that wiseacres and people who overestimate their intelligence just love to death (alongside string theory and quantum mechanics).
posted by idiopath at 5:11 PM on September 20, 2009


When you open your mouth with your brain in neutral, that is what falls out.
posted by TomStampy at 5:15 PM on September 20, 2009


What is Life? By Erwin Schrödinger
posted by kuatto at 5:31 PM on September 20, 2009 [1 favorite]


Thanks, Humanzee. That might be it! I'm curious: Why do you feel "the Santa Fe Institute houses some rather woolly types"?

I don't know enough about what Zurek is saying to weigh in one way or another

I don't either.

Fire in the Mind is not a scientific work, it is a metaphysical one.

True, but Johnson's writing introduces ideas to non-scientific people who aren't able to understand some of these things without an intermediary. People like me!
posted by Mike Buechel at 5:36 PM on September 20, 2009


Information is something that only exists in context. If I presented you with a short piece of writing in an unknown language with an unknown alphabet, it would effectively contain zero information. You would never be able to understand it without connecting it to something else.

It's my understanding that this is false. The information is still stored in the language, whether or not you can decode it. So it's incorrect to say that it contains no information, it's just that the information is inaccessible. I realize that this is almost the same thing as what the quoted sentence says, but I do take issue with how the quote is framed. The quote makes it seem that information is not present at all until the source is brought into the appropriate context. More accurately, the information does exist, it just requires appropriate decoding.

I thought there were some formal definitions of information and that it had some mathematical backing. I'll have to hunt around and see if I'm off my rocker or not.
posted by forforf at 6:13 PM on September 20, 2009


Re Santa Fe Institute: I'll be as brief as I can (sorry, that's not brief). First I'll say that they have some first-class people there, and I'm not saying any of them are frauds/fakers.

I also have to say that my graduate research group (a complex systems group) was engaged in a battle with some of these guys. So that's my background, take it for what that's worth. Our biggest topic of contention was Self-Organized Criticality (also: Wiki).

Basically, we didn't think that any real systems were SOC (not even sand piles!); we thought certain physicists had done poor statistical work and attached too much meaning to the existence of power-law statistics in certain data (they vaguely mention that in the Santa Fe SOC link). SOC stems from the success of critical phenomena in describing the phase transitions of matter. Fresh off this success, physicists began misapplying it to other systems. SOC has profound philosophical implications that I think are ridiculous. E.g., SOC applied to neuroscience would say that neural properties are pretty much irrelevant, that the system can be perfectly understood by sampling the activity of parts of the brain and cataloging "event" statistics, an idea that is simply false. It also implies that the nervous system is fractal, which is also simply false. So it goes for pretty much every SOC application I've seen.

More deeply, there are two kinds of philosophical approaches to complex systems:
1. It's a collection of math, techniques, and approaches commonly used in certain scientific problems, which when informed by domain-specific knowledge can be useful.
2. It's a field of study unto itself, and a complex systems researcher could make real breakthroughs in say, geology or biology with little or no knowledge about those fields. People often talk about "non-equilibrium" statistical mechanics in this context.

I am firmly in camp 1; Santa Fe has lots of people in camp 2. Approach 2 says "details don't matter, I'll figure out every problem at once". It has been amazingly and unreasonably successful in the past, but I think its day has passed. I deliberately moved out of the complex systems field for this reason.
posted by Humanzee at 6:21 PM on September 20, 2009 [4 favorites]


forforf: "I thought there were some formal definitions of information"

Information theory treats white noise at the input the same way it treats the human genome: in terms of how effectively and accurately it comes out the other end untransformed. In other words, the fact that it is the message makes it information.

Information theory is not about what information is, but rather, given that we already know that this is the information, what qualities does it have?

In practice, much of information theory is about how you go about increasing redundancy for accuracy, and reducing redundancy for smaller size.

You can talk about the amount of information in a signal by talking about how much it reduces uncertainty about the content of the message. But this only addresses statistical qualities of information, not what makes something information as opposed to non-information.
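A toy sketch of that statistical notion (an illustration, not anything from the linked piece): the empirical Shannon entropy of a message, in bits per symbol, measures how much uncertainty each symbol resolves, and says nothing about whether the message "means" anything.

```python
import math
from collections import Counter

def entropy_bits(message: str) -> float:
    """Empirical Shannon entropy of a message, in bits per symbol,
    using the message's own symbol frequencies as the model."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A perfectly predictable message resolves no uncertainty per symbol;
# a uniformly varied one resolves the maximum for its alphabet size.
entropy_bits("aaaaaaaa")   # 0.0 bits/symbol
entropy_bits("abcdefgh")   # 3.0 bits/symbol
```

Note that a string of coin flips scores as high as any carefully composed text of the same statistics, which is exactly the "statistical qualities only" point.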
posted by idiopath at 6:29 PM on September 20, 2009 [1 favorite]


forforf, you're not off your rocker, I've heard similar statements and they're not unreasonable. If you just wanted to try to compress the unknown text, you could do so. You could analyze it for certain kinds of correlations in letter placement, and maybe convince yourself that it wasn't a stream of random characters (although it would be easy to fake those stats). So you're providing the context: text compression, or letter-correlations. Those can be quantified rigorously.

If I were to go through your favorite novel and alter some of the spellings (especially of longer words) to no longer conform to English norms (say I changed all the q's to k's), you would not lose any information, because you could use your understanding to correct the mistakes. The text compression properties might very well change, and the letter correlations would change quite a bit. So depending on the context you're using, you could say I'd altered the information in the book, or just as well say that it was unchanged. Most of the time, especially when people talk about biological information, they mean something much more like reading a book. After all, if I alter one of your genetic base-pairs and there is no resulting phenotypic change, who cares? Have I really altered the information in your genome?
posted by Humanzee at 6:44 PM on September 20, 2009


Then there's Bateson's definition of information: "a difference that makes a difference."
posted by psyche7 at 6:54 PM on September 20, 2009 [1 favorite]


Heh, you text-compressed my two paragraphs down to one sentence.
posted by Humanzee at 7:25 PM on September 20, 2009


I thought there were some formal definitions of information and that it had some mathematical backing. I'll have to hunt around and see if I'm off my rocker or not.

You might be thinking of physical information, which is what "information" means in the context of someone saying, for example, "information is lost in black holes."
posted by invitapriore at 10:06 PM on September 20, 2009


I should mention that Stephen Hawking, who originally made the hypothesis that information is lost in black holes, has changed his mind.
posted by invitapriore at 10:10 PM on September 20, 2009


a couple points: 1) one of the things i find sorta interesting about extropy/transhumanism is its parallels with intelligent design. altho i'm pretty sure an extropian transhumanist would be loth to be associated with discovery institute IDers, it's been said before that what can be ascribed to singularity has many aspects of rapture (for nerds ;) so much so that even coming from opposite ends of the spectrum, they're practically on the same wavelength (but in antiphase!) so on a related point 2) i wouldn't read the technium as science per se -- to explain -- as much as philosophy -- to impart meaning -- if that makes any sense... like with ID, it seems as if extropy takes the contours of what isn't (can't be?) known or explained and imbues it/them with admittedly hand-wavy (alchemical, epicyclic) 'forces' to make the inexplicable, not rigorously understood, but at least placed in some kind of human-scale context.

also i think trying to delineate (if not actually characterise) lacunae, at its best, informs science. like dark energy, for example, wtf? experience has run ahead of theory. from the other direction, string theory (or other TOEs of its ilk) and the multiverse are accused of just-so metaphysics. the anthropic principle is trivial YET, being self-evident, is still inescapably, ineluctably, given. we exist and cannot be explained away; kk is right to ask -- along one of many lines of inquiry -- what is life? consciousness? complexity? probability?* computation? information? the universe, at least (very) locally, appears to be ordering itself... why is that? while there are suppositions but no definitive or satisfying answers, 'extropy' -- his dark materials -- seems as useful a rejoinder/placeholder as any to try and put it all in perspective. like before 'god' was fractured (and put back together again?) [oh and math is not science (nor does it have a nobel) depending on how you define 'reality'.]

to back up bru then, i find kk's attempts to bring it all together and connect the dots -- even when it's not readily apparent that one can or should -- admirable within the framework of established facts of course! now i can't speak for kk, but where he doesn't get his facts straight, i don't think he'd disagree that his credibility, and the credulity of the audience, suffers -- n.b. prigogine, or wolfram for that matter** -- as evidenced in this thread :P

---
*btw in the determinism/free will debate it looks like it's 't hooft vs. zurek

**for every feynman and sagan... well, i guess more like gleick and now apparently the gladwell backlash has reached a tipping point... would that there were better (philosophy of) science and technology writers out there; i can think of a few, off the top of my head: steven johnson, jonah lehrer, clay shirky...
posted by kliuless at 11:11 PM on September 20, 2009


kk is right to ask -- along one of many lines of inquiry -- what is life? consciousness? complexity? probability?* computation? information?

Yeah, kk is right to ask - those are all enthralling questions, and lordy wouldn't we all like to know. But when you don the protective goggles and white lab coat of the hard-ass scientist and throw around the language of hard-ass science, it's fucking incumbent on you to make damn sure you are for real. kk is getting the treatment here not for asking questions, but because he's using words he doesn't understand to make claims he can't back up. He's not, in short, legitimately asking questions - he's peddling whatever his half-assed theosophy is by co-opting scientific terms to give it the appearance of being legitimate. I don't find this admirable in the slightest.
posted by logicpunk at 12:05 AM on September 21, 2009 [1 favorite]


Humanzee: I think you're correct about interpreting 'context' in a formal way. But (and this is admittedly tangential to your point) I would think that when we ask whether information has been lost simpliciter, we want to know simply whether there exists a context in which information has been lost. We may not yet have a context in mind. So, continuing your example, since there is at least one context in which some of the information contained in the novel/message would be missing under the transformation you describe,* it follows that information has been lost.

The reason this is important, incidentally, is that it's a case of scientists (well, mathematicians, really) running into a sort of confusion because a metaphysical question hasn't been answered. In this case, some of us are very comfortable talking about "the information" as a thing that can be locked away inside a coded message, even if nobody ever does or even could decode it. On this view, information has the same sort of metaphysical status as a mind-independent physical object. Others think that makes no sense, since a message always implies both a sender and a receiver (be they man or machine). The information isn't some sort of substance unto itself, but is only a measure of some difference being transmitted from point A to point B. A few big-wigs in quantum physics like Zurek seem to promote the first view of information; at least I think they must if they take it to be a fundamental physical property. Shannon, by contrast, clearly had something like the second view in mind.

Long story short, I don't think this is just a question of different but equally good points of view; it's a question that goes to the heart of some of the current attempts to explain quantum theory.


* For suppose the number of misspellings in the novel/message had some significance, as it might if the novel were also functioning as a surreptitious but extravagant means of communication between spies. Then the transformation might change the number of bits of information contained in the message.
posted by voltairemodern at 12:15 AM on September 21, 2009


I love Transhumanism for the same reason I love Gnostic Christianity. It's wrong, but it's beautifully wrong.
posted by empath at 12:32 AM on September 21, 2009 [5 favorites]


You're right, invitapriore. I was thinking more along the lines of physical information, since, as a former communications engineer, that's what 'information' meant to me. Or as the wikipedia article you linked to said (for physical information):
Information itself may be loosely defined as "that which can distinguish one thing from another".

The other type of information, the kind that involves the transmission of ideas and concepts, is a bit more abstract. However, it is not as mysterious as the linked article makes it sound. Conceptual information is the mapping of ideas and concepts to physical information representations. This mapping requires not only that the receiver be able to decode the physical information, but also that the receiver be in sync with the sender as to which ideas and concepts map to which physical information symbols. So in that regard, Humanzee's unknown-language example was correct, since the sender and receiver of the language text would have been out of sync. So no conceptual information transfer would take place; the information is not lost, however, just unable to be accessed. To nitpick, I would agree with this rewording of the example:
Information transfer is something that only exists in context. If I presented you with a short piece of writing in an unknown language with an unknown alphabet, it would effectively transfer zero information. You would never be able to understand it without connecting it to something else.
So in short, thanks to this FPP I'm now able to remember AND achieve a better understanding of some of the basics of information theory, while confirming that the author of the 'extropy' piece was completely out of his league in his musings/conclusions.
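The sender/receiver sync point can be sketched as a shared codebook (all names here are hypothetical, made up for illustration): the physical symbols arrive intact either way, but the concepts only come through when both parties hold the same symbol-to-concept mapping.

```python
# Sender encodes concepts into physical symbols via its codebook.
sender_codebook = {"idea_sun": "A", "idea_rain": "B"}

# A receiver in sync holds the inverse mapping; one out of sync does not.
receiver_in_sync = {"A": "idea_sun", "B": "idea_rain"}
receiver_out_of_sync = {"A": "idea_rain", "B": "idea_sun"}  # wrong mapping

# The "physical information" transmitted: just the symbols.
message = [sender_codebook["idea_sun"], sender_codebook["idea_rain"]]

decoded_ok = [receiver_in_sync[s] for s in message]        # concepts recovered
decoded_bad = [receiver_out_of_sync[s] for s in message]   # symbols intact, meaning scrambled
```

The out-of-sync receiver still gets every symbol perfectly, which is the sense in which the information is "not lost, just unable to be accessed."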
posted by forforf at 6:34 AM on September 21, 2009 [1 favorite]


This piece reminds me strongly of that "What the Bleep Do We Know?" drivel that was floating around a year or three ago. Dressed up in the clothes of someone who wants to understand, but espousing complete nonsense. Although, it's probably better for you that way - the closer humans get to understanding how the universe really works, the more often they go completely gibberingly insane.

Iä Iä
posted by FatherDagon at 9:27 AM on September 21, 2009 [1 favorite]


« Older Facebook outing   |   [Note to all my MeFi friends: your post is one of... Newer »


This thread has been archived and is closed to new comments