

A satirical rendering of Transhumanism.
June 4, 2008 2:14 PM

Enough is Enough: A Thinking Ape's Critique of Trans-Simianism.

Dear Colleagues:*
Aubrey de Grey shared with me the essay below, which he thinks I (and other members of the seminar) should read.

If anyone has an idea how a critic of transhumanism could respond to this essay, please let me know. Apparently, this essay is becoming "canonic" in the transhumanist community but no critic of transhumanism has responded to it (apparently, because no one has taken it seriously); Aubrey is asking me to do so. Any advice from you will be appreciated.

Written by Aaron Diaz of Dresden Codak fame.
posted by tybeet (53 comments total) 4 users marked this as a favorite

 
Transhumanism seemed kind of interesting when I first read of it in WiReD back in the first half of the nineties, but what's mostly turned me off the movement is that even though they seem to "dare" everyone to think beyond the current limits of the human body and mind, they're unable to even consider the idea that there might be a better system for society than the most liberal market economics possible.
posted by Joakim Ziegler at 2:27 PM on June 4, 2008


Interestingly relevant, I think, is the timeline of evolution.
posted by TheOnlyCoolTim at 2:30 PM on June 4, 2008 [1 favorite]


Interestingly relevant, I think, is the timeline of evolution.

See also.
posted by grobstein at 2:33 PM on June 4, 2008 [1 favorite]


I'm a critic of transhumanism only insofar as it doesn't solve any of the actual problems facing us today and effectively gives us carte blanche to treat the environment like shit because one day we're all going to be nodes in a Matrioshka Brain so who gives a fuck? My superficial understanding of it is that it is effectively Rapture 2.0, and while I may be mistaken, I don't care enough to correct myself or be corrected. But I can't criticize this particular piece because, while it is all in English and strictly grammatical, I didn't understand its premise. "Trans-simianism"? I mean, really? Sure, it would be cool to be a Terminator and everything but a system of philosophy, which is all transhumanism can claim to be at this point, ought to be able to provide us with instruction and assist us in living our present lives as morally as possible. Transhumanism does neither which means it fails as a philosophy which turns it into an appendix in a GURPS handbook.

So maybe I'm completely ignorant and misinformed and stupid. So format me.
posted by turgid dahlia at 2:44 PM on June 4, 2008 [6 favorites]


The point of the essay is that humanity, a long long time ago, sort of already did the transhumanism thing at least once - it's more difficult to pinpoint the specific advance or advances, but language seems like a good candidate.
posted by TheOnlyCoolTim at 2:50 PM on June 4, 2008


This essay is a straw man.

Natural evolution happens at a slow enough pace that the morality and philosophy needed to keep society intact can evolve with it. Natural evolution also succeeds only in a direction that is environmentally sustainable.

The type of progress transhumanism promises guarantees neither of those things.
posted by the jam at 2:55 PM on June 4, 2008


This isn't really a "satirical rendering of Transhumanism," which strongly connotes that it's anti-transhumanist when it's quite the opposite. I'd assume that a "satirical rendering of U.S. policies," for example, would not be arguing in favor of U.S. policies.

Just a word choice thing.
posted by lumensimus at 2:59 PM on June 4, 2008


Wait, this is about computers, right?
posted by djgh at 2:59 PM on June 4, 2008


Natural evolution also succeeds only in a direction that is environmentally sustainable.

Untrue or at best you're making some sort of tautology. The environment is a dynamic system and doesn't really "sustain" any state for very long, a lesson learned by overzealous prevention of wildfires, for example. Attempts to keep the environment unchanged like a museum exhibit can be misguided and can do nothing that should be called "keeping things natural."

The more important aspect of "sustainability" is not fucking ourselves over. Plenty of non-human species have done that. There is a well-documented background extinction rate.
posted by TheOnlyCoolTim at 3:03 PM on June 4, 2008 [3 favorites]


Two transhumanist threads in two days?

Can we just get this over with and classify this thing as a religion--like Falun Gong, Raelism, or Heaven's Gate?

After all, it has just the right amount of futuristic, neo-Scientology ingredients to become one: vague, sanctimonious, new-age drivel about a coming "singularity," an unshakable conviction in technology as metaphysical liberation, a leader (Kurzweil), self-mythologizing followers who take themselves too seriously, an inability to be pinned down, geeky sci-fi themes like nanobots and brain chips, etc. It's as American as apple pie, Star Trek, Intelligent Design, Area 51, or Erhard Seminars Training (otherwise known as EST).

Apparently, this essay is becoming "canonic" in the transhumanist community

If this "essay," an unremarkable one-note attempt at Swiftian levity, is "canonical" to the "transhumanist community" (whatever that is), then that community is already well on its way to becoming a parody of itself.
posted by ornate insect at 3:10 PM on June 4, 2008 [6 favorites]


What ornate insect said.

no critic of transhumanism has responded to it (apparently, because no one has taken it seriously)


OH SHIT THAT MEANS LAME FAUX-SATIRE TOTALLY MAKES A GREAT ARGUMENT
posted by nasreddin at 3:16 PM on June 4, 2008


No one has responded seriously to this either!
posted by languagehat at 3:32 PM on June 4, 2008 [1 favorite]


You can't take away people's gods. Even while they scoff at their old beliefs they set about creating new ones.
posted by tkolar at 3:51 PM on June 4, 2008 [4 favorites]


I wonder how SENS deals with being hit by a bus?
posted by scruss at 4:00 PM on June 4, 2008


That logarithmic plot of "significant paradigm shifts" irritates me every time I'm reminded of it. The only inference one can draw from it is that the frequency of such "milestone" events falls off roughly exponentially the farther back in history you look; but that's just a statement about the distribution of events that were picked to be in the list, which is in turn a subjective statement about what counts as a "milestone".

It's a classic example of how statistics can be abused, because it tacitly assumes that the historical significance of an event is independent of the historian's proximity. Ask an astrophysicist for a similar list of paradigm shifts, and you'd get a very different distribution (hint: we missed the peak of the exponential curve by 13.5 billion years). Don't get me wrong: I'm not arguing that the pace of technological development isn't accelerating, but I do think it's ridiculous to point to history and evolutionary biology in order to justify some fairy-tale distorted version of Moore's law.
posted by teraflop at 4:08 PM on June 4, 2008 [4 favorites]


Transhumanists do come off as starry-eyed born againers....but that doesn't mean science and technology won't transform humanity.
posted by DU at 4:10 PM on June 4, 2008 [6 favorites]


Even before I saw that the article was endorsed by Aubrey de Grey I thought of it in terms of a bit of attention-seeking silliness, rather than something to spend time considering. Nobody has responded to it because they probably have better things to do with their time.
posted by nowonmai at 4:18 PM on June 4, 2008


Mr. Ellis suggests: "When you read these essays and interviews, every time you see the word 'Singularity,' I want you to replace it in your head with the term 'Flying Spaghetti Monster.'"
posted by everichon at 4:28 PM on June 4, 2008


Let us assume, for the sake of argument, that such a post-simian future is possible or even probable. Is it really a world we should want to strive for, where our very ape nature is stripped away in the name of efficiency?

Amen, brother.
posted by Greg Nog at 4:32 PM on June 4, 2008 [2 favorites]


Okay.

I have been reformed from my trans-simian ways, brothers. Last night I ate a rotten banana and had the most horrible nightmare. Yes, a nightmare, a dream of such horror that I might have to invent some new method, like chiseling it into rock, to make sure it remains unforgotten.

What I saw in my dream was the post-simian universe, a vision of such depravity and corruption that I could never have imagined it in my more idealistic youth. In this dream there were creatures resembling apes, yes, though they were misshapen and bald. Fortunately their technology had created "clothes" with which they could hide their deformities.

But what of the deformity behind the overly flat faces? These post-simians knew not of rutting season and mated at any time whether the female was fertile (or even receptive) or not. These post-simians knew not the primal joy of flinging poo at your enemies, for what they fling are weapons of death. They kill anything that crosses them, and most especially each other, and have a most unpleasant disregard for the very trees to which they owe their existence.

These post-simians are so alien in consciousness that they seal themselves up in sterile spaces without foliage or stimulation or little animals or scents other than what their machines make to their simple specifications. What manner of simian would want such an ugly, simple, deprived existence? Not to know the joy of swinging from branch to branch, not to know the sweetness of fruit plucked from the branch, not even knowing the joy of the end of the rainstorm. For these post-simians it would never rain, for it never rains in their sterile enclosures.

Their only pleasures, I saw, would be perversions. Burying themselves in abstractions, toying with extremes of pleasure and pain and awareness, one could scarcely call them simian at all. The simian mind recoils at the horror -- and yet, and yet, these creatures are what we would become, should this Change manifest.

Yes there are great powers out there to be harnessed. But before we embrace this idea of a post-simian future, we must ask ourselves if we really want to transcend simianism. Because the creatures we become might be powerful indeed, but we might not recognize what we become as being ourselves.
posted by localroger at 4:34 PM on June 4, 2008 [4 favorites]


Something tells me that "our very ape nature" would persist into any venture we undertake.
posted by everichon at 4:35 PM on June 4, 2008


I may be mistaken, I don't care enough to correct myself or be corrected

I'm so glad we're going to have a mature, reasonable discussion.
posted by Skorgu at 4:57 PM on June 4, 2008


If these are the kinds of responses transhumanists get when they say anything, no wonder they feel as though no one's taking them seriously. Ornate Insect's comments read almost exactly like the parody linked here. There's no content in OI's criticism, and in order to make any sense of it, you have to already believe that transhumanists are barmy. Just putting the word "singularity" in scare quotes doesn't make it wrong. (Besides, while unshakable convictions are common in religions, unshakable convictions in technology as metaphysical liberation are not particularly common in religions or cults. Also, as much of a problem as I have with aspects of many organized religions, I think that it's important to distinguish between more cult-like belief systems like Scientology and other organized religions.) If you think something is ridiculous and many people disagree, you really need to break down why you think it is ridiculous. Just repeating the ridiculous claim and saying "isn't that ridiculous" doesn't really carry much heft as far as argumentation goes. That's the whole reason the satire was written.

I'm not actually defending transhumanism (I don't think there's much to defend there), but I do feel they should be taken seriously. Transhumanists are not like Intelligent Design advocates who have been refuted seriously time and time again, to the point where even engaging them seriously feels like giving them too much credit.

My main problem with transhumanists is not the claim that there is likely to be a dramatic change that will leave us in a position that we can't now imagine. That seems at least plausible, and it's really the only argument implied in Thog's letter (which is mildly amusing at best). I think this is part of the reason nobody is responding seriously to it. But transhumanism in general does need to be responded to seriously, so I'll try to do that.

Such a dramatic change, while possible, hardly seems inevitable. Plenty of systems that demonstrate exponential growth eventually hit ceilings (sometimes unexpectedly) that severely stunt further growth. We might hit one with computing power soon, later, or not before something dramatic happens. I don't really feel qualified to predict this. I don't really think anyone is in a position to predict this aspect of the future with the certainty that the most vocal transhumanists have. Like Turgid Dahlia, I also have a problem with those who would use transhumanism as an excuse to avoid trying to solve all our problems, but that's not a criticism of its plausibility, just its use.

Will we soon create AI intelligent enough to assist us in improving AI? Maybe. If, as a species, we're still functioning on all cylinders 50 years from now, we'll probably have gotten there. It seems conceivable that it could happen in my lifetime, but I wouldn't put money on it. If we do create such AIs, will it cause an explosion in the growth of the technology? Seems like a reasonable thing to expect, if a nearly impossible thing to prepare for. But it's also possible that we'll be close to some ceiling that makes it difficult to continue.

Will we soon create self-replicating nanotechnology? My answer here is about the same as for AI. Probably not in the next 10 years; if nothing ridiculous happens (a serious possibility, mind you), then probably within the next 50. Will it change everything? Probably. Possibly not. Might change everything so much that we destroy ourselves (always a possibility). Too hard to say.

I guess that's really the key to my view on transhumanism. It's way too hard to say whether/when we hit a singularity, and since almost by definition, we can't really say what happens after the singularity, it's pretty much impossible to prepare for. So we really should spend our time preparing for the lack of a singularity.
posted by ErWenn at 5:09 PM on June 4, 2008 [5 favorites]


Localroger, your comment might have more impact if apes didn't already rape, murder, and war.
posted by ErWenn at 5:10 PM on June 4, 2008


I'm so glad we're going to have a mature, reasonable discussion.

I don't know crap about ride-on lawnmowers either, and don't want to, but I'm confident enough to be able to warn people about sticking their foot in the spinning blade.
posted by turgid dahlia at 5:16 PM on June 4, 2008


The reason why no one takes transhumanism seriously is because it seems essentially like a boob job for your brain, or your DNA, or whatever. Aubrey de Grey is a crank, a long haired foil for our fear of human finitude.
posted by doobiedoo at 5:36 PM on June 4, 2008


ErWenn--There was a thread yesterday related to transhumanism; did you read it?

I've been to the WTA website a number of times the past two days, googled and read a handful of links related to the subject, and re-read the wiki page a few times as well. And I read the links in the FPP from yesterday.

The wiki page refers to transhumanism as a "movement." Is it a scientific movement? The answer to this would seem to be an unequivocal no.

Despite transhumanism's fascination with biotech, cyborgs, anti-ageing, etc, there is no scientific discipline called transhumanism that is recognized by any of the world's scientific bodies, agencies, or academic groups. For the most part, the directors of the WTA, which appears to be transhumanism's official organ, are not scientists, and even if they were it's impossible to point to a single scientific "discovery" that can be attributed to transhumanism. So we can discount transhumanism as a science.

Which leaves philosophy. Here, just as a few scientists have been drawn to transhumanism, so too have a few philosophers, but transhumanism is little more than a popular buzzword at the margins of academic philosophy, and as someone with an interest in philosophy (and also in cybernetics and semiotics, I should add), I must say I find transhumanism's ideas to be largely superficial, facile, and mostly impoverished. But I'm willing to be convinced otherwise.

I am interested in classifying transhumanism as a religion not because I wish to negate it (I don't dislike religion per se), but because I seek clarification of what it is. As a science, it's a non-starter: there is no "science" called transhumanism, despite any pretensions transhumanists may have otherwise. As a philosophy, it's a hodge-podge of futuristic assertions about a coming transformation in biotech. Which is to say, it's philosophically suspect.

I've read some of Kurzweil, but not yet read his main book. It looks better than Dianetics, certainly, but it also seems like yet another "Future 2.0" jingoistic book that publishers love so much.

In the NYT article I linked to above, one reads:

Natasha Vita-More, the first female Transhumanist philosopher, has been pondering such questions for some time. “Transhumanism” (a brave new word) was defined in 1957 by Aldous Huxley’s biologist brother Julian Huxley as “man remaining man, but transcending himself, by realizing new possibilities of and for his ‘human nature.’ ” The Transhumanist movement was formalized by a group of futurist artists, scientists and philosophers in the 1980s. Their mission: To support the use of emergent technologies to make humans smarter, faster and stronger.

“Human nature is at a crossroads,” Vita-More claims. Evolution, she argues, will inevitably go high-tech: “In the coming decades we will experience a radical upgrading. . . . Genetic engineering, biotechnology, nanorobotics (microscopic robots inside the body) will bit by bit replace the fully biological body.”


The one consistent theme of transhumanism is that we are on the cusp of something big. This is what de Grey and others seem convinced of. I have no doubt that human beings will continue to change the world we inhabit (often in negative ways, btw), and that, as DU said above, "science and technology...[may continue to] transform humanity," but this is just a trivial thing to say. I mean, no shit.

Furthermore, human psychology, especially things like the desire to live forever or have sex with extraterrestrials or what have you, is essentially unchanged. Transhumanism seems to want to cozy up to science, but it's not clear what--other than buzzwords--it offers in return.

Strangely, transhumanism seems to conflate technological innovation with science, and reverses their importance. The former is dependent on the latter, and a great deal of science does not concern itself with human beings at all. Following the Kuhnian model of scientific paradigm shifts, it makes little sense to trumpet them before they occur. If they occur, they occur. If not, not. Until then, the futuristic hyperbole and vagueness of transhumanism would seem to limit its potential as anything other than a pseudoscientific fad.
posted by ornate insect at 5:47 PM on June 4, 2008


ErWenn, humans already do most of the stuff they do in my in-some-quarters-infamous post-Singularity novel too, but people find it disturbing for some reason. I'd link it for you but self links seem to be frowned upon around here.
posted by localroger at 6:02 PM on June 4, 2008


Reasonable self-links in the comments are permitted, but you don't have to bother.

The Metamorphosis of Prime Intellect

I think transhumanism or Singularitarianism are reasonable and useful interests in the same way as Jules Verne exploring the concept of a moon shot before there were fields of aeronautics and rocketry. And, you see, part of this is getting it wrong, because Verne used a cannon. It's great that the hypothetical future technology isn't necessarily presented in the context of a fictional story, because I'll bet a lot of these guys coming up with ideas like Matrioshka brains have no business writing fiction, but the ideas are still interesting. (Juxtaposition of that sentence with localroger's book is unintentional.)

When they stop looking at it as a possibility or something to work towards and start taking everything they talk about as an inevitability, thinking there will be no problems in our transhuman future, or too-unrealistically hope for personal salvation through uploading, it gets a bit religious.

I think it quite unlikely there'll be a recognizable "Singularity" we may point to and say "This was the singularity and two days/two years later EVERYTHING WAS DIFFERENT." There'll be a continuum. Localroger's story has that sort of catastrophic event, but if I remember the opening right and I know I have the gist, it involved an already human-equivalent AI discovering a deus-ex-machina way to almost instantly increase its power at will and, by the way, control and reorder the very substrate of existence.

If you give it some consideration, pretty transhumanist things have happened and are happening in reality. I used to remember phone numbers, and I still have some in my head from childhood. Nowadays I can only remember my own, because my cell phone is doing memory augmentation. Sensory augmentation has been done - I don't have the links but I've read of things like devices and implants that give people a compass sense, which becomes useful to the point that they're a little disoriented when it is taken away. That will continue, and I think it's quite likely we'll have brain-computer interfaces at some point, and then we could have memory augmentation implanted. And, in a way, these things have been going on for a long time. What's the fundamental difference between augmenting my memory by having a chip that I access with thought or even without trying, by putting things into cellphones, or by writing them down?
posted by TheOnlyCoolTim at 6:43 PM on June 4, 2008 [1 favorite]


I've not read localroger's AI/Singularity themed novel, but when reading the section of the link TheOnlyCoolTim provided entitled "discussion of how the ideas in the novel intersect with modern Singularity theory," with passages such as "I believe consciousness is most likely an emergent property of a relatively simple system which, in the higher animals and humans, appears complicated only because the system is capable of storing a large amount of information. If this is the case, then it won't be designed; it will be discovered," I was reminded of another novel which tackles these themes, Les Particules élémentaires by Michel Houellebecq.
posted by ornate insect at 6:58 PM on June 4, 2008


I should be careful. In my previous long post, I guess I was responding more to the idea of the singularity and the contact I've had with people who would call themselves transhumanists than to any of the big books on transhumanism or the rhetoric of the very well-known transhumanists. Also, I was not aware of yesterday's discussion, so perhaps the responses here seem so tired because more serious discussion happened there.

I think Ornate Insect is correct in saying that transhumanism is not a science, and I hope that transhumanists wouldn't find that offensive. Many are using scientific results to inform their belief that a singularity is inevitable (or at least likely), but specific predictions are not being made (in fact, there's a lot of talk that such predictions are impossible), so while it would be silly to say that there is no science in transhumanism, it would also be silly to say that transhumanism is a field of science. I've also noticed that most transhumanists think that the singularity is a desirable thing. I don't think that any reasonable transhumanist would claim that a value judgement such as this is science, just as I would never declare my belief that global warming should be avoided science. But I never really noticed this as a major issue anyway.

Doobiedoo, I don't see how a boob job for your brain would be something that we shouldn't take seriously. Though you're probably right that many transhumanists find reasons to believe in the singularity because its existence implies the existence of an alternative to death. That doesn't actually negate their arguments, however.

Localroger, I actually read a couple chapters of your novel years ago. Didn't realize that it was yours until you mentioned it just now. I didn't find the acts of the characters disturbing in a moral sense, but I found the detail in which some of those acts were described a bit off-putting. If you're familiar with the word "squick" (the first definition in that link is the one I'm using), that describes my reaction pretty well, and one of the reasons I didn't finish. In any case, as an extension of the original Thog allegory, your original comment is fine. I think I misinterpreted it as a response implying that the change was a bad thing.
posted by ErWenn at 7:04 PM on June 4, 2008


TheOnlyCoolTim--All human technology and tool-use, from language and fire to microchips and satellites, is an augmentation, extension and alteration of human cognition.

If all "transhumanism" refers to is the way in which technology, construed broadly, extends, augments and alters human thought and action, then I'm not sure singularity was not fully achieved the first time a primate picked up a rock or gathered a termite with a reed.
posted by ornate insect at 7:10 PM on June 4, 2008


ErWenn -- you might want to look at it again starting at chapter 2. I doubt you got to Ch. 2 because most people who do don't stop reading :-)
posted by localroger at 7:35 PM on June 4, 2008


I'm not sure singularity was not fully achieved the first time a primate picked up a rock or gathered a termite with a reed.

Exactly. To me "transhumanism" is an interest in looking ahead in (at least certain technological forms of) the "augmentation, extension and alteration of human cognition" and the world we live in, which is a fine thing to be interested in.
posted by TheOnlyCoolTim at 7:35 PM on June 4, 2008


localroger: I loved Metamorphosis when it was on k5, I had no idea you were on the blue.

turgid dahlia:I don't know crap about ride-on lawnmowers either, and don't want to, but I'm confident enough to be able to warn people about sticking their foot in the spinning blade.

Pardon me but in light of I'm a critic of transhumanism only insofar as it doesn't solve any of the actual problems facing us today and effectively gives us carte blanche to treat the environment like shit because one day we're all going to be nodes in a Matrioshka Brain so who gives a fuck? you're arguing against riding lawnmowers as a concept because they hurl spinning blades of death at unwary pedestrians. And brandishing your ignorance proudly.

Strawmen abound, but of course there are no doubt some who make just those arguments. Asking ten people what transhumanism is will likely net you on the order of a dozen answers. None is more than an opinion. There is no Grand Council of Transhumanity, certainly not any more than there is a High Pope of All Religion or a Great Senate of Science.

Anyway, my opinion is that the promise and danger of AI and human augmentation are far closer than we think they are and merit talking about. Likewise thinking and planning are things that we as humans are horrible at and in areas where there are solid, objective, demonstrable benefits to adopting a less human and more accurate methodology we should (selectively, carefully) kick that bit of humanity out the door. I'm also all for cyborg enhancements because I'm a pasty white boy and all for removing fallible, emotional humans from positions of power because I'm a misanthrope.

I can be of use rounding up other humans to work in the, um, Bayesian salt caves?

On preview: ErWenn: "I don't think that any reasonable transhumanist would claim that a value judgement such as this is science." I don't know about reasonable transhumanists, but value judgments are no less science than any other determination of fact. Getting people to agree on proper valuations of risk and reward is non-trivial, but that's human nature for you.
posted by Skorgu at 7:39 PM on June 4, 2008


TheOnlyCoolTim--which is it? Either transhumanism lays claim to some huge techno-cognitive leap just on the horizon, due to AI/biotech/nanotech/etc, or it's a far more modest and fairly trivial claim about technology in general. The term, and especially the new-age snake-oil term "the singularity," is so vague it encourages this kind of confusion.
posted by ornate insect at 7:41 PM on June 4, 2008


Transhumanism is not the same as Singularitarianism. Transhumanism does not require a huge leap on the horizon, and I think having a discussion about, say, the possibilities, benefits, and dangers of biotechnology is not something to dismiss as trivial.
posted by TheOnlyCoolTim at 7:53 PM on June 4, 2008


having a discussion about say, the possibilities, benefits, and dangers of biotechnology is not something to dismiss as trivial

Where did I do that? Is that really what transhumanism is after: an impartial discussion of bioethics and new technologies? If so, the enthusiastic jargon of transhumanism's futuristic technophilia very much obscures this aim.

Asking ten people what transhumanism is will likely net you on the order of a dozen answers. None is more than an opinion. There is no Grand Council of Transhumanity, certainly not any more than there is a High Pope of All Religion or a Great Senate of Science.

Is asking for a reasonable and largely agreed-upon definition for transhumanism really so misguided? To my mind the word is so wooly it invites all sorts of misinterpretation, and yet the hermeneutics of polysemic subtlety does not seem to be transhumanism's strong suit. There's a lot of grandstanding surrounding this term and its relations: more ideological baggage than descriptive clarity.

The words "religion" and "science" are not so heavily riddled: most people can agree more or less as to what they refer to. In my estimation, transhumanism is not science, but very much appears to be like a quasi-religion (although I admit I could be wrong).

I'm not against complex words, but as concept-clusters the terms "transhumanism" and "the singularity" are being used in some heavy-handed and silly ways in the links and threads I have come across. Just saying.
posted by ornate insect at 8:14 PM on June 4, 2008


while I may be mistaken, I don't care enough to correct myself or be corrected.

I will treasure this sentence fragment for all time. It's so... all-purpose. From now on, whenever I venture into any discussion of politics or religion, on MeFi or elsewhere, I will imagine that all the participants solemnly raised their right hands and quietly intoned this sentence fragment three times in unison, just before I arrived.
posted by ook at 8:18 PM on June 4, 2008 [3 favorites]


Is asking for a reasonable and largely agreed-upon definition for transhumanism really so misguided?

I am only a Pope in Discordianism, not transhumanism, so I'm not the boss here, but I think a reasonable definition might be that it's an interest in considering and to some extent advocating and working for radical changes due to technology. That some people get overly geeky, hopeful, or overconfident about these changes does not invalidate the whole idea.
posted by TheOnlyCoolTim at 8:19 PM on June 4, 2008


Skorgu
I don't know about reasonable transhumanists but value judgments are no less science than any other determination of fact. Getting people to agree on proper valuations of risk and reward is non-trivial but that's human nature for you.

I think you misunderstand. He was referring to moral judgments, not risk-benefit analysis. He was saying that many transhumanists desire the Singularity or whatever as a morally good thing, which is not science.
posted by Sangermaine at 8:26 PM on June 4, 2008


I'm not trying to be a stickler here, although I'm sure it seems that way, but

an interest in considering and to some extent advocating and working for radical changes due to technology

is, besides being exceedingly vague, distinctly different from having an impartial discussion about

the possibilities, benefits, and dangers of biotechnology...

and I think this difference may cut to the core of the problem: transhumanism seems to be selling the promise of genetic, nanorobotic, and other technology in extending and augmenting human life, but when pressed on specifics seems to retreat from its enthusiasms.

I have yet to read Kurzweil, but throw Wells, Huxley, Toffler, Asimov, Crichton, William S. Burroughs, Philip K. Dick, William Gibson, Tron and The Matrix into the mix, and one begins to get the picture. As fiction, it has a certain familiar appeal, but I'm not sure it's gotten much beyond that yet.
posted by ornate insect at 8:37 PM on June 4, 2008


Thank you, Sangermaine, for clearing up that confusion.

Ornate Insect: "enthusiasm" is a fantastic word.
posted by ErWenn at 9:49 PM on June 4, 2008


Cute article! :)

Transhumanism is fundamentally an attempt to challenge religion with something that results in more money being spent on science & technology. In particular, yes, transhumanism is an anti-charity "rising tide lifts all boats" philosophy, but no, it is not a market-economy philosophy.

I don't feel that transhumanism gives carte blanche to destroy the environment; new technologies are the best long-term answer to environmental problems. Some extreme transhumanists actually challenge market economics by suggesting that only data will have value eventually. Otoh, yes, some people build justifications for market economics and attacking the environment from any vaguely amenable philosophy.

I don't really imagine strong AI arising all that soon; sure, the computers may have the cpu power, and we may eventually understand many algorithms of the human brain, but some algorithms will hold out significantly longer. For now, I want to see more investigation of animal brains towards three goals:

1) biological tweaking -- genetic & drug research towards making people smarter, etc.
2) implants -- implanted computers that interact with the brain to improve some specific function
3) parallelization -- implants that allow large numbers of people to think in parallel about hard problems

No one talks about the last one, but it's maybe our best short term hope for super-intelligent beings. Trying to live forever is kinda beyond stupid at this point, but I suppose rich people need crazy hobbies on which they can blow lots of cash.
posted by jeffburdges at 10:07 PM on June 4, 2008


I'd link it for you but self links seem to be frowned upon around here.

They are only frowned upon as posts. In comments, if germane to the conversation, they are perfectly fine, provided you didn't start a thread in order to make them.
posted by stavrosthewonderchicken at 1:22 AM on June 5, 2008


Will we soon create AI intelligent enough to assist us in improving AI? Maybe. If, as a species, we're still functioning on all cylinders 50 years from now, we'll probably have gotten there.

Strong AI has been 30-50 years away since the end of WWII. In that time the computing power available to humanity has been riding an exponential curve, yet it's still 30-50 years away. I love the whole Singularity thing as my own personal Sci-Fi Rapture, and it pushes lots of my wish-fulfilment buttons, but I don't see any evidence that it's actually going to happen.
posted by Leon at 3:48 AM on June 5, 2008


Transhumanism as a concept may not be as young as you think: in H. G. Wells's War of the Worlds, the Martians have evolved to the point where their physiology is elementary, capable only of thought, communication and basic manipulation. For interacting with the environment they use different machines, "wearing" them like bodies in which they are the brain. Of course, this is presented in a thoroughly negative manner.

Given the time period it is set in, this is essentially Steampunk Transhumanism, a powerful nexus of geekery
posted by ghost of a past number at 4:32 AM on June 5, 2008 [4 favorites]


He was saying that many transhumanists desire the Singularity or whatever as a morally good thing, which is not science.

If you're saying that the desiring itself isn't science, I think I agree.

But the determination of moral good isn't somehow exempt from scientific inquiry, cf. behavioral economics.

In any case, quite a digression. As TheOnlyCoolTim points out, conflating transhumanism (whatever that is) and the singularity (whatever that is) is leading us all to confusion. Mixing in religion (whatever that is) and science (whatever that is) sure isn't helping.
posted by Skorgu at 8:06 AM on June 5, 2008


I do think a lot of transhumanists come off very much like religious fanatics. Whether this is the core of transhumanism or not is a different discussion, but the fact remains that there appear to be a lot of people substituting "God" for "Singularity" and "post-Singularity" for "post-Revelation".

And there are some problems with the concept of the Singularity. While I am a staunch monist, fully expect that one day (maybe even soon) we'll have a complete understanding of the human brain, and don't accept at all the contention that there is something magical or special about the brain, it does not automatically follow that we'll be able to design better intelligence. More importantly, there is no reason to think that, assuming we *can* design a superior intelligence, it will be able to build an intelligence superior to itself in a shorter time than it took us to design the first superior intelligence.

The Singularity model requires us to think that intelligence will expand at an exponential rate, while the thinking time required to design the next intelligence will remain constant, or will increase more slowly than intelligence increases. While that may well be the case, there is no reason to assume it will be.

Let me illustrate. If we call human intelligence H, so far we've been working on building something that matches H for going on 60 years now with no success. Assume we succeed tomorrow. Assume we build something with H^2 intelligence in 20 years. Why would we assume that H^2 can build H^3 in anything less than 20 years? Maybe the problem is sufficiently difficult that it'll take even something with H^2 intelligence 100 years to solve. And when H^3 is built, there is no reason to assume it can build H^4 in less time than it took H^2 to build it. If that's the case then there is no Singularity, just a steady progression of intelligence.

Of course my musings may be completely wrong, but there's no reason to assume that the believers in the Singularity are any more right. There's no evidence either way.

The idea that we will augment human mental performance, or build superior AI, seems perfectly reasonable, but the transhumanists seem like just another flavor of salvationists. They see that as humans we have problems, and rather than trying to solve the problems we have they are looking for an external savior. Christians say it's Jesus, Buddhists say it's Siddhartha, and the transhumanists say it's the Singularity. I have not yet found a flavor of salvationism that isn't essentially a religion.
posted by sotonohito at 8:47 AM on June 5, 2008


It is amusing that each side of this debate accuses the other of assuming a religious position. If this discussion were happening 800 years ago, and we were all monks scattered around medieval Europe, scribbling furiously on parchment freshly scrubbed free of the heathen works of the ancients, we would probably accuse each other of being in league with the devil.

I think that the transhumanists are to blame for this, though: excessive handwaving in their arguments invites this kind of cheap tactic from both sides.
posted by ghost of a past number at 9:37 AM on June 5, 2008


sotonohito absolutely. The transhumanism label doesn't obviate Sturgeon's law by its mere application. Of course conflating the group and the loudest individuals is a fundamental problem of human interaction so I doubt we'll solve it here :)

I agree fundamentally that we can't know how fast intelligence can self-modify. However, the common parallel to the industrial revolution applies as guidance, I think. There's no evidence to assume that creating intelligence is bound by anything but intelligence.

The central assumption is that intelligence and time are inversely fungible: you can throw more brainpower at a problem or you can throw more time at a problem and get the same result. All our evidence (admittedly not very much) suggests that this is, if not an iron rule, at least a good guideline. Every human-beating computer in any area of expertise does so by thinking faster, considering more chess moves in a given time, for example.

With that assumption, the relationship between thought speed and time is the single governing factor to the self-improvement rate of intelligence, ceteris paribus. Yes there may be limitations on AI design and cognition speed that we have yet to discover, but postulating those violates Occam's razor. The limits of computation we know of now are so far overhead there's no reason to imagine glass ceilings, especially if we are able to understand enough of the behavior of systems to model them at the functional instead of atomic level. Fiber optics vs 200mph nerve signals, etc.
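A toy version of that fungibility assumption, with invented numbers (the `wall_clock_years` helper is hypothetical, and real cognition certainly doesn't scale this cleanly):

```python
# Sketch of the intelligence/time fungibility assumption above:
# solving a fixed problem costs a fixed amount of "thought-work",
# so doubling thinking speed halves the wall-clock time.

def wall_clock_years(thought_work, thinkers, speedup):
    # thought_work: total brain-years the problem costs at human speed
    # thinkers: how many minds work in parallel (perfect scaling assumed)
    # speedup: how much faster than a human each mind thinks
    return thought_work / (thinkers * speedup)

# A 1000-brain-year problem: a lab of 50 humans needs 20 years...
print(wall_clock_years(1000, thinkers=50, speedup=1))    # 20.0
# ...while 50 minds running 100x faster would need about 10 weeks.
print(wall_clock_years(1000, thinkers=50, speedup=100))  # 0.2
```

Everything interesting is hidden in the "perfect scaling assumed" comment: if coordination overhead or experimental bottlenecks eat the speedup, the fungibility breaks down.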

Like all futurism it's totally doomed from the start in terms of actual predictions. Better to assume we will be able to make AIs than be surprised by their existence.
posted by Skorgu at 10:05 AM on June 5, 2008


I'm glad to see the discussion here has gotten more substantial since my (perhaps misguided) initial complaints.

Skorgu, while I agree with almost everything you say, I think you're conflating the absolute theoretical limits on computational power (which are indeed still quite far off) with the practical limits we've already encountered (limits of materials, techniques, etc.). We have indeed overcome quite a few of these practical limits already (by changing our materials, inventing new techniques, and otherwise being clever), but that doesn't necessarily mean we can overcome every such obstacle just as rapidly. There is that theoretical impassable ceiling, but there may be some surprisingly resilient (perhaps requiring exponentially larger amounts of cleverness*) intermediate ceilings that will slow us down before we get there.

Another obstacle we may run into is that many clever solutions we discover are built upon experimental evidence, so that may bottleneck the progress of transhuman-speed intelligences as well. So many unknowns.

*Not that I am aware of any reasonable way to measure intelligence linearly, but I think my point is clear. Let me know if it is not.
posted by ErWenn at 10:22 AM on June 5, 2008


The idea that we will augment human mental performance, or build superior AI, seems perfectly reasonable, but the transhumanists seem like just another flavor of salvationists. They see that as humans we have problems and rather than trying to solve the problems we have they are looking for an external savior. Christians say it's Jesus, Buddhists say it's Siddhartha, and the transhumanists say it's the Singularity.

Understandably when talking about philosophies it's often hard to avoid this... but you have to remember that if you ever go to any extreme for your information, it will seem fanatical - remember Skinner and Pavlov? Remember Walden Two? Let's not forget that we exist within a certain paradigm today that may arguably be somewhere between an empiricism and a realism.

You have to realize though that the vast majority of people subscribing to some sort of transhumanist viewpoint are not fanatical in their perspective, and are somewhat more grounded in the present, as opposed to being caught up in the hopeful future (that is, their life doesn't revolve around making Futurist predictions like Kurzweil).

With that said, I don't think the salvation most transhumanists have in mind is a Jesus-machine that will someday rise out of exponential growth in computing power. I think most transhumanists would rather say that humanity's salvation lies in its intelligence, and that the most obvious way to hasten our salvation is to increase the qualitative and quantitative intelligence that humans, as a whole, can be said to possess, and this is a perfectly logical notion.
posted by tybeet at 10:28 AM on June 5, 2008



This thread has been archived and is closed to new comments