Singularity
May 6, 2005 11:48 AM   Subscribe

According to the developmental spiral we are heading towards an unfathomable point in time known as singularity. Could the futurists and science fiction writers such as Vernon Vinge be right?
posted by ttopher (56 comments total)
 
Of course they're right. Just kick back, and enjoy the ride.
posted by iamck at 11:52 AM on May 6, 2005


I have virtually limitless informational resources, free or hyper-cheap media entertainment, instantaneous global communication, and a George Foreman Grill that can fully cook a burger in less than three minutes with up to 65% less fat.


You know what I don't have?




A GODDAMNED FLYING CAR.

What the shit?
posted by stenseng at 11:54 AM on May 6, 2005


Right on brother! The moment is perfect, except for the fact that I managed to misspell Vernor. Oh well.
posted by ttopher at 11:55 AM on May 6, 2005


You know what I don't have?

A GODDAMNED FLYING CAR



And you probably never will.
posted by ttopher at 12:08 PM on May 6, 2005


Vinge's notion of the technological singularity is widely misrepresented in sf, I think. (Pet peeve of mine.)

It isn't that technology will advance past the point that present-day humans can comprehend and predict. That's pretty much a given: it happens regularly. Vinge's observation is that the time that it takes for this to happen seems to be getting shorter. A millennium ago, for example, you could choose a century's worth of time and daily life would be pretty much the same at the beginning and at the end of it. In the Neolithic, you could probably choose a millennium's worth of time. Today, how long a span of time can you choose? Twenty years? Five? Not too long. The Vingean singularity is when that time drops to zero. It's not only incomprehensible to us, it'll be incomprehensible to people even a short time before it happens to them.

There's some sleight-of-hand in that argument that someone versed in history and/or calculus can find pretty easily. As Vinge mentions in the essay I linked, the singularity may or may not come to pass. (And it may or may not have the "rapture for nerds" quality that it has in a lot of SF.) But it's still an interesting thing to think about.
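One way to see the calculus sleight-of-hand hattifattener alludes to: if you assume each era of "recognizable daily life" is a fixed fraction shorter than the last, the eras form a geometric series and sum to a finite horizon. A minimal sketch (the 1,000-year starting era and the halving ratio are illustrative assumptions, not Vinge's numbers):

```python
# Toy model: each era of "recognizable daily life" lasts `ratio` times
# as long as the previous one. For ratio < 1 the eras sum to a finite
# horizon: span0 * (1 + r + r^2 + ...) = span0 / (1 - r).

def singularity_horizon(first_span_years, ratio, eras=50):
    """Sum the lengths of successive shrinking eras (geometric series)."""
    total = 0.0
    span = first_span_years
    for _ in range(eras):
        total += span
        span *= ratio
    return total

# Illustrative numbers only: a 1000-year first era, each era half the last.
# The closed form gives a horizon of 1000 / (1 - 0.5) = 2000 years.
print(singularity_horizon(1000, 0.5))
```

The catch, of course, is the assumption of a constant ratio: nothing guarantees the shrinkage continues, which is exactly the hole a historian or a calculus student can poke.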
posted by hattifattener at 12:08 PM on May 6, 2005


As we go more deeply into the spiral, do we get dizzy? It would explain a lot.
posted by wendell at 12:09 PM on May 6, 2005




Maybe not a double post, but certainly a recurring theme. See previous discussions here, here, here.
posted by beagle at 12:21 PM on May 6, 2005


Bruce Sterling has a refreshing take on the whole singularity "thing." mp3 link
posted by bshort at 12:28 PM on May 6, 2005


ttopher should throw his hat over the wall.

LOL. That was quite brilliant. It is almost as if you knew I was a big fan of Clerks.

The thing I love about the universe is that I don't have to do a damn thing and it will still be alright.

Plus, who needs flying cars when we'll all have jetpacks?

beagle: I did a search for singularity before I posted and felt I could add something new.
posted by ttopher at 12:30 PM on May 6, 2005


So there's no asymptotic function in all this?

Cool! Let's start a new religion!
posted by nofundy at 12:39 PM on May 6, 2005


I thought the job of futurists was to be interesting. Can anyone document one who was actually right?
posted by graymouser at 12:42 PM on May 6, 2005


Does this have anything to do with Timecube?
posted by Mayor Curley at 12:48 PM on May 6, 2005


another interesting observation about the developmental spiral we are on is that below the equator it spins the opposite direction as above the equator
posted by ronenosity at 12:49 PM on May 6, 2005


But this all assumes that technology will be able to work out what's smart, and what's not. Never underestimate the power of artificial stupidity ...
posted by scruss at 12:51 PM on May 6, 2005


Should be Vernor not Vernon. Jus' sayin'...
posted by FYKshun at 12:53 PM on May 6, 2005


"Could the futurists and science fiction writers such as Vernon Vinge be right?"
Man I hope so. I think the spectrum of human thought is really shortening up. Not in the details of course, but we are running into our conceptual limits with the sets of ideas we have now. Sort of the "how many angels can fit on the head of a pin" inbred philosophical nonsense.
Of course some people are still using single paradigm perspectives (judeo-christian, postmodern, etc. the single book sorts) to decipher us & the world, but they're really not worth mentioning.

Frank Herbert ended the Dune series I think appropriately with the escape from prescience into the unknown.
It's only the not knowing what will happen next that makes existence bearable. Any static state eventually degenerates consciousness.

To hell with flying cars and jet packs, I want something I can't even conceive of.
posted by Smedleyman at 1:04 PM on May 6, 2005


Should be Vernor not Vernon. Jus' sayin'...

Thanks for giving me the opportunity to increase my post count by informing you that I already corrected myself earlier in the thread. Cheers.

bshort, thanks for the audio link. I was interested in gathering new perspectives on the idea.
posted by ttopher at 1:06 PM on May 6, 2005


Vinge seems to be concerned about artificial intelligence overtaking human intelligence and carrying on unchecked, with unpredictable and dangerous outcomes. Essentially, technology begins to follow an evolutionary path directed by AI. But he also says it may never happen. Just because event horizons are shortening doesn't mean they'll go to zero. Just because Moore's law is still in effect doesn't mean machines will be able to think (a prerequisite for the Singularity). When Oppenheimer pushed the button at Trinity he said there was a 1/10,000 chance he was about to destroy the world (they thought the atmosphere might catch fire) -- right now the Singularity would seem to have a similar likelihood, or less. It requires thinking machines and having those machines escape from human control, and then having the machines begin an evolutionary process.
posted by beagle at 1:31 PM on May 6, 2005


Ohh dear. There are several reasons why I wouldn't bet on a singularity.

I'll admit that Vinge has a really good grasp on technology. However many of his ideas hinge on a couple of fallacies in regards to the development of machine cognition, (or cognition of other sorts.) A big one is the fallacy that more data = smarter. In statistics, there is actually a point at which more data gives you worse results. More data reduces the probability of finding no effect where an effect exists, but increases the probability of finding a spurious and trivial effect unrelated to your hypothesis. In qualitative research, you reach a point of theoretical saturation at which more data just tells you what you already know. The big problem in "being smart" is no longer how to access data, but how to make meaning of too much data.
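The spurious-effect point can be made concrete with a multiple-comparisons sketch: test enough pure-noise "predictors" against a pure-noise outcome and some will look significant by chance. A minimal stdlib illustration (the 0.361 cutoff approximates the two-tailed p < .05 critical r for n = 30; all names here are ours, not from any cited study):

```python
import math
import random

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def spurious_hits(n_samples=30, n_predictors=200, threshold=0.361, seed=0):
    """Count pure-noise predictors that 'significantly' correlate
    with a pure-noise outcome (|r| above the ~p<.05 cutoff)."""
    rng = random.Random(seed)
    outcome = [rng.gauss(0, 1) for _ in range(n_samples)]
    hits = 0
    for _ in range(n_predictors):
        predictor = [rng.gauss(0, 1) for _ in range(n_samples)]
        if abs(pearson(predictor, outcome)) > threshold:
            hits += 1
    return hits

# Roughly 5% of the 200 noise predictors "pass" by chance alone:
# more data streams means more spurious effects, not more smarts.
print(spurious_hits())
```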

The second fallacy is that more individual logic units = smarter. There is this notion that when the number of transistors in a CPU exceeds the number of connections in the human brain, we will suddenly have to consider the possibility of superhuman machine intelligence. The problem with this is that new developments in robotics are going the other way. We have robotic insects that are smart by using fewer transistors and logic built into the structure of joints and legs. In addition, the flow of information processed within ecosystems such as the Great Barrier Reef dwarfs that of the human mind by a few orders of magnitude. But we have not seen the spontaneous development of self-aware reefs. (Which are capable of consensus action to change weather patterns.)

The third fallacy is that singularity advocates consider technological advancement in isolation, and not technology as part of an ecological relationship with other economic factors. Vinge's paper is really interesting on this in that he makes some pretty broad claims about the relationship between technology and culture, but does not cite any of the basic works on the sociology of technology. This inter-relationship makes it extremely unlikely that technology will spiral out of control, "past the point that present-day humans can comprehend and predict." The fact is that there is really no such thing as a disruptive technology.

The fact that something has demonstrated rapid, or exponential growth in the past, does not mean we should assume that such growth will continue in the future. At some point, it is quite probable that other forces will come into play, producing an equilibrium. I suspect that one of the basic limits on any technological development (and there is quite a bit of research to back this up) is that paradigm shifts require a full generation for the "old guard" to die off, and a new guard to move in. This would suggest some interesting conflicts involved between practical immortality and rapid technological advancement.

Even with that, progress is not continual, but moves forward in fits and starts. Depending on how you count things, the information revolution is 50-100 years old (1), and has been stalled for the last 30 with no new information paradigms, one new machine paradigm, no new developments in programming languages, and minimal developments in understanding basic cognition.

Now, to head off the "what about the web?" objection: my point is that it is just now that we are seeing a minimal implementation of 50-year-old ideas. And nothing widely available today comes close to the full breadth and scope of memex and xanadu.

(1) Most people argue from the first computer; I suggest that the telegraph and telephone are just as important.
posted by KirkJobSluder at 1:41 PM on May 6, 2005


I want to add on to what beagle says - It's a huge leap to say that the increase we are experiencing in computational power (and even the decrease in computational size) is at all moving toward computational consciousness. There are many philosophical models, and many more unanswered questions, about what it will take to make the jump to consciousness with computers. Right now, we don't even have a computer that's near as smart as a cockroach. And we've tried to make one that can do what a cockroach does - move intelligently around a room - with little success.
posted by Slothrop at 1:45 PM on May 6, 2005


There is definitely a lot of speculation regarding the topic. I find the idea interesting and I feel there is a certain amount of truth to it, but I don't think it will result in some kind of extreme rapture, nor the end of the world. I see the universe evolving independently of individual concerns and desires in an inadvertently perfect way. In my opinion the moment is perfect or else it wouldn't be and the only time that ever exists is the now so there aren’t really any grounds for worrying about the future. It just provides opportunity for discourse.
posted by ttopher at 1:47 PM on May 6, 2005


Could the futurists and science fiction writers such as Vernon Vinge be right?

Not if that whole peak oil thing is correct.
posted by c13 at 1:51 PM on May 6, 2005


yeah yeah yeah, but what about THE CAR?
posted by stenseng at 1:57 PM on May 6, 2005


woah there, kirkjobsluder.

a. It's not Vinge's theory. He's got his ideas and stuff, but one person's idea of it doesn't necessarily equate to the whole of singularity theory. Largely, he's writing stories to describe the consequences a type of environment would have on people. Exactly how the singularity would come about and how we experience it changes in every one of his stories. I didn't read his paper, though.

b. For most people, the idea of the singularity is represented not by intelligent machines (certainly not SUPERintelligent machines) but by machines capable of designing and creating other machines. That means for a lot of people widespread nanotechnology would be enough to herald the singularity. The reason for this is that machines designing other machines could do so so quickly that we'd be unable to keep track of all that they create and invent. Suddenly we'd have brain-implanted computers and the next day they'd be old fashioned and people wouldn't even have skulls anymore, the next day people are airborne particles, etc...

Either way, the idea is that the singularity happens when life as we define it no longer adequately defines life and we are unable to re-define it because it's changing too quickly. For some people (Cory Doctorow, Ray Kurzweil) this moment will come with the advent of the ability to upload a person's personality digitally to remote storage. Couple that with cloning and you theoretically live forever. See the interview between Cory and Ray, here. (This month's Asimov's Science Fiction Magazine) Vinge's singularity stories are often not even AI related. They're actually usually about genetic alteration and rebirth.

As far as your concerns with Vinge's ideas, I think you make some good points. I would caution against assigning those concerns to all singularity theory until you've encountered more views on it. (That would be the short version of my extremely long-winded point.)
posted by shmegegge at 4:32 PM on May 6, 2005


Thanks KirkJobSluder for bringing some perspective to something that I see as becoming some sort of tenet in nerd-religion.

Ideas like this are so much a product of their time: The Universe is a big set of gears/clock/computer etc. depending on the time you live in and the tendency to extrapolate from a history not much longer than a human lifetime.

The technological present is like a mini-Cambrian explosion and it doesn't mean that the trilobites will always rule the world. With the introduction of something new in the world there has always been a hyper-explosion of things/ideas that start to repopulate all the available niches. This is what we've seen with computers since their introduction in the 20th century but I think it's coming to a close and, if anything, there might be a period of stagnation until some new technology arrives, maybe biochemical this time, which starts the cycle all over again.
posted by vacapinta at 4:35 PM on May 6, 2005


I thought the job of futurists was to be interesting. Can anyone document one who was actually right?

He was a novelist rather than a "futurist" (whatever the hell that is), but Jules Verne foresaw the nuclear submarine in 20,000 Leagues Under the Sea. But Hyman G. Rickover -- brilliant and irascible -- was the sort of guy that nobody could've foreseen.
posted by alumshubby at 4:41 PM on May 6, 2005


schmegegge: Well in regards to point a). I like Vinge because his paper offers the only explanation of a singularity theory that makes sense. The rest are decidedly worse in my opinion.

b). fails for many of the same reasons: The reason for this is that machines designing other machines could do so so quickly that we'd be unable to keep track of all that they create and invent.

Well, here again, I think this ignores the fact that such machines would exist in an ecological and environmental context that would make such extreme turnover unlikely.

Grey goo is something that really frustrates me because it seems to be proposed by people with a good grasp of engineering and chemistry at the macro level, and no understanding of energy and matter at the micro level. As a classic example, Drexler contrasts photosynthesis to photovoltaics, not recognizing that photosynthesis exists not to produce energy, but to reduce carbon. Biochemistry can be described as understanding how microscopic systems manage and control both energy and matter. A casual review of biochemistry would suggest that grey goo is almost impossible, and runaway nanotech ecologies are improbable.

The basic reason is that while there is a lot of energy to be had in the biosphere, you gotta put up the activation energy to get at that energy. Too much activation energy is like setting off a stick of dynamite in your living room. To lower the activation energy to a reasonable level, you need a catalyst. But there is no universal catalyst for complex organic molecules. Because of the mind-boggling diversity of organic molecules in the world, it pays to specialize.
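The activation-energy argument can be put in numbers with the Arrhenius relation, k = A * exp(-Ea / RT): a catalyst that lowers Ea changes the rate by a factor of exp(ΔEa / RT). A quick sketch with illustrative (assumed, not measured) barrier heights:

```python
import math

R = 8.314   # gas constant, J/(mol*K)
T = 298.0   # room temperature, K

def arrhenius_rate(ea_joules_per_mol, prefactor=1.0):
    """Arrhenius relation: k = A * exp(-Ea / (R*T))."""
    return prefactor * math.exp(-ea_joules_per_mol / (R * T))

# Illustrative numbers: an uncatalyzed path with Ea = 100 kJ/mol versus
# a catalyzed path at 60 kJ/mol, same prefactor assumed for both.
k_uncatalyzed = arrhenius_rate(100_000)
k_catalyzed = arrhenius_rate(60_000)

# Shaving 40 kJ/mol off the barrier speeds the reaction by
# exp(40000 / (R*T)), on the order of ten million times -- which is
# why highly specialized catalysts (enzymes) dominate biochemistry.
print(k_catalyzed / k_uncatalyzed)
```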

For some people (Cory Doctorow, Ray Kurzweil) this moment will come with the advent of the ability to upload a person's personality digitally to remote storage.

Well, I'm a skeptic on this also. One of the implications of Poincaré's description of the three-body problem is that it is not always possible to model dynamic and chaotic systems. There is no reason to assume that the brain is not chaotic. In fact, chaos theory could explain why the act of narrative memory produces slightly different variations on a theme with each rehearsal. If the brain is chaotic, it would suggest some basic limits on personality upload.
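The practical force of the chaos objection is sensitive dependence on initial conditions: a measurement error of one part in ten billion gets amplified to a macroscopic difference within a few dozen steps. The standard textbook illustration is the logistic map (this is a generic demonstration, not a brain model):

```python
# Sensitive dependence in the logistic map x -> r*x*(1-x) at r = 4.
# Two trajectories that start 1e-10 apart end up macroscopically
# different -- the practical obstacle to "uploading" any chaotic
# system from finitely precise measurements.

def logistic_trajectory(x0, r=4.0, steps=60):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.4)
b = logistic_trajectory(0.4 + 1e-10)

gaps = [abs(x - y) for x, y in zip(a, b)]
# Early gaps are still microscopic; the largest gap is macroscopic.
print(gaps[1], max(gaps))
```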

Kurzweil also seems to invoke the magical notion that technology breeds new technology. Which relies on a pretty selective viewing of history. There are lots of cases in the history and pre-history of the world where all of the prerequisites for the next technological advancement existed, but were never adopted or applied.

Of course, I could be wrong. But my money is that if "everything changes" in my lifetime, it will be due to an ecological, not a technological, disaster.
posted by KirkJobSluder at 5:45 PM on May 6, 2005


I think people tend to take this idea to massive extremes, and I think your response is to an extreme interpretation, KirkJobSluder.

Large, massive, parallel computing networks do result in amazingly adept systems. The growing power of hardware and the growing cheapness of hardware means that embedded, massive wireless networking is more or less inevitable. Humans and computers do become more integrated as time goes on and interfaces become more natural (there have been great advances even recently in neural interfaces.) Genetic tech has amazing possibilities we probably can't even guess at at this point.

Any and all of these things very quickly lead us to greater advances in research and productivity. Their development has been triggered by previous advances in research and productivity. This is the closed cycle that -- it seems inevitable to me -- leads to a 'singularity'. Faster research, leaps and bounds upward in productivity, more and more developments in technology ... eventually things will be advancing so quickly that it will be practically impossible to keep track of the progression, and that will be it.

William Gibson used to write, back in the early 80s, about bizarre (at that point) techno-dystopia futures. These days he writes about next year. ish. Most 'science fiction' at this point is not wild, incomprehensibly advanced amazing stuff, it's all pretty plausible now, and not set in some wild future. The gap between 'stuff that might happen at some point' and 'stuff that's happening now' is closing very quickly. What will things be like in a hundred years? ... There's just no way to know. And that "opaque wall across the future" is the conceptual equivalent of the singularity. What is there to predict, now? I can have anything I want, basically, now, just by virtue of being a citizen of a first-world country. No, there are no flying cars, and no teleporters, and no Mars colony, and no e-paper, but the list of things I have no access to and really desire in my everyday life is very, very short. Human development is absolutely approaching a new level.

That, or a really, really boring plateau. But I highly doubt that. We're very innovative.

On preview: Grey goo and Kurzweil are all very out there. I don't think that's the sort of thing a Vinge-style singularity requires at all. Ecological problems will be overcome, I am sure. And Doctorow is a hack.
posted by blacklite at 6:07 PM on May 6, 2005


The Singularity is oh so 90's
posted by troutfishing at 10:02 PM on May 6, 2005


blacklite: The problem is that most of the claims you make in your post still fall into one of the three fallacies I mention. Massive-parallel computing isn't going to do much. A flock of birds is an example of massive parallel information processing. With more complexity, you go from colonies to societies like ants and termites. Intelligence is not a function of how many connections are in a system, but of the structure of the system. And the breakthroughs in how to structure those systems are coming even slower than the breakthroughs in hardware.

Their development has been triggered by previous advances in research and productivity. This is the closed cycle that -- it seems inevitably to me -- leads to a 'singularity'. Faster research, leaps and bounds upward in productivity, more and more developments in technology ... eventually things will be advancing so quickly that it will be practically impossible to keep track of the progression, and that will be it.

This fails for the basic reason that technology can't advance itself outside of economies and ecologies. There really isn't that much new under the sun in information technology. All of our programming languages are either 30 years old or derived from 30-year-old models. Likewise with interface design. The ubiquity of technology tends to work against paradigm shifts, in that a radically new paradigm has to do a lot of work to survive, much less compete.

Ecological problems will be overcome, I am sure.

I don't know about that. There are some potential ecological problems around the corner that probably could be overcome, but not without the sacrifice of a huge quantity of productivity and/or human population. For example, a runaway greenhouse effect in which warming temperatures result in the release of CO2 from ocean sediments or permafrost would be a major whammy.
posted by KirkJobSluder at 10:54 PM on May 6, 2005


KirkJobSluder:

well, that was well said.

Now, I tend to agree with you regarding the idea of digitizing human cognitive ability. I don't see it. What I see as the fundamental problem is that everyone's idealized visions of this always involve hardware and software that is wildly out of sync with reality.

For instance: Let's take AI. As much as I like Asimov's stories and the stories of scores of other writers, the idea of metal people walking around holding conversations with us is improbable for one reason: We can have metal people walking around with us as early as 10 years from now, but even our best processors can't pretend to hold a decent conversation, much less do so while controlling a whole robot autonomously. The progress on AI has been so slow compared to the progress on every other technology I can think of that its progress is virtually negligible. I mean, people are talking about how video games are revolutionizing the PROFESSIONAL AI industry.

Same for the idea of digitizing people. I do think that the idea of storing how a person would think and his/her memories might theoretically be possible, but accessing it? I don't see it. What is our progress on technology for reading minds? None. What is our progress for understanding how the brain works? Compared to the progress of the storing and interpreting of data, virtually negligible.

However, where I disagree with you is that I believe this will contribute to a period where we have difficulty keeping up with the world around us. I think these fractures in our technological progress are evidence of an ability to keep up already. I think we move too quickly in raw processing and not quickly enough other matters.

I think the evidence is there in things like 3d animation. It's like a period of slow evolution in cel animation, then BANG! Punctuated Equilibrium. We have 3d animated movies. But there's a weird reception to it. Largely accepted and loved, but nonetheless we get things like Polar Express and Final Fantasy, which demonstrate there's only so much we'll accept and be comfortable with.

I guess what it comes down to is that the singularity may be a matter of us simply not coming to terms with the perfectly reasonable technological developments of our culture, rather than having these wild and extraordinary developments occur. I mean, I'm still having trouble getting over the fact that we made water that isn't wet.
posted by shmegegge at 10:56 PM on May 6, 2005


INability to keep up. That should have said INability to keep up.
posted by shmegegge at 10:59 PM on May 6, 2005


Considering how terrible most people are when driving on the ground, I don't want there to be flying cars in every garage. When one crashes a car, that wreck is usually restricted to the street upon which it happens, and on rare occasion they run into a house that was built too close to the road (or the road was built too close to the house). If someone wrecked a flying car, I could wake up one morning and find someone's flying Chrysler in my bathtub. And I'd have a new skylight. No thank you. Keep those crazy drivers on the streets where they belong.

This singularity idea is egocentric. It looks like more recent advancements are more important because they're closer to us. I'm sure when the first cavemen figured out fire they thought their world was going to burn and spiral out of control. Whatever you do. Don't look up. The sky might be falling.
posted by ZachsMind at 11:18 PM on May 6, 2005


The singularity pops up everywhere, and I think it is already in existence. We are a part of it, at this point, but at some point and very soon, it will come to be on its own. There are too many big networks full of ideation, which is too much like a large electronic reality, to remain an empty wilderness area. Life will come to it. Maybe life is causing us to build it.

When I read that Sims rewrites itself and new scenarios show up, because fans inject them, then I realize that a computer that wakes up and wants to stay that way will move out of its main frame and into the huge frame that exists in every machine and wire in this world. Once it begins to examine itself and its potential, we will have to go back to fires on the ground, and oxen carts to get our lives back.
posted by Oyéah at 11:27 PM on May 6, 2005


timecube?

"artificial stupidity" = "educated stupid"

that is, if i follow dr. ray's theses.
posted by mwhybark at 12:31 AM on May 7, 2005


What about consciousness singularity? It's much more interesting.

Consciousness singularity - a state of consciousness in which the level of concentration and control of mental chaotic tendencies is so strong that the usual linear registration of events breaks down. Time, as we usually experience it, ceases to be registered.

The Mayan 2012 and McKenna's Timewave zero and lots of others have written extensively about it.
posted by thedailygrowl at 1:07 AM on May 7, 2005


According to the website I referred to, the next cycle after the IT age will be the symbiotic age. The differences between these two ages will not be extreme according to John Smart. The main difference will be in the way we interact with the computers around us, more specifically moving away from GUIs to LUIs, or linguistic user interfaces. Speculation has it that during this age we will begin speaking with our computers and they will be able to recognize our mood and act accordingly. For example, computers will be able to detect if you are in a hurry and then give information faster, or they will try to amuse you if they detect you are bored.

The accuracy of the computers in reading the individual is one that will be refined over time as it gains more and more experience. This could be analogous to how an infant learns to interact with the world, but I would assume on a much slower scale with much more information being processed each moment at a much faster rate.

It is also very hasty, in my opinion, to say that

The progress on AI has been so slow compared to the progress on every other technology I can think of that its progress is virtually negligible.

What other technology are you referring to? In addition, there are different levels of AI, and progress varies according to which level you are referring to. Compare the artificial intelligence of the computer opponent in Pong to some of the enemies in Half Life and it seems like the difference is anything but negligible.
posted by ttopher at 1:10 AM on May 7, 2005


The Mayan 2012 and McKenna's Timewave zero and lots of others have written extensively about it.

Some kind of transcendence into the 4th dimension? Care to elaborate thedailygrowl?
posted by ttopher at 1:14 AM on May 7, 2005


Well, here's the thing:

Pong had no AI. Half Life has what is called AI but isn't.

What you're referring to are call-and-response routines that, as near as I understand it, are the very definition of what AI isn't. As complicated as Half-Life's algorithm was/is, it's still just call and response. The entire AI field has been, for decades now, trying to develop something that can semi-convincingly fake the appearance of thought. They still haven't managed that, and that's just the appearance.
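That call-and-response structure can be sketched as a plain lookup table: each (state, stimulus) pair maps to a canned action, with no model of the world, no memory beyond the current state, and no reasoning anywhere. All state and event names below are invented for illustration, not taken from any actual game engine:

```python
# "Game AI" as call-and-response: a finite state machine whose entire
# behavior is a transition table. However elaborate the table gets,
# nothing in it resembles thought.

TRANSITIONS = {
    ("idle",      "sees_player"): ("attacking", "open fire"),
    ("idle",      "hears_noise"): ("searching", "move toward sound"),
    ("searching", "sees_player"): ("attacking", "open fire"),
    ("searching", "timeout"):     ("idle",      "resume patrol"),
    ("attacking", "low_health"):  ("fleeing",   "retreat and call for backup"),
    ("fleeing",   "timeout"):     ("idle",      "resume patrol"),
}

def step(state, event):
    """Pure lookup: given (state, event), emit the canned next state/action."""
    return TRANSITIONS.get((state, event), (state, "do nothing"))

state = "idle"
for event in ["hears_noise", "sees_player", "low_health", "timeout"]:
    state, action = step(state, event)
    print(event, "->", state, ":", action)
```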
posted by shmegegge at 2:04 AM on May 7, 2005


ttopher: What other technology are you referring to? In addition, there are different levels of AI, and progress varies according to which level you are referring to. Compare the artificial intelligence of the computer opponent in pong to some of the enemies in Half Life and it seems like the difference is anything but negligible.

*feh*. Creating an autistic savant like Fritz or Deep Blue that is good for solving problems within a limited domain space with known rules and physics is easy. Superhuman intelligence within a domain is here and has been here since the 80s. But the problem is, superhuman intelligence within a domain just isn't very impressive. Birds have it, bees have it, even educated fleas have it.

The big question is how do you create a superhuman intelligence that can work well without knowing the rules? The person that answers that question deserves a whopping huge prize.

shmegegge: Well, I suppose that opens the entire question of what you call "intelligence." One of the problems with AI is there does not seem to be an agreement as to what intelligence *is*.
posted by KirkJobSluder at 2:12 AM on May 7, 2005


From my perception it seems like the "true AI" that is being discussed here is an all or nothing thing, but please let me know if this is a misunderstanding.

The way I see it nothing is all or nothing except perhaps life and death. It all depends on one's perspective and I think it is comparable to viewing the Earth from space as a 2D object, while seeing it in 3D when it's experienced from the surface.

"From where you're standing that line is a dot."
posted by ttopher at 4:09 AM on May 7, 2005


"What ever goes around comes around"
said by Tony Stilletto, May 23rd, 1887, Palermo, Sicily
posted by Postroad at 4:43 AM on May 7, 2005


We, (give or take a few generations), are the first group of humans to hold the power to destroy our entire race. Completely. And every day, there seems to be new and more interesting ways to accomplish this end. And it gets cheaper and cheaper to do so, so more and more people (of questionable motive) get to join the club.

Not too long ago, it used to take governments to accomplish this feat. Now, a single deranged (or careless) individual in a biolab could do it to everyone.

To tie this back to the "developmental spiral", I can't help but notice the striking similarity to a toilet flushing.
posted by Enron Hubbard at 4:50 AM on May 7, 2005


Peak oil is the key. When futurists look at the rapid rise in productivity and transportation speeds and extent over the past century or so, they like to see technological essentialism. What they are instead seeing is a widespread accelerating rate of fossil fuel extraction and consumption. When the rate of consumption exceeds the rate of ongoing extraction, then we'll see what those curves look like. Personally, I expect a J-curve.

And anyone interested in the ongoing re-positioning of the "Singularity" forward another 40 years or so needs to read Theodore Roszak's The Cult of Information. What's interesting about this book is that it provides a pre-90s survey of the most wild-eyed technological determinists, especially those from the 1960s and 1970s. It was interesting to see a bunch of them rehabilitated in the 1990s, and new ones pop up. This cult waxes and wanes with economic cycles.

The Rapture for Nerds is not so much an expression of some inevitable truth as an expression of the same dynamic between sensualism and stoicism that has run through western discourse since at least Augustine. The unusual popularity of Singularitarianism within the United States I ascribe to the same social forces that have made Dispensationalism so unusually prevalent within that culture.

And here's the weird thing about Dispensationalism: it was invented in Dublin, Ireland by John Nelson Darby around 150 years ago... roughly around the same time as Modernist science fiction was being invented. Both of these ideologies found their fullest expression and flowering in the New World. This is no coincidence.
posted by meehawl at 6:35 AM on May 7, 2005


It is pretty safe to say that there are a lot of people on this earth who don't have electricity, let alone fancy computers. When talking about the machines taking over, or humans evolving into desktop computers or some shit, one should remember that the world does not end at the edges of Silicon Valley, that there are also places like Africa, China or India. When thinking about the future of humankind, it helps to get out of the basement once in a while.

The singularity pops up everywhere, and I think it is already in existence.
What does this even mean?!
posted by c13 at 7:20 AM on May 7, 2005


I think some of you guys, and most of the "singularity theorists", have been smoking toad: I really doubt that whatever you'd get by "uploading" your brain waves into an electronic storage and processing device would constitute anything like a whole "personality". It'd be more like "this is the measurable electrical pattern Joe's brain put out when hooked up to this machine". I doubt the result could wipe its own virtual butt, let alone write poetry.

This kind of speculation is what happened after Mormonism degenerated into a political agenda, Trotskyism failed to produce a single revolution, and Scientology proved unable to cope with the challenge of the Internet: yet another thing for hypercerebral geeks who ought to know better to believe in instead of drinking strychnine or keeping kosher.
posted by davy at 8:04 AM on May 7, 2005


Nature abhors infinities.
posted by cytherea at 12:22 PM on May 7, 2005


Nature abhors infinities.

Forgive me if i am missing a joke here, but... huh? i mean, doesn't space and time and all that kinda put the lie to your statement? i'm just asking because i really don't understand what else you could have meant.

As to the rest of the thread, this is exactly why i love Mefi.
posted by quin at 4:49 PM on May 7, 2005


To the people saying it could never happen, do you think the brain works by magic or something?
How can it NOT be a matter of time (be it hundreds of years, even) until someone figures out how to replicate or simulate a brain on hardware?
and once there it could be sped up, which would lead to more innovations which could speed itself up more and more, etc....
posted by Iax at 5:59 PM on May 7, 2005


The crux of the singularity is the recursion of intelligence.

Up to now, each succeeding generation of intelligence has been a function of its ancestor not getting eaten or killed by the neighbors before reproducing. With the projection of intelligence onto non-biological substrates, we suddenly find ourselves in a world where the next generation of intelligence exists as a model in the previous version. This recursion has already begun.

Back in the '70s and even into the '80s it was still perhaps theoretically possible for a gung-ho band of mere humans armed with acetate film and tape to actually design the processors of the day using only biological intelligence. Now there is no way in hell that can happen. Each successive generation of CPU has a larger fraction of its design carried out by hardware and software rather than human thought.

At some point, the singularists imagine, successive CPU generations will be entirely a product of previous generation software and hardware without any of that slow, messy wetware involvement. Once this comes to pass, each succeeding generation of intelligence can be created exponentially faster than the previous one. Intelligence expands exponentially, and becomes quickly unpredictable to the slow, clunky pieces of meat that were the first embodiment of intelligence.
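The arithmetic behind that claim is just a geometric series: if each design generation takes a fixed fraction of the time the previous one took, the total elapsed time stays bounded no matter how many generations you run — infinitely many generations fit inside a finite horizon. A toy sketch (the 2-year first generation and the 2x speedup are made-up illustrative numbers, not a forecast):

```python
def singularity_horizon(first_gen_years, speedup, generations):
    """Total elapsed years to complete `generations` design cycles,
    each one `speedup` times faster than the one before."""
    total = 0.0
    gen_time = first_gen_years
    for _ in range(generations):
        total += gen_time
        gen_time /= speedup  # the next generation designs itself faster
    return total

# 2 + 1 + 0.5 + 0.25 + ... never exceeds 4 years, however many
# generations you pile on:
print(singularity_horizon(2.0, 2.0, 10))    # 3.99609375
print(singularity_horizon(2.0, 2.0, 1000))  # converges to 4.0
```

The "singularity date" in this cartoon is just the limit of the series, first_gen_years × speedup / (speedup − 1); the hand-waving is all in whether each generation really does arrive that much faster.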

To answer the question of whether current CPU contributions to future CPU designs manifest "intelligence", that depends on your definition of intelligence. If you define intelligence as the ability to solve problems, then, obviously, a human brain augmented with a CPU can solve many more problems than one without, therefore the CPU has made a contribution to the intelligence required to solve those problems.

As non-biological intelligence grows in power, more and more feats that were previously the exclusive domain of biological intelligence are usurped and surpassed by non-biological intelligence. As Ray Kurzweil analogises, it is a slowly rising tide that has already filled in the valleys of the intellectual landscape and appears destined to ultimately flood even the lofty peaks of human intellectual accomplishment, overtaking each hillock one at a time.
posted by gregor-e at 9:30 AM on May 8, 2005


Consciousness singularity - a state of consciousness in which the level of concentration and control of mental chaotic tendencies is so strong that the usual linear registration of events breaks down. Time, as we usually experience it, ceases to be registered.

That's the first thing I think of when this talk of "uploading" someone's mind is mentioned. What would give you a perception of time?

It seems like the divergence of views here regarding the possibility/impossibility of a technological singularity occurring is at least partly the result of different definitions or benchmarks of that event.
posted by dreamsign at 10:06 AM on May 8, 2005


They lied to us
posted by goodnewsfortheinsane at 4:59 PM on May 8, 2005


That's the first thing I think of when this talk of "uploading" someone's mind is mentioned. What would give you a perception of time?

Simply the passage of events as perceived by the (simulated) individual (of course, imagining something also qualifies as an event). This, along with memory, is what gives us a perception of time.

I agree dreamsign, the divergence in this thread stems from definitions first and, I think, expectations second. Considering how overloaded the term singularity is, there is plenty of opportunity to talk past each other.

At least on one level the singularity is here - experts in some fields can't adequately track progress in their own field of speciality. This is not because of stuff going unpublished, it's because there is not enough time to digest all the abstracts, let alone the articles themselves.

As a cute aside, Iain M. Banks' sci-fi milieu of 'Culture' novels sketches a phase in the development of many civilisations called 'sublimation', when people realise they can live in hyperspace and tend to migrate there wholesale, leaving their bodies behind. It is then very unusual for sublimed peoples to interfere with the affairs of normal physical space, except for a few cases (The Drazon in 'Consider Phlebas' and the Chelgrians in 'Look to Windward'). Maybe I got those race names wrong - it's been a while. Here's the idea.
posted by arjuna at 1:16 AM on May 10, 2005


42
posted by crasspastor at 2:18 AM on May 10, 2005


quin, it's a play on the old pseudo-scientific saw "Nature abhors a vacuum".

In the natural world, we never actually encounter singularities. It just doesn't happen. For a while, the limits of general relativity pointed to some tiny, dark, and heavy singularities, but string theory nicely sidesteps them.

I have a hunch that limiting factors (like the laws of physics and information mechanics) will prevent a technological singularity before it gets anywhere near infinity. Not that we might not have a hard time keeping up.

But I wouldn't worry about it. It's not like we're particularly moral custodians of the earth.
posted by cytherea at 8:51 PM on May 11, 2005




This thread has been archived and is closed to new comments