Singularity
January 21, 2008 9:35 PM

The Singularity Institute for Artificial Intelligence has put up some interesting media, including a variety of talks from the Singularity Summit 2006 and 2007, about the possibilities and progress of technological development. For an overview of the issues, Ray Kurzweil talks about the ideas and promises of the singularity, while Douglas Hofstadter calls for deeper exploration of the implications and hazards of coming technology.
posted by MetaMonkey (43 comments total) 22 users marked this as a favorite
 
A future that contains smarter-than-human minds is genuinely different ...

Uh huh. I remember when I used to believe that human intelligence was the driving force in shaping the world we lived in.

Good times.
posted by tkolar at 9:48 PM on January 21, 2008 [3 favorites]


while Douglas Hofstadter calls for deeper exploration of the implications and hazards of coming technology.

Considering that we're pretty much failing to engage in a deeper exploration of the totally evident implications and hazards of current technology, Mr. Hofstadter is being exceptionally optimistic about that one.
posted by nanojath at 9:53 PM on January 21, 2008 [2 favorites]


I remember when I used to believe that human intelligence was the driving force in shaping the world we lived in.

Define human intelligence.
posted by empath at 10:03 PM on January 21, 2008 [1 favorite]


Define intelligence.
posted by ook at 10:09 PM on January 21, 2008 [1 favorite]


Define Defining.
posted by bigmusic at 10:11 PM on January 21, 2008 [1 favorite]


Define define.
posted by tbastian at 10:12 PM on January 21, 2008 [1 favorite]


thanks MetaMonkey. The strange thing about the singularity is that we want to prepare for it somehow, but as Vinge says, our model of the world breaks down when it tries to model a future that contains entities smarter than human.

basically any of our predictions beyond 20 years are going to be wildly wrong.
posted by bhnyc at 10:20 PM on January 21, 2008


Define Singularity.

Oh... OK, let's see, a Singularity is...

[...trumpets, angels singing, cosmic thunder...]

I HAVE ACHIEVED GODHOOD

FREE PIZZA FOR EVERYONE
posted by Avenger at 10:21 PM on January 21, 2008 [5 favorites]


I thoroughly enjoyed Bill McKibben's 'floating head'* talk as a counterpoint to the rampant speculation of the other speakers.
"The more things change, the more they stay the same" is a saying that (I predict) will never get old.

*His talk was recorded and projected on a transparent screen.
posted by FissionChips at 10:37 PM on January 21, 2008 [1 favorite]


How will the Technological Singularity fare against the coming Stupidity Singularity? Think about it: after another fifty years of reality teevee, Fox News, dumb internet crap, crumbling infrastructure, idiotic politicians, and a genetically modified junk food supply, humans are going to be incapable of even maintaining these so-called "smart" machines.
posted by jefbla at 10:48 PM on January 21, 2008


the point is that we won't need to maintain them.
posted by empath at 10:51 PM on January 21, 2008


or rather, they won't need us to maintain themselves.
posted by empath at 10:51 PM on January 21, 2008


Define human intelligence.

Intelligence is the tool we use to solve problems requiring abstract thought.

It is, at the end of the day, a tool and that's all. Making it bigger and better and stronger will lead to better toys, but it won't solve centuries of racial tensions or the human population expanding beyond what the ecosystem can support, or provide a meaningful purpose to existence.

Super-intelligence is a dead end, sought only by people who have done well in this world by using their brain and who can only see more of the same ahead.

(this isn't to knock AI research in general, which I think will eventually provide us all with excellent toys)
posted by tkolar at 11:00 PM on January 21, 2008 [3 favorites]


The singularity anticipated with regard to intelligence is pretty nuts. For one thing, Moore's Law stopped working several years ago. For another, we don't have any real idea how much processing power the brain has (ya, we can count the number of neurons that fire every second, but I don't think that is the same thing). Finally, context is everything, and animal life has millions of years of that. Artificial intelligence might happen, one day, but it appears to be a long way off yet - the last decade of lack of progress demonstrates that well enough, I think.

The much more interesting type of singularity, to me, is in manufacturing cost. We just had a post about machines that can build themselves. Manufacturing overhead (I mean the cost over and above raw materials) is rapidly declining, and the implications of that are being felt in the near term.
posted by Chuckles at 11:01 PM on January 21, 2008 [1 favorite]


Dude, I'm watching Terminator: The Sarah Connor Chronicles and they mentioned the Singularity, just now as I'm reading this. I think my TiVo is self-aware!

Plus, it knows what shows I like and records them for me!
posted by mrnutty at 11:06 PM on January 21, 2008 [1 favorite]


I think that my problem with Singularity speculation is that it's based on a model of continued exponential growth. Not only that, it extrapolates from observed exponential growth in the rate of miniaturization to a corresponding increase in intelligence. I don't see any reason to believe that the latter follows from the former.

Think about it this way. A couple of days ago we had this post about universal computing devices. In a formal sense, the computers we have now are just as powerful as the ones we have had for years. The algorithm for an intelligent computer has been implementable as long as computers have existed. The problem is not that the machines haven't been fast enough, or that we need to hit some magical speed to suddenly have intelligence. The problem is that the theory is not in place yet; the problem of strong AI -- creating human-level intelligence -- is ill-posed, and is not being actively worked on in a way that lets you plot progress as a dependent variable against time.

I'll just cap this with a 2005 AI Magazine article by Nils Nilsson: Human-Level Artificial Intelligence? Be Serious! It's a good overview of the work that remains to be done on constructing human-level AI.
posted by agent at 11:43 PM on January 21, 2008
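agent's formal-equivalence point is easy to make concrete. The sketch below is a bare-bones Turing machine simulator in Python; the helper and the sample machine (a binary incrementer) are toys invented here for illustration, not anything from the linked article. The rule table alone determines what gets computed — running it on a machine a thousand times faster changes the wall-clock time, never the answer.

```python
def run_tm(rules, tape, state="start", head=0, max_steps=10_000):
    """Run a Turing machine. `rules` maps (state, symbol) to
    (write_symbol, move, next_state); `move` is -1 (left) or +1 (right)."""
    cells = dict(enumerate(tape))        # sparse tape: position -> symbol
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, "_")    # "_" is the blank symbol
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Increment a binary number: walk right to the blank past the last digit,
# then carry 1 -> 0 leftward until a 0 (or a blank) becomes 1.
rules = {
    ("start", "0"): ("0", +1, "start"),
    ("start", "1"): ("1", +1, "start"),
    ("start", "_"): ("_", -1, "carry"),
    ("carry", "1"): ("0", -1, "carry"),
    ("carry", "0"): ("1", -1, "rewind"),
    ("carry", "_"): ("1", -1, "rewind"),
    ("rewind", "0"): ("0", -1, "rewind"),
    ("rewind", "1"): ("1", -1, "rewind"),
    ("rewind", "_"): ("_", +1, "halt"),
}

print(run_tm(rules, "1011"))  # prints 1100 (11 + 1 = 12)
```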


but it won't solve centuries of racial tensions or the human population expanding beyond what the ecosystem can support, or provide a meaningful purpose to existence.

Maybe the last one is hard, but if people or computers all got to be a few orders of magnitude smarter I think they'd be smart enough to drop stupid racial identities and hatreds and to put a rubber on their willy.
posted by TheOnlyCoolTim at 11:53 PM on January 21, 2008


Will the emergent intelligence be allowed a MeFi account?
posted by sien at 11:53 PM on January 21, 2008


Dude, I'm watching Terminator: The Sarah Connor Chronicles and they mentioned the Singularity

This should be emailed to every single science fiction author out there who might be tempted to start a new magnum opus of singularity gobbledegook.
posted by Artw at 12:00 AM on January 22, 2008


Will the emergent intelligence be allowed a MeFi account?

No, it will need $5, just like everybody else.
posted by Chuckles at 12:01 AM on January 22, 2008


The power of hardware means nothing when software sucks.

I work with some very, very competent software developers. Software sucks. No matter how far up you go. It's just that the higher up you go the more they know it sucks.

I don't think singularitists (I made that word up) truly realize that there is not some magic that occurs when you have very fast processors or very fast parallel computing environments. Humans need to be able to comprehend the world and construct software around these horrible, intractable problems before they get solved.

i.e., we have a lot of work to do. Human work. The numbers will not magically do it for us.
posted by blacklite at 12:06 AM on January 22, 2008


The problem is not that the machines have not been fast enough, and that we need to hit some magical speed to suddenly have intelligence.

Exactly. It's not like if some guy across the street from Watson & Crick had a Magiputer™ he would have suddenly known about DNA. He would just have a really fast way to play Quake or something, which he would have.
posted by blacklite at 12:09 AM on January 22, 2008


Did anyone else pick up on the immense tension between Hofstadter and Kurzweil?
posted by farishta at 12:26 AM on January 22, 2008


This is a nicely produced website. The fact that they bothered to make transcripts of almost everything is particularly cool.
posted by teleskiving at 2:17 AM on January 22, 2008


Singularists are the optimistic twins of the Peak Oil apocalyptists. One side assumes never-ending exponential growth, which will inevitably lead to rainbows and supersmart self-conscious AI; the other focuses exclusively on the limits of growth and how horrible it will be when we hit the wall.

I just wonder which will come first.
posted by sophist at 2:54 AM on January 22, 2008 [1 favorite]
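sophist's twins actually share one modeling choice, and a few lines of arithmetic show why both camps can point at the same data: an exponential curve and a limits-to-growth (logistic) curve are nearly indistinguishable early on. A throwaway numerical sketch, with made-up rate and ceiling:

```python
# Pure exponential growth vs. logistic growth, which tracks the exponential
# closely until it nears a ceiling. The parameters are arbitrary; only the
# shape of the comparison matters.
rate, ceiling = 0.5, 1000.0
exp_x = log_x = 1.0
for year in range(31):
    if year % 5 == 0:
        print(f"year {year:2d}: exponential {exp_x:10.1f}   logistic {log_x:7.1f}")
    exp_x += rate * exp_x
    log_x += rate * log_x * (1 - log_x / ceiling)
```

For the first fifteen "years" the two columns stay close; after that one runs away and the other flattens at the ceiling.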


The Singularity is bunk. AI has been ten years away for fifty years now. I see no reason to believe it won't remain ten years away indefinitely.
posted by flabdablet at 2:56 AM on January 22, 2008


Faster computing is not completely useless for generating better ideas. For example, something like the prime number theorem took a well-funded super-genius like Gauss to spot with pen and paper, but a prime-generating Python script and some plotting utilities would give anyone a strong visual clue to the pattern.

Perhaps the Singularitarian (?) position is that millions of people experimenting with powerful tools will be more likely to hit on the "intelligence algorithm" than a handful working with pen and paper and pure theory.
posted by hoverboards don't work on water at 3:44 AM on January 22, 2008
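That prime-generating script is a few minutes' work. Here's a rough sketch of the idea (invented for this thread, and printing a table rather than plotting): the ratio pi(x) * ln(x) / x creeping slowly toward 1 is exactly the pattern the prime number theorem pins down.

```python
from bisect import bisect_right
from math import log

def sieve(n):
    """Return all primes <= n (sieve of Eratosthenes)."""
    is_prime = [True] * (n + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            for multiple in range(p * p, n + 1, p):
                is_prime[multiple] = False
    return [i for i, flag in enumerate(is_prime) if flag]

primes = sieve(1_000_000)
for x in (10**3, 10**4, 10**5, 10**6):
    pi_x = bisect_right(primes, x)   # count of primes <= x
    print(f"x = {x:>9,}   pi(x) = {pi_x:>7,}   pi(x)*ln(x)/x = {pi_x * log(x) / x:.4f}")
```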


Is the Singularity about to happen? No.

Will it ever happen? Yes, and soon enough that we are beginning to feel it coming. Biological evolution ends with the human race creating its non-biological replacement.
posted by tgyg at 4:24 AM on January 22, 2008


The Singularity happened centuries ago. I won. The end guy was hard.

You're all just ghostly figments in the cluttered corners of my Jupiter Brain.

Yeah, your simulation sucks. Sorry about that. You'll just have to trust me. I promise that it's much better on the other side of the firewall. Peace, love and rivers of chocolate and frolicking winsome lesbians and all that. Good stuff.

No, you can't have any. You'll just get it all dirty and corrupt the sim.
posted by loquacious at 5:15 AM on January 22, 2008 [1 favorite]


My story is the Singularity happened around 1989 and I'm sticking to it.
posted by localroger at 5:29 AM on January 22, 2008


As fun as AI is.... Why are people not like finding a way to make a renewable source of energy/something free and better than gas? Haven't these people watched Sci-Fi horror movies?!?!?!?! The machines take over and Humanity lives in some post-apocalyptic nightmare where I gotta call water aqua or something lame like that. And everything turns into a dessert too for no good reason.... AND you can forget about free pizza fridays!
posted by Mastercheddaar at 6:31 AM on January 22, 2008



They still haven't made a pill to make my farts smell better.

First things first.
posted by Bathtub Bobsled at 6:40 AM on January 22, 2008


Define Singularity.

1: A belief central to a quasi-religious cult of personality around prophets who make bold futuristic claims in fundamental contradiction with known theories of economics, culture and technology.
posted by KirkJobSluder at 6:41 AM on January 22, 2008


The problem with singularity predictions is that technology is ecologically dependent on culture and economics. "Disruptive" technologies are either marginalized, appropriated by the dominant economic systems, or killed in the cradle.
posted by KirkJobSluder at 6:49 AM on January 22, 2008


You know, the social benefits of intelligence aren't questioned - if your IQ is higher you get paid well, have better education, live longer and healthier, etc.

This is true up to about an IQ of 150. Above that, all that increasing IQ buys you is a sharply higher incidence of mental instability and membership in some seriously geeky clubs.

Based on my own personal experience with very-high-IQ people, I assume that when the Singularity occurs, we will see virtual gatherings of ultra-intelligent machines, where D&D will be played at 200 GHz and there will be much complaining about how women would rather date a mean jerk than a Singularly Intelligent AI device.
posted by ikkyu2 at 7:13 AM on January 22, 2008 [5 favorites]


Biological evolution ends with the human race creating its non-biological replacement.

Biology... dissing itself since 1989.

They still haven't made a pill to make my farts smell better.


Pssh, speak for yourself, my farts already smell awesome. -sniff- Ahhh! -poot- -sniff- Ahhhh! Smells like a strange loop!
posted by Laugh_track at 7:15 AM on January 22, 2008


AI will have to be evolved and grown out of a primordial code soup of fundamental qubits. The problem, of course, will be that when this works we will know nothing more about how to design AI systems and neither will the AI.
posted by effwerd at 7:36 AM on January 22, 2008
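effwerd's "code soup" point can be shown in miniature. Below is a toy genetic algorithm (the target string and all parameters are made up for illustration): bit strings evolve toward a fitness target through mutation and selection, and the winning genome pops out without anyone, the program included, learning how to design one directly.

```python
import random

TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]

def fitness(genome):
    """Number of positions that match the target string."""
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.05):
    """Flip each bit independently with probability `rate`."""
    return [1 - g if random.random() < rate else g for g in genome]

# Random "soup" of 100 genomes, then repeated rounds of selection + mutation.
population = [[random.randint(0, 1) for _ in TARGET] for _ in range(100)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        break
    survivors = population[:20]  # truncation selection: keep the top fifth
    population = [mutate(random.choice(survivors)) for _ in range(100)]

population.sort(key=fitness, reverse=True)
print(f"generation {generation}: best genome {population[0]}")
```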


tgyg: Will it ever happen? Yes, and soon enough that we are beginning to feel it coming. Biological evolution ends with the human race creating its non-biological replacement.

And just as with young earth creationists, the best response to this kind of rubbish is to point and laugh.
posted by KirkJobSluder at 7:45 AM on January 22, 2008


AI ain't getting here any time soon. If you want more scientific progress, the best idea is : parallelization of humans.

We could be working on mice right now, and eventually move to monkeys. But maybe, if we're lucky, we might learn how to let people benefit from parallelization about the time Kurzweil's promises of true AI have failed.

Plus the project has enormous side benefits in terms of advances in brain implants and general knowledge of how the brain works.
posted by jeffburdges at 8:04 AM on January 22, 2008


I disagree with people who are certain that AI won't make any progress in the near future.

I think it's clear to most people (outside of MIT) that the current approaches have played out, and there are a lot of young smart people trying everything under the sun right now. If ever a field was ripe for a breakthrough, AI is it.



Pssh, speak for yourself, my farts already smell awesome.

"No man dislikes the smell of his own farts" -- Icelandic proverb
posted by tkolar at 8:52 AM on January 22, 2008 [1 favorite]


And everything turns into a dessert too for no good reason....

Either you're confused about the spelling of "desert", or you've just suggested the most awesome future ever imagined by man.
posted by Parasite Unseen at 9:27 AM on January 22, 2008


With Death by Chocolate.
posted by weapons-grade pandemonium at 9:34 AM on January 22, 2008


localroger : My story is the Singularity happened around 1989 and I'm sticking to it.

My theory is that localroger is correct. Mainly because it would explain the artificial intelligence I saw the other week cruising around in an old Ford Escort, listening to Guns N' Roses and Mötley Crüe, and going on and on about beating Super Mario Brothers.

I kept saying that it was living in the past, but apparently it was stuck in a recursive loop.

So I borrowed $100, saying that I was going to invest it in Minidiscs.

Sucker.
posted by quin at 1:42 PM on January 22, 2008

