Kamp Kurzweil
June 14, 2010 8:11 AM   Subscribe

The Singularity University is here. Founded by Ray Kurzweil and Peter Diamandis, Singularity University aims to pave the way to our posthuman future. Can't afford the $15,000-$25,000 "tuition"? Buy a singularitee, instead!
posted by adamdschneider (88 comments total) 15 users marked this as a favorite
 
Computer Jesus Saves

CTRL†S
posted by sonic meat machine at 8:13 AM on June 14, 2010 [9 favorites]


I can't help reading the "I [heart] S H" t-shirt as "SHIT" -- I guess it's my pathetic ol' meat brain acting up on me? Also, and I know others have said it before but it always bears repeating, these guys just remind me of Christian youth kids pining for the Rapture.
posted by aught at 8:17 AM on June 14, 2010


I need this sentiment (but with a better image) on a tshirt, STAT.
posted by DU at 8:17 AM on June 14, 2010


Finally a religion atheists can get behind.
posted by Solon and Thanks at 8:25 AM on June 14, 2010 [2 favorites]


Buy a t-shirt!

Show your meatsuit that you care!
posted by Pronoiac at 8:26 AM on June 14, 2010 [1 favorite]


The first rule of singularity university is that you are already too late.
posted by srboisvert at 8:31 AM on June 14, 2010 [2 favorites]


Finally, a university where I can study a future that I have, by definition, absolutely no way of predicting, understanding, or comprehending! Can't wait for those Under-Grey-Goo Basket Weaving classes to start!
posted by cthuljew at 8:32 AM on June 14, 2010


I really dislike the smug assertion that Atheists do have a faith, it's just science.

Science isn't something you have faith in, after all, it's a system for knowing the truth of the natural world.

Singularity fetishism actually strikes me as something approaching religion for Atheists.
posted by codacorolla at 8:44 AM on June 14, 2010


Are they accredited?
posted by grubi at 8:46 AM on June 14, 2010 [3 favorites]


My mom used to work for Ray Kurzweil at Kurzweil Music Systems, back in the 1980s. He once stole a joke from her, used it in a talk, and didn't give her credit. True story.

I want no part of Ray Kurzweil and his silly nerd rapture.
posted by bondcliff at 8:48 AM on June 14, 2010 [1 favorite]


He stole my brother's basketball too.
posted by dobie at 8:59 AM on June 14, 2010


There's no way you'll know you earned a degree. Or at least this is what the Registrar's Office will keep telling you each semester.
posted by Blazecock Pileon at 9:00 AM on June 14, 2010


Damn: I thought the Singularity meant I didn't have to go to University, just wait.
posted by Phanx at 9:01 AM on June 14, 2010


I want no part of Ray Kurzweil and his silly nerd rapture.

Me neither. Unless it really happens, in which case I'm totally there.
posted by DU at 9:02 AM on June 14, 2010 [2 favorites]


Although the concept of a technological singularity is believable insofar as a sufficiently advanced technology would be capable of automating the process of creating further technological advances, at which point we get a positive feedback cycle of progress, I personally think that we are much farther from that point than people such as Kurzweil believe. His bubbly enthusiasm is somewhat as if Archimedes had predicted that the javelin would soon be made obsolete by the advent of nuclear weapons.
posted by grizzled at 9:02 AM on June 14, 2010 [1 favorite]


The university's web page and their mission statement:
Singularity University (SU) is an interdisciplinary university whose mission is to assemble, educate and inspire a cadre of leaders who strive to understand and facilitate the development of exponentially advancing technologies in order to address humanity’s grand challenges. With the support of a broad range of leaders in academia, business and government, SU hopes to stimulate groundbreaking, disruptive thinking and solutions aimed at solving some of the planet’s most pressing challenges. SU is based at the NASA Ames campus in Silicon Valley.


There's also this gem:

One Percent Club

Those students who start new companies (coming out of SU or SU relations) are encouraged to donate 1% (non-dilutive) or 5% (of Founding Equity) of their company to SU. This will help build the University’s endowment.

The companies, their website and the SU Alumni Founders will be proudly listed on the SU website in the coveted 1% site.

We hope that companies in the 1% Club will be the major successes of the next decade.


It looks to me like they can't make up their mind whether they're a think tank, a fancy executive retreat thingy, a graduate program, or an advocacy group.

If they are serious, they are doing a bad job of showing what a prospective payer gets out of the program. As someone practicing in a field they seem interested in, I don't understand why I should consider any of their programs. Inspiration? Networking? I get plenty of that in my day job and work-related social interactions, and, as a huge bonus, I get paid to do it, rather than having to pay for the privilege. Maybe they do have something, but I am not able to ferret out what that "something" is from their messaging.
posted by forforf at 9:03 AM on June 14, 2010 [1 favorite]


"Science isn't something you have faith in, after all, it's a system for knowing learning the truth of the natural world"

Science is about learning, not knowing.
posted by Eideteker at 9:04 AM on June 14, 2010


Me neither. Unless it really happens, in which case I'm totally there.

Well, yeah. I mean, if it's a choice between my mom's honor and having my brain uploaded into a Mac mini then my mom totally loses.

But until then, that wannabe-immortal motherfucker stole a joke from my moms!
posted by bondcliff at 9:04 AM on June 14, 2010 [2 favorites]


I've been reading Accelerando for a week now and I still don't know what the singularity is.
posted by Eideteker at 9:05 AM on June 14, 2010


I think it'll be fun being a crotchety old man complaining about the sentient machine slavers, the kids with their consciousness uploads and the fucking von neumann machines all over my fucking synth-org lawn.
posted by I Foody at 9:12 AM on June 14, 2010


Solon and Thanks: "Finally a religion atheists can get behind."

"If God did not exist, it would be necessary to invent him" - Voltaire
posted by Bonzai at 9:15 AM on June 14, 2010 [2 favorites]


Wait, this is still a thing? I thought this went away in, like, 1997.
posted by Ratio at 9:17 AM on June 14, 2010


Well, I think their football team will really be on the cutting edge of performance enhancement. People will complain for a while, but once they start racking up BCS championships, all the other big schools will start to follow suit. I really think the Singularity is going to be like the second coming of the West-Coast offense.
posted by Shohn at 9:20 AM on June 14, 2010 [2 favorites]


I long for the day when the Wolframites declare holy war upon the Singularitarians.
posted by benzenedream at 9:21 AM on June 14, 2010 [2 favorites]


The idea is somewhat plausible, and as a Sci-fi devotee and atheist, it is pretty appealing.

Sadly the implementation is waaaaay too Scientology-esque.
posted by T.D. Strange at 9:32 AM on June 14, 2010


The Singularity has altered somewhat from its original meaning. At first, it was a mathematical metaphor — a division by zero, an inflection point, etc. Does it come back down? Does it start back at negative infinity? We don't know. The Singularity was envisioned as a fuzzily defined point after which you cannot predict. That is all. In some sense, we've always had one ahead of us, but for centuries the concept was so far away that we couldn't even see it, much less make vague noises about it.
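The "division by zero" metaphor can be made concrete with a toy curve: a quantity like 1/(t* − t) looks tame for most of its history, then blows up at a finite time t*, beyond which it is simply undefined. A minimal, purely illustrative Python sketch (the value of t_star and the sample times are arbitrary, not anyone's actual forecast):

```python
# Illustrative only: a hyperbolic curve with a finite-time blowup, the
# shape the original mathematical-singularity metaphor refers to.

def hyperbolic(t, t_star=10.0):
    """Grows without bound as t approaches t_star from below."""
    if t >= t_star:
        raise ValueError("undefined at or beyond the singularity")
    return 1.0 / (t_star - t)

# The curve looks tame early on, then explodes near t_star:
samples = [hyperbolic(t) for t in (0.0, 5.0, 9.0, 9.9, 9.99)]
print(samples)  # roughly [0.1, 0.2, 1.0, 10.0, 100.0]
```

Every point before t_star is finite and well behaved; nothing in that data tells you what, if anything, lies on the other side.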

By the time the whole "flying car" thing got started, we had this vision of a future which would be different, but it was a future that was just More Like What We Have Now. Perhaps a little bit faster and more convenient. The Jetsons — has anything in that family dynamic changed? No. The wife is at home. The man goes to work, then returns and complains that he had to press The Button seven times today. The teen is demanding. The child is excitable and bratty. The dog continues to be a dog.

The vision of the Singularity is that, after it, things will be different. The marriage will be between a woman, a man who used to be a woman, and an AI. They have an opt-in renewable marriage contract for the next five months. The AI is retired. The man stays home and is raising his children as part of his fourth dissertation in family dynamics. The wife is in coldsleep, having a cure for a rare genetic disorder built to order, and the other two miss her. The teenager was actually the sister of one of the marriage partners, but she has opted for renewal and has been age regressed. She's having a hard time balancing the hormone implants. The child is smarter than the rest of the family put together, thanks to some fantastic genetic selections and a biocompatible implant into which his brain has grown. The dog is semi-autonomous, just a state agent checking in from time to time to make sure the family is functioning well.

Only my little jokey idea of the Jetsons is totally wrong because I'm still trying to extrapolate from my limited experience. I'm like a single cell wondering what it would be like if we somehow made the leap to being multicellular organisms. Could be good. Could be bad. There's no way that I would predict that I'd be a cell on the toe, churning out skin for a callus, or sitting in the eye, waiting for a photon to go by. I'd be envisioning swimming around in my unicellular paradigm, only I'm totally kicking ass doing the same old things, like eating other unicellular organisms and excreting waste products.

When (more like if) the Singularity hits, it'll be weird and most likely terrifying from our standard human mindset, but above all, it will not be predictable, because it is all of the slope of the curve of future shock, compressed into a point.

That it has morphed into the Self-Accelerating Decomposition Temperature of beneficial technological progress, wherein Cyberjesus will arrive and, with his flaming GNU-license, destroy all non-open-source software and bush-robot/angels will shuck our consciousness out of our skulls and into a glorious technical utopia which is, above all things, pleasant, is a consequence of people seizing upon basic optimism in technology and wedding it to the original idea.

If you want a nice counter-example in science fiction, The Borg fit just about every criterion you might want for a Singularity event, but the "reality" of life afterwards is hardly appealing.
posted by adipocere at 9:33 AM on June 14, 2010 [16 favorites]


grizzled: Although the concept of a technological singularity is believable insofar as a sufficiently advanced technology would be capable of automating the process of creating further technological advances, at which point we get a positive feedback cycle of progress, I personally think that we are much farther from that point than people such as Kurzweil believe.

Singularity skepticism is grounded in the notion that Kurzweil and Vinge fundamentally misunderstand the nature of technology. Technology is a socio-economic phenomenon. The reason why we're stuck in the United States with an 80-year-old transportation system isn't because smart people have failed to design better, it's because the costs of scrapping that system for better are high, and there are still huge profits to be made maintaining the old system. Truly disruptive technologies are strangled stillborn in their cribs.

I've not yet seen a convincing argument that such automated design systems will become uncoupled from the socio-economic ties that support and limit them.
posted by KirkJobSluder at 9:42 AM on June 14, 2010 [8 favorites]


The graphic in last year's Wired story on this sums it up nicely.
posted by Dr. Twist at 9:52 AM on June 14, 2010 [6 favorites]


The only thing I really have to contribute to the discussion on the Singularity is this: supposedly it is a point beyond which the world as we know it will take a shape that we cannot predict at this time. Most of the speculation I see doesn't fit that description at all, and in fact is simple prediction. The only thing I have ever been able to think of that might do something like this is the ability to quickly, simply and easily change our brains, to the point where we can edit emotions, thoughts, predilections, memories, etc. I am unable to truly grasp what it would be like to live in a world where I could effectively choose to stop being me and become someone else. That is all.
posted by adamdschneider at 9:54 AM on June 14, 2010


Man, this might be Activision's craziest viral marketing campaign yet.
posted by kmz at 9:57 AM on June 14, 2010 [1 favorite]


I would love to know why this should be any more credible than any other fundamentalist, evangelical institution in North America. Not to put too fine a point on it, but both institutions are rooted in a belief that we will be saved, destroyed or both by some agent that shows no evidence of actually existing. And (whether you agree with them or not) at least religious institutions make some claim towards moral guidance, and Singularitariariariariarians (which, note, you should pronounce as though you were yodelling) don't even have that. So what's the point, aside from styling yourself to be some sort of nerd prophet?

Pro tip: every time you read "singularity", replace it with "god", "Yahweh" or "Flying Spaghetti Monster"; it's instructive that the only thing that separates Singularity advocates from any other credulous kooks is a single, roughly-drawn graph.
posted by mhoye at 10:02 AM on June 14, 2010


Pro tip: I can use "science" to make "televisions." I can use "religion" to make ... what, exactly? There's a reason our science fiction future doesn't have Dick Tracy watches shaped like Ouija boards.

The appeal of the Singularity is that it is at least slightly related to something with reproducible results. In contrast, our advances in prayer technology haven't been all that astounding.
posted by adipocere at 10:15 AM on June 14, 2010 [1 favorite]


My favorite commentary on the singularity.
posted by longdaysjourney at 10:19 AM on June 14, 2010 [9 favorites]


Pro tip: I can use "science" to make "televisions." I can use "religion" to make ... what, exactly?

Singularity University!
posted by mhoye at 10:32 AM on June 14, 2010 [3 favorites]


Imagine everyone's shock when, having achieved average human intelligence, efforts to surpass it result in a series of computers that can only be installed in parents' basements, are obsessed with the works of Joss Whedon and Gene Roddenberry, and cannot talk to women.
posted by condour75 at 10:37 AM on June 14, 2010 [2 favorites]


"I can use "religion" to make ... what, exactly?"

To make people make televisions. Or war.

Religion is still a better method of controlling people and massing their output/abilities than anything science has invented yet. Religion working WITH technology, a la FoxNews? Potentially devastating.
posted by Eideteker at 10:43 AM on June 14, 2010 [1 favorite]


I, for one, would happily welcome the arrival of The Jocularity.
posted by chavenet at 11:19 AM on June 14, 2010


Any sports teams? If not, what's the point of founding a university?
posted by grubi at 11:21 AM on June 14, 2010


Any sports teams? If not, what's the point of founding a university?

The fighting ...
* self-replicating units!
* uploaded minds!
* AI overlords!
* transhumans!
posted by KirkJobSluder at 11:25 AM on June 14, 2010


sw/oc
posted by Eideteker at 11:41 AM on June 14, 2010


Pro tip: I can use "science" to make "televisions."

Here's another pro tip for ya: "Don't confuse science with industry."
posted by aught at 11:52 AM on June 14, 2010 [2 favorites]


My mom used to work for Ray Kurzweil at Kurzweil Music Systems, back in the 1980s. He once stole a joke from her, used it in a talk, and didn't give her credit. True story.

Ray Kurzweil stole my heart.
posted by mecran01 at 11:56 AM on June 14, 2010


The whole singularity thing comes across to me like one of those retarded rave utopia manifestos that used to get posted to alt.raves in the 90s.

See the Hedonistic Imperative.
posted by empath at 11:59 AM on June 14, 2010


Ray Kurzweil stole my heart.

He ate my heart, then he ate my brain.

That boy is a monster.
posted by empath at 11:59 AM on June 14, 2010 [1 favorite]


Ray Kurzweil stole my heart.

The real secret to his immortality.
posted by bondcliff at 12:02 PM on June 14, 2010


...boasts that he intends to live for hundreds of years and resurrect the dead, including his own father
And sell him a t-shirt.
posted by Smedleyman at 12:22 PM on June 14, 2010 [1 favorite]


Battle Angel Alita has been stuck in this giant karate-tournament story for the past couple of years. The great thing about it is he's throwing together every transhumanist, post-singularity concept of the future of humanity into the same arena and having them fight elaborate battles.

My favorite part is when the nanomachine grey goo on Venus sends its representative - a giant parody of biological life with a massive wang cannon.
posted by heathkit at 12:24 PM on June 14, 2010


Metafilter: a giant parody of biological life with a massive wang cannon
posted by KirkJobSluder at 12:29 PM on June 14, 2010


Any sports teams? If not, what's the point of founding a university?

The fighting ...
* self-replicating units!
* uploaded minds!
* AI overlords!
* transhumans!


Sounds like collegiate athletics to me!
posted by grubi at 12:31 PM on June 14, 2010


Considering the fact that Mr Kurzweil has a German name, a musical background, an obsession with machine intelligence and grandiose metaphysical ideas, I can only conclude that he is a character in a Philip K. Dick novel. I bet he's building a life-size robotic replica of Abraham Lincoln to serve as the president of Singularity University as we speak.
posted by Dr Dracator at 12:47 PM on June 14, 2010


I quite like that Transhuman Evolution shirt. But my eyes haven't evolved to tolerate that particular shade of yellow.
posted by Foosnark at 12:56 PM on June 14, 2010


I can use "science" to make "televisions." I can use "religion" to make ... what, exactly?

Mystique? To use your example: the idea of showing someone visions of all the far-flung kingdoms of the world used to be the kind of thing reserved for angels and devils, not for anyone with a cheap gizmo in their pocket and an account on YouTube.

We'll know if/when The Singularity moves from religion to because people will stop waxing poetic about eternal transcendent consciousness and start bitching about how the thousands of games they played yesterday were too repetitive.
posted by roystgnr at 1:27 PM on June 14, 2010


"...moves from religion to reality because..." is what that was supposed to say, before a compile finished sooner than expected and my editing process got interrupted for an hour.

Hmm... "work" now often refers to a process of typing fun things, and I'm bitching about the fact that my computer works with me too quickly to leave me enough time for typing other fun things. Not quite Singularity material, but a very small step in the right direction.
posted by roystgnr at 1:32 PM on June 14, 2010 [1 favorite]


I don't believe in any singularity within the same timeframe as Kurzweil & co. because:

(a) I don't believe that building strong AI will prove nearly as easy as they imagine, and

(b) I expect we'll instead see massive advances carried out by parallelizing humans using neural implants.

In other words, the first entities that deeply "transcend" the human condition will be a large collective of small children whose brains are networked together, but they'll grow up ultimately mostly human, just unimaginably intelligent as a collective.

In particular, I'll wager we'll develop these super intelligent collectives well before non-cancerous solutions to telomere shortening. Instead, we'll more likely see neural stem cell injections that usefully increase neural plasticity, say making language learning easier, thus letting people enjoy more of their not ridiculously long lives.
posted by jeffburdges at 4:24 PM on June 14, 2010


That said, why not conjecture what uploading your brain might look like? Initially, we'll most likely just train an AI based upon creative outputs, which doesn't exactly pass most people's criteria for immortality, especially given the AI won't demonstrate many human faculties.

We might later develop methods for reading significantly more data from the brain, but we're most likely still restricted to the data your brain considers, which all looks like "Get Off My Lawn" if you undergo the process once you're old.

To me, our wisest but weakest route to immortality would be authoring tools that help old people record their lives and perceptions, presumably by increasing neural plasticity and training them for basically producing literature.
posted by jeffburdges at 4:24 PM on June 14, 2010


I read Kurzweil's Fantastic Voyage just to see if there might be some goodness needle in its haystack. The most amazing thing in there is the man takes 400 nutritional supplement pills daily. That 400-day-supply Centrum bottle that weighs a pound? He swallows the equivalent of that every single day.
posted by bukvich at 4:53 PM on June 14, 2010


jeffburdges:
In particular, I'll wager we'll develop these super intelligent collectives well before non-cancerous solutions to telomere shortening. Instead, we'll more likely see neural stem cell injections that usefully increase neural plasticity, say making language learning easier, thus letting people enjoy more of their not ridiculously long lives.
Reminds me of Blindsight, where they had solved the cancer problem but not the wisdom problem. Which is likely to be the same problem for our super-intelligent singularized post-humans.

I propose the anti-singularity: the Universality.
posted by psyche7 at 5:00 PM on June 14, 2010


And to indulge in an esprit d'escalier, has anyone since the Rabbi Jesus fully realized their humanity? Isn't it a little ironic to be talking about post-human when we are really pre-human?
posted by psyche7 at 5:11 PM on June 14, 2010 [1 favorite]


Ray Kurzweil is not crazy, and he's not a cultist. He's an out there genius who had a lot to do with inventing reading systems for the blind and starting up one of the most exciting music technology companies of its time. He's moved on since then, but he sure doesn't deserve the thrashing he's getting here on the blue. I have yet to read a single entry here that understands the Singularity scenario that Kurzweil proposes.

In a nutshell, Kurzweil is saying that the intersection of technologies (AI/Cognitive Science, Nanotech, Robotics, Genomics/Proteomics) is happening at exponential rates, and that soon we will be merging with technology in ways that will not be predictable, because when the Singularity evolves, technology will be informing itself. (Kurzweil gives approximate time frames, but says it's only his best guess - around 2050-2060.) Even Stephen Hawking says we will have to merge with machines, to keep up with them.

Sure, some of this can get carried away, but Kurzweil is not a crackpot; I'm betting that paying heed to what he says will prepare us all for what's coming.
posted by Vibrissae at 5:12 PM on June 14, 2010


It's helpful to remember here that Einstein was wrong, with prejudice, when it came to Lemaître's work. Which just goes to show that very smart people who are not crackpots can come up with bad ideas.
posted by KirkJobSluder at 5:28 PM on June 14, 2010


At last, the Singularity has successfully completed its transition from mere crackpot idea to outright scam.
posted by moss at 7:04 PM on June 14, 2010 [1 favorite]


Vibrissae, I think everyone understands the concept of Kurzweil's singularity. It's not exactly a difficult concept. It's just not a plausible concept. The belief in singularity is essentially the Rapture for geeks.
posted by sonic meat machine at 8:36 PM on June 14, 2010


The VICE interview.
posted by Jer1h at 8:52 PM on June 14, 2010



I think everyone understands the concept of Kurzweil's singularity. It's not exactly a difficult concept. It's just not a plausible concept. The belief in singularity is essentially the Rapture for geeks.


What are the assumptions behind your statement? You say "not plausible". What I want to know is how anyone here (including myself) is competent to make a judgment - that they can substantially back up - about why technology will not begin to inform itself at some point.

For instance, what happens when we start to get serious about proteomic control and merge that with nanotech and robotics? I don't know what scenarios this will present, but I'm pretty sure it will happen. What do you imagine will happen as these (and other) technologies converge? Do you really believe that the intersection of technologies will continue to produce only linear change? Predictable change? I'm willing to make a long bet that you're wrong.

"Rapture for geeks"?

Why the put down, by association? Just because some geeks have taken what Kurzweil posits and turned it into something else doesn't mean that there is no validity to what Kurzweil is proposing. Look, I'm not religious at all, but if one reads really well done hermeneutical studies of the Parables of Jesus Christ, one finds that they are very thought provoking, and can even prove valuable to some.

What I'm trying to say is don't confuse the messenger with those who misinterpret the message. I see lots of misinterpretation here - as well as writing off the core of an idea that comes from a mind that few, if any, here, can hold a candle to. And, that's not hagiography, it's a simple fact.
posted by Vibrissae at 11:16 PM on June 14, 2010




L. Ray Kurzweil, founder of Singularitology.

If the 400 pill a day thing is true, I'd say schizophrenia has probably claimed another one.

Too bad!
posted by jamjam at 12:44 AM on June 15, 2010


jamjam I looked it up. I exaggerated only a little.

Fantastic Voyage: Live Long Enough to Live Forever, Ray Kurzweil and Terry Grossman, M.D., 2004, Rodale. p. 141.

"I take about 250 pills of nutritionals a day. Once a week I go to WholeHealth New England, a complementary medicine clinic run by Dr. Glenn Rothfield, where I spend the day. At this clinic I take a half-dozen intravenous therapies--basically nutritionals delivered directly into my bloodstream, thereby bypassing my GI tract."

If you are ill, sick, or diseased, modern medicine is miraculous. My own opinion is that medicine has very little to offer healthy people functioning at 70-100% of their potential and who want to purchase a boost. Kurzweil is a very smart guy but I do not have a clue why he is so fearful of dying. Deep inside his mind I suppose he knows it's quackery.
posted by bukvich at 5:06 AM on June 15, 2010


Vibrissae: You say "not plausible". What I want to know is how anyone here (including myself) is competent to make a judgment - that they can substantially back up - about why technology will not begin to inform itself at some point.

Well, on the other side you have the work of Everett Rogers and the thousands after him, which has generally found that adoption of technology is highly dependent on cultural norms. Throw in Thomas Landauer's research showing that revolutionary advances in computing technology (which did happen in the 70s and 80s) did not revolutionize industrial productivity except in those cases where industries laid off a large chunk of their workforce due to automation.

"Technology begin to inform itself?" Well what does that even mean? There's a reasonably good argument to be made that co-evolution of the human grip and brain coincided with developing tool use. David Kahn's Codebreakers documents how signal intelligence and theory leapfrogged on each other from WWI to WWII.

For instance, what happens when we start to get serious about proteomic control and merge that with nanotech and robotics?

Well, it doesn't help much that Drexler in his argument for grey goo made a pretty boneheaded mistake that betrayed a freshman biochemistry understanding of photosynthesis. Many of the claims made about nanotechnology don't make a lick of sense when you start thinking about them at the level of the chemistry involved.

And on top of that, the whole thing is based on a claim that observed exponential factors will continue to behave exponentially for the foreseeable future. That's a ton of hope, which is unwarranted in most cases. Why, for example, is Australia not covered neck-deep in rabbits right now?
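The rabbit example is the textbook logistic story: growth that looks exponential at first saturates once it hits a resource ceiling. A hedged Python sketch (the growth rate and carrying capacity here are invented numbers, chosen only to show the shape):

```python
# Toy comparison of unchecked exponential growth vs. logistic growth with
# a carrying capacity K. All parameters are made up for illustration; the
# point is only that the curves are indistinguishable early on and wildly
# different later.

def exponential(n0, r, steps):
    pop = [float(n0)]
    for _ in range(steps):
        pop.append(pop[-1] * (1 + r))
    return pop

def logistic(n0, r, K, steps):
    pop = [float(n0)]
    for _ in range(steps):
        n = pop[-1]
        pop.append(n + r * n * (1 - n / K))  # growth stalls as n -> K
    return pop

exp_pop = exponential(10, 0.5, 40)
log_pop = logistic(10, 0.5, 10_000, 40)

print(exp_pop[5], log_pop[5])    # still within a fraction of a percent
print(exp_pop[40], log_pop[40])  # off to infinity vs. flattened near K
```

The two trajectories are nearly identical for the first several steps, which is exactly why extrapolating an observed exponential is risky.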

What I'm trying to say is don't confuse the messenger with those who misinterpret the message. I see lots of misinterpretation here - as well as writing off the core of an idea that comes from a mind that few, if any, here, can hold a candle to. And, that's not hagiography, it's a simple fact.

How is this not hagiography? Kurzweil (and other singularity advocates) make claims regarding the relationship between technology and culture that are not strongly supported by the mounds of scientific research from people who actually study the relationship between technology and culture. Drexler made claims about the capabilities of machines in competition with biological organisms that are not strongly supported by the mounds of scientific research on how organisms actually utilize energy and resources at the nanometer level.

It doesn't mean they are not brilliant; Einstein, Hoyle, and Dawkins count as very smart people who had some very bad theories. But to say that we can't, you know, pull up Google Scholar and point to cases where well-designed technology failed because of cultural, economic, or social factors because Kurzweil is a fucking genius is pseudo-scientific hagiography.
posted by KirkJobSluder at 5:45 AM on June 15, 2010 [1 favorite]


Vibrissae: if you want a non-blowoff answer, you can make a good case for skepticism a couple of different ways. One is to point out that sigmoidal and exponential curves look almost identical right up to the point where they don't.

For many that consideration is enough to justify heavy skepticism. Another source of skepticism is belief about the growth order of the search space for new technologies; does it grow more like 2^n, n!, 2^2^n, 2^(n!), or (2^n)!? I can sketch out not-implausible justifications for each of those growth orders, but assuming just exponential growth in "brainpower" only the first option seems likely to produce runaway advancement without further assumptions (eg: without assuming that the current exponential growth in processing power will go super-exponential past some point).
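The gap between those candidate growth orders is easy to understate: even at tiny n they differ by dozens of orders of magnitude. A quick illustrative computation (n = 5 is an arbitrary sample point; Python's arbitrary-precision integers make the comparison exact):

```python
from math import factorial

# Rough order of magnitude of each candidate search-space growth order
# at n = 5. The choice of n is arbitrary; the point is how violently
# the candidates diverge from one another.
n = 5
orders = {
    "2^n":     2 ** n,
    "n!":      factorial(n),
    "2^(2^n)": 2 ** (2 ** n),
    "2^(n!)":  2 ** factorial(n),
    "(2^n)!":  factorial(2 ** n),
}
for name, value in orders.items():
    print(f"{name:8} ~ 10^{len(str(value)) - 1}")
```

At n = 5, 2^n is still 32 while 2^(n!) already has 37 digits; against merely exponential growth in "brainpower", only the 2^n-like search space looks tractable.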

A third source of skepticism is the long history of disappointment in math and science. The speed of light isn't just a good idea, it's the law; the uncertainty principle is a direct consequence of the underlying math, not our clumsiness; the halting problem leaves a great deal of interesting questions in theoretical computer science unanswerable; Gödelian incompleteness severely constrains what ambitions are realistic for a formal system; etc.; optimism is a wonderful thing, but it's not unreasonable to expect high hopes to be disappointed due to as-yet undiscovered theoretical principles.

An additional source of skepticism is the general lack of experts in non-computer-related fields amongst the singularitarians. This perhaps isn't robust enough for some people, but perhaps skepticism of a hypothesis summarizable as "if computers continue to work the way I know they do, and other fields work like I think they do, then certain outcomes will come about" is warranted when experts in the non-computer fields aren't arriving at the same conclusion. It's possible those experts don't really get what will happen with further advances in computation, but it's also possible -- and arguably much more likely -- that their domain expertise and firsthand experience with existing uses of computational methods in their fields leaves them more skeptical that the kinds of intersecting feedback loops of progress will kick in anytime soon.
posted by hoople at 6:07 AM on June 15, 2010 [5 favorites]


hoople: It's not just engineers and computer scientists (many of whom are actually quite willing to consider the socio-economic end of their work) who fall victim to this. Darwin dabbled in human psychology, B. F. Skinner in political philosophy, and Dawkins quite famously in semiotics.

People who claim that technology drives economics and culture need to explain why the steam engine languished as a novelty for 1600 years, and not for lack of people tinkering with it.
posted by KirkJobSluder at 6:27 AM on June 15, 2010 [2 favorites]


I do not have a clue why he is so fearful of dying

Not a clue? For people who a.) enjoy their lives and b.) don't believe in an afterlife, dying can be a pretty terrifying prospect.
posted by adamdschneider at 9:07 AM on June 15, 2010


For people who a.) enjoy their lives and b.) don't believe in an afterlife, dying can be a pretty terrifying prospect.

Can be, not necessarily though.
posted by KirkJobSluder at 9:08 AM on June 15, 2010


What I'm getting at is that, aside from my view of the Singularity as being both improbable and a sort of substitute for the hope that, somehow, we will all live on in some fashion, you can drop in "religion" for "science" up until you remember that technology has produced Things You Can Pick Up and Use to Make New, Different Things Happen. It is both tangible and empowering. That's the allure of The Rapture for Nerds variant — it fulfills all kinds of desires while simultaneously at least attempting to touch reality. I suppose you can use rosary beads to strangle people, but that isn't exactly what I had in mind.

That photocopier down the hall looks like "industry" now, but it was once the photoelectric effect, subject of one of Einstein's papers in 1905 and earning him the Nobel some decade and a half later. From cutting edge science to product that people don't even consider as being science in just a handful of decades.

Thus, the Rapture metaphor falls apart when you look at it closely. It's good for the wish fulfillment half, but portions of it might be allowable by the laws of physics. It's our very own Tower of Babel. Science can be used to treat and even cure disease, but with religion, for all of the anecdotal data, you're stuck wondering why God hates amputees. Right now, I'm tracking three different technologies for hopefully making new teeth — not implants — teeth made out of you. That's new for humanity and different for us collectively. It isn't some afterlife pushed off conveniently behind a veil, unreportable, held on faith. Here, you can go and look at pictures of little teeth growing in rodents. Kids are wandering around with ears grown to replace missing ears.

You can go and see this right now, which seems a lot more concrete than the old shuck and jive about the end times being near. That's why the Singularity, as a concept, is different and, if we're being honest, kind of scary; if humanity can get its collective act together and, oh, not kill ourselves or starve for energy, water, and resources (I'm dubious), the future could get weird.

If you could properly take some simple eukaryote and explain to it what life would be like as one of the cells sitting in my toe (its Singularity, a billion years on), it might just wave its little flagellum and zip away.

That's what makes the Singularity University interesting. It's not just the Moody Bible Institute with some underlighting, more iPhones, and "singularity" replacing "eschaton." Aside from the "hey yah give us lots of money for a vaguely defined set of concrete goals," something arising out of it could actually happen. On a personal level, I'd like to see more prize-oriented research funding, instead, like, say, ten billion dollars for the ability to store X megajoules in a liter of space, with a mass under Y and a recharge time under Z, but this isn't the worst start to it.

Lest anyone accuse me of optimism (I would offer pistols at dawn, but I'd probably get shot), I don't think it's likely, though not because of physical limits.

I believe AI could exist, in that I can envision intelligence (if that's what we have) running on hardware which is not identical to the human brain. I don't think we're necessarily smart enough to figure it out.

I believe practical mass-energy conversion, starting with sustainable better than break-even fusion, can happen. I have a good example of the latter overhead at the moment. We might not be able to pull it off, or it might not be available from an engineering standpoint without your confinement looking like a gravity well.

I believe it is possible to make sentient biological organisms capable of living hundreds of years while running at least at our current clockspeed. Nature gives us turtles with double the human lifespan; evolution cares nothing for how long something lives, only for the number of viable offspring it produces. I'm not sure we can give up our hangups enough to make it happen.

We might not let ourselves live long enough to get all of this together. If it doesn't come to pass, it will be a human failing, though perhaps a kind one.
posted by adipocere at 10:17 AM on June 15, 2010 [1 favorite]


"Kurzweil is a very smart guy but I do not have a clue why he is so fearful of dying."

Based on what I read in the NYT article yesterday, he lost his father at an early age, so I'm guessing that's a big part of it. Losing my dad has made me embrace death, and therefore spend my time living life to its fullest. Looks like Kurzweil has instead rejected death and decided to live in perpetual fear of it (he may tell himself he's not afraid, and that he'll conquer it, but seeing it as something to conquer and spending most of your time fighting it is the same thing as "afraid" to me).

"Why Smart People Believe Weird Things"

And what's great about Singularity Camp is that the barriers to entry assure that they'll get few if any dissenting voices. Hooray for groupthink and echo chambers! Though the article makes it sound like it's mainly entrepreneurs looking for networking and their next big idea. I suppose Kurzy doesn't care, as long as the checks clear so he can keep paying for his pill habit.

"I propose the anti-singularity: the Universality."
This is another thing that bugs me, while I'm talking about groupthink. Once we're all uploaded, what stops us from going Borg as we all become one giant superorganism? Dissent is not merely discouraged; it becomes impossible.

"And on top of that, the whole thing is based on a claim that observed exponential factors will continue to behave exponentially for the foreseeable future."

See, that's the thing. The term "singularity" was applied because there's supposedly an "event horizon" that we can't see past. But, you know, even though we can't see past it, we know that exponential growth will continue!
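That worry can be made concrete with a toy model (all parameter values are my own arbitrary illustrations): a logistic curve, i.e. growth that eventually saturates, is nearly indistinguishable from a pure exponential while you're still on the early part of the curve.

```python
import math

# Toy comparison: a logistic curve (growth that saturates at carrying
# capacity K) versus a pure exponential matched to its early behavior.
K, r, t0 = 1000.0, 0.5, 20.0  # arbitrary illustrative parameters

def logistic(t):
    return K / (1 + math.exp(-r * (t - t0)))

def exponential(t):
    # same starting value and early growth rate as the logistic curve
    return logistic(0) * math.exp(r * t)

# Early on the two curves agree to a fraction of a percent; past the
# inflection point they diverge by more than an order of magnitude.
for t in (2, 5, 10, 18, 25):
    print(t, round(logistic(t), 2), round(exponential(t), 2))
```

An observer sitting anywhere on the early stretch sees textbook exponential growth and has no data to distinguish the two futures, which is the sense in which "the exponential trend will continue" is an assumption rather than an observation.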

I hold a lot of singularity claims suspect because they just seem to be based on fundamentally flawed assumptions. First of all, we don't even really know what intelligence is. We think we're intelligent, but I think that just means that our brains cannot adequately explore how our brains work. A box cannot contain itself.

We have a lot to learn about how we're programmed before we can program artificial intelligences where we can upload our minds (especially if we want anything more than living death, where instead of moving forward, we revert back to our most base survival instincts as we worry about maintaining the integrity of our program). We're still very much apes, and if you want to be free to create eternally, you need some of the animal instincts and needs that inform our human creativity. Otherwise, what's the point?

Singularitarians remind me very much of the kid down the block who used every cheat code, Nintendo Power trick, and Game Genie code to beat his new games as quickly as he got them. Once we "solve" life, what's left? The rest of us kids used to get Mr. Speedrun's used games at a heavy discount because there was nothing left for him to do once he'd got to the endpoint. The point of the journey is not to arrive, and the point of life is not simply to keep on living (though that is our base instinct; an instinct which seems to have Mr. Kurzweil firmly in its grip).
posted by Eideteker at 12:47 PM on June 15, 2010 [1 favorite]


An additional source of skepticism is the general lack of experts in non-computer-related fields amongst the singularitarians

You mean, like here, on the blue? Again, I see the critics here throwing the baby out with the bathwater. I'm no blind cult follower, but I know some very serious people who are open about the possibility of technology informing itself. What does that mean? It means that the hypothetical strong convergence of AI/Robotics/Nanotechnology/Proteomics/etc. might very well lead to a scenario where machine intelligence, combined with our own, is moving so fast on the evolutionary scale that it becomes impossible to predict forward scenarios. It also might mean changes in the current formula we know of as "mortality". Maybe it won't, but who is anyone to definitively write this idea off before we have a chance to see what happens 30-40 years out (as a *beginning*, not the end)? Kurzweil does not say that we are going to be in Nirvana, nor does he weigh certain cultural forces (that, btw, are themselves impacted by technology). Is Hawking a kook? I don't think so. Maybe he's wrong about us having to merge with technology to keep up. Let's wait and see.

So, rather than dismissing Kurzweil's hypothesis as quackery, why not treat it in the same spirit as some of the weightier criticisms here - i.e. as a hypothesis that could be wrong, but may have some germs of truth, or may even be correct? Why the closed minds?

I don't know if Kurzweil's hypothesis is correct, and neither do you. We're going to have to wait to find out. Just because certain well known individuals applied quantum theory (improperly) to large scale events, doesn't mean that the discoverers of quantum theory were kooks. I see lots of dissing of Kurzweil here, in a way that makes the dissers look pretty small. Let's wait and see what happens, instead of making judgments that are little more than ego feeders.

Last, about taking so many vitamins, and other health treatments. So what? How many people here engage in rituals that they believe will keep them alive and healthier, longer - that ultimately won't work, or may even be counterproductive? Let the guy engage his obsessions; we all have them. And let's quit the personal attacks on an intelligence that, again, very few, if any, on the blue can match. It's OK to attack an idea, but leave the person alone. Kurzweil does not encourage "believers", but his ideas are fertile for cultists to borrow. Kurzweil can't keep that from happening, nor could Jesus or Buddha. (And no, I'm not saying that Kurzweil is god.)
posted by Vibrissae at 6:32 PM on June 15, 2010


The problem I have with Singularitarians is not that the Singularity idea is particularly implausible -- the rough outline of the story they tell makes a kind of sense, even if the timelines are a bit optimistic. Rather, the problem is that the technologies involved are so far removed from the present that there's almost nothing you can seriously discuss today. There's enough content to write good speculative fiction, but not nearly enough to make for a serious academic study, let alone a University. For all the attention they've given to it, Kurzweil et al. know no more about artificial intelligence and mind uploading than H. G. Wells and Jules Verne knew about moon travel.

A: "One day men will walk on the moon -- just imagine what it will be like!"
B: "Well, what will it be like?"
A: "Who knows? It will be an inflection point in our understanding of the world. And necessarily, like all exploration, it will reveal things of which we presently know nothing. How exciting!"
B: "I suppose it would be exciting. But is it even possible? How will we get there?"
A: "It's certainly possible -- the amount of power required to overcome the Earth's gravitation has been calculated, and with suitable choice of fuel such power could be attained."
B: "Very interesting! Tell me more about these fascinating ideas."
A: "Er, well... that's about all I know at the moment. But look, here's a pamphlet from the Selenarity Institute about the newest discoveries in cannon boring..."
posted by logopetria at 11:41 PM on June 15, 2010 [1 favorite]


For all the attention they've given to it, Kurzweil et al. know no more about artificial intelligence and mind uploading than H. G. Wells and Jules Verne knew about moon travel.


More nonsense. The only cult of belief I'm perceiving on this thread is the Cult of Anti-Singularitarianism. Not one person on this thread had the breadth of scientific knowledge that Kurzweil brings to the plate. Yeah, maybe he's wrong, but if he is, it will probably be due to factors that are so far beyond the bad (and blatantly hopeful) guesses posited here, it won't even be funny.
posted by Vibrissae at 11:50 PM on June 15, 2010


Your arguments aren't addressing the core of the issue, which is that a lot of this stuff is highly speculative, yet the proponents are presenting it as if it was hard science, e.g. by founding this Singularity University.

I'm sure there are fancy Latin names for your positions, but to me they boil down to (1) Kurzweil has good academic credentials, therefore you should believe him, and (2) the singularity conjecture is essentially unanswerable until said singularity happens, hence we're free to speculate about it. Oh, you also think logopetria's argument is nonsense, but don't explain why.
posted by Dr Dracator at 12:34 AM on June 16, 2010 [1 favorite]


Vibrissae: Last, about taking so many vitamins, and other health treatments. So what? How many people here engage in rituals that they believe will keep them alive and healthier, longer - that ultimately won't work, or may even be counterproductive? Let the guy engage his obsessions; we all have them.

This is true. I have weird rituals and they may bias my science practice. The "so what?" is that his weird rituals directly inform us of the man's bias. They are a relevant part of evaluating the whole.
posted by bukvich at 4:33 AM on June 16, 2010


It's not about Kurzweil, his personality, or his brilliance, for two reasons. First, he's not the only person advocating the singularity hypothesis, and second, we shouldn't judge hypotheses according to the reputations of the people advocating them. Many formal review processes are blind for this reason.
posted by KirkJobSluder at 6:53 AM on June 16, 2010


OK KJS well put. I shall quit harping on his vitamins.
posted by bukvich at 7:53 AM on June 16, 2010


"I have weird rituals and they may bias my science practice. The So What? is his weird rituals directly inform us of the man's bias. They are a relevant part of evaluating the whole."

Including the conclusions you and others here are making about his project, based on your preconceived biases. Right? I just love the way the preconceived and misinformed assumptions about Kurzweil keep rolling in, to support personal bias here. And, I could care less whether Kurzweil is right or wrong; we'll let time decide that. I rest my case.

Your arguments aren't addressing the core of the issue, which is that a lot of this stuff is highly speculative, yet the proponents are presenting it as if it was hard science, e.g. by founding this Singularity University.

Science is subject to propositional/hypothetical failure; it's built into the system, just in case you haven't noticed. (Read Popper.) That said, why shouldn't *anyone* be willing to present data or information as hypothesis, and let time and/or further experience/experimentation bear that out, or not? Then, make conclusions on the verifiable nature of the hypothesis. The blue is not the place where that process happens, thank you - although some here think their judgment is sacrosanct, and without the bias they accuse Kurzweil of (and don't come back on that, because my mind is completely open to verification, failure, or some verification of what Kurzweil posits - i.e. I'm not emotionally invested in K's success or failure). i.e. Can you predict the future? (And I'm not asking you to prove a negative here - Kurzweil is positing that these things might happen - will they? Let's wait and see. Isn't that the scientific "way"?) Seems to me that the critics on this thread are trying to equate Kurzweil's hypothesis with things like Scientology, which simply points out the absurd, ridiculous, and preposterous biases of many of the critics on this thread. It's all prejudgment and piling on, before the fact; before the results; before the time range that K suggests some of these things might happen. You can't have it both ways, folks; you can't claim science, and then use petty non-scientific opinion and bias to prove your point. e.g. Kurzweil = L. Ron Hubbard? Ha-ha! Please, give me a friggin' break!
posted by Vibrissae at 6:01 PM on June 16, 2010


Science is subject to propositional/hypothetical failure; it's built into the system, just in case you haven't noticed. (Read Popper)

I have to admit I haven't read much Popper --- has he written anything about offering a $25k 10-week course for entrepreneurs on the may-or-may-not-happen, just-wait-and-see impending techno-apocalypse? Because it's the kind of thing that places someone rather low on my list of people to take seriously.
posted by Dr Dracator at 10:31 PM on June 16, 2010


I have to admit I haven't read much Popper --- has he written anything about offering a $25k 10-week course for entrepreneurs on the may-or-may-not-happen, just-wait-and-see impending techno-apocalypse? Because it's the kind of thing that places someone rather low on my list of people to take seriously.

Dracator, looks like you're coming loaded with nothing but snark. Next.
posted by Vibrissae at 10:39 PM on June 16, 2010


Sorry about that, I'll just go do my homework then.
posted by Dr Dracator at 10:44 PM on June 16, 2010


And, I could care less whether Kurzweil is right or wrong; we'll let time decide that. I rest my case.
...
Not one person on this thread had the breadth of scientific knowledge that Kurzweil brings to the plate.
...
The blue is not the place where that process happens, thank you - although some here think their judgment is sacrosanct, and without the bias they accuse Kurzweil of (and don't come back on that, because my mind is completely open to verification, failure, or some verification of what Kurzweil posits - i.e. I'm not emotionally invested in K's success or failure).


Hi Vibrissae:

So to summarize, nobody here has the credentials to criticize Kurzweil's hypothesis, and the blue is not a place to falsify hypotheses? You've basically just placed the hypothesis under discussion out of bounds by an appeal to authority, as well as an unproven premise (Kurzweil is such a genius that discussion of his log-log charts and thought experiments is beyond us). As KJS stated, Kurzweil's hypothesis stands by itself, regardless of our high or low opinions of K. himself. Linus Pauling and Isaac Newton were both phenomenally bright people, but that makes no difference to my opinion of Vitamin C megadosing and alchemy.

There have been many good points brought up (among the snark) about why Kurzweil's vision is simplistic and extremely difficult to disprove (and thus a bad hypothesis), as well as the host of criticisms on Wikipedia.

Personally, I think most of Kurzweil's claims of the future will be borne out (immortality, machine intelligences, hybrids, etc.) but his vision is mostly irrelevant to the practical problems of implementing such improvements (as in the Jules Verne comparison made earlier), and he really underestimates the complexity and hard physical limits of some of the goals (mostly biological, since that's what I'm most familiar with -- e.g. increased sequencing speed does make certain problems tractable but the host of subsequent hypotheses generated immediately run into several brick walls which limit the rate of real progress). I've seen him talk but it wasn't long enough for him to discuss his concept of the singularity as anything but descriptive; it wasn't clear to me that the concept is helpful in achieving the vision.
posted by benzenedream at 2:33 AM on June 17, 2010


Vibrissae: "Not one person on this thread had the breadth of scientific knowledge that Kurzweil brings to the plate."

That may be so, but I stand by my claim that Kurzweil knows very little about artificial intelligence and mind uploading, because these are subjects about which very little is known. Likewise for all the scientific fields that are relevant to Singularity issues. This isn't nonsense, it's a simple summary of humanity's present state of knowledge. And it's not a dig at Kurzweil specifically: I'm willing to believe that he has an excellent command of all that's known about these relevant subjects. I'm just saying that this doesn't add up to much, as yet.

"Kurzweil is positing that these things might happen"

Sure, and so are a lot of sci-fi authors. They speculate about what might happen and sell it for $25 a time on Amazon. They don't set up a "University" and charge $25,000. That's what's provoking a lot of the scorn in this thread. When you're teaching people well-established subjects with a lot of solid content in them, you can justify charging a few thousand dollars. When your course content is "Here's a few things that may or may not pan out; now let's speculate while we wait and see", that kind of fee is harder to justify.
posted by logopetria at 2:34 AM on June 17, 2010 [1 favorite]


When your course content is "Here's a few things that may or may not pan out; now let's speculate while we wait and see", that kind of fee is harder to justify.

You mean like a lot of costly courses on Entrepreneurship in the business schools at Harvard and Stanford? You mean like the very bold (and expensive) bets made by venture capitalists? You mean the costly trips, many hundreds of years ago, to discover the New World? You mean the best guesses we made about getting to the moon? Just funnin'. In fact, Kurzweil is speculating. Lighten up and enjoy it, and if some people want to pay for the privilege to "follow" Kurzweil, so what? They're not hurting anyone, and neither is Kurzweil. Bottom line: he doesn't deserve vitriol (in my book, at least). Personal attacks on the guy are (IMHO) off base. He's done a ton of good for mankind.

Personally, I think most of Kurzweil's claims of the future will be borne out (immortality, machine intelligences, hybrids, etc.) but his vision is mostly irrelevant to the practical problems of implementing such improvements (as in the Jules Verne comparison made earlier), and he really underestimates the complexity and hard physical limits of some of the goals (mostly biological, since that's what I'm most familiar with -- e.g. increased sequencing speed does make certain problems tractable but the host of subsequent hypotheses generated immediately run into several brick walls which limit the rate of real progress). I've seen him talk but it wasn't long enough for him to discuss his concept of the singularity as anything but descriptive; it wasn't clear to me that the concept is helpful in achieving the vision.

Well said. Frankly, the Singularity meme does little more than suggest a metaphorical framework, though a lot of thought and strong educated guesses have gone into it. I, for one, would not pursue a degree in Singularity studies, but when one thinks about the convergent sciences that are forming new scientific fields as we speak, the irony is that many people are already hard at work developing the Singularity meme in the course of their everyday cutting-edge work, without realizing it. Me? I'll wait and see, and in the meantime enjoy reading Kurzweil's and others' speculations about the future. I take Kurzweil seriously; he's a heavyweight. Maybe he's wrong, but what he's proposing is thought-provoking and stimulating.

Anyone here can criticize his ideas, but take care about getting into a game with someone who has far more (in this case) intellectual "weight". Kurzweil is scary smart; that doesn't mean he's right. It does mean that at least *some* of what he posits is worth betting on. I'll take that bet.
posted by Vibrissae at 10:17 PM on June 17, 2010




This thread has been archived and is closed to new comments