Reinvent Yourself
April 26, 2016 5:43 AM   Subscribe

We’re approaching a point where technological progress will become so fast that everyday human intelligence will be unable to follow it. The Playboy Interview with Ray Kurzweil.
posted by T.D. Strange (80 comments total) 14 users marked this as a favorite
 
When people talk about the future of technology, especially artificial intelligence, they very often have the common dystopian Hollywood-movie model of us versus the machines. My view is that we will use these tools as we’ve used all other tools—to broaden our reach. And in this case, we’ll be extending the most important attribute we have, which is our intelligence.
Obviously, as anyone who has watched the development of the World Wide Web since 1994 can attest.
posted by entropicamericana at 5:59 AM on April 26, 2016 [16 favorites]


My view is that we will use these tools as we’ve used all other tools—for scams, porn, and war.
posted by Foosnark at 6:07 AM on April 26, 2016 [22 favorites]


According to some views, we won't be extending our intelligence, we'll be duping it. Pulling the wool over our own eyes, as the Church of the SubGenius says.
posted by fleetmouse at 6:10 AM on April 26, 2016 [1 favorite]


I thought the Singularity had been laid to rest?
posted by infini at 6:35 AM on April 26, 2016 [1 favorite]


No, it's just been moved back to 2045. ~30 years from now, as ever.
posted by BuddhaInABucket at 6:46 AM on April 26, 2016 [13 favorites]


I thought Playboy had gotten out of the porn business.
posted by Slothrup at 6:53 AM on April 26, 2016 [4 favorites]


My view is that we will use these tools as we’ve used all other tools—for scams, porn, and war.

And cat videos -- surely the greatest achievement of any modern technology
posted by briank at 6:57 AM on April 26, 2016 [10 favorites]



I thought Playboy had gotten out of the porn business.

I only read it for the tentacles
posted by thelonius at 7:01 AM on April 26, 2016 [8 favorites]


Have the singularity proponents actually got anything backing up their assertions that the singularity will be achieved in our lifetime? It seems like the basic logic is that Moore's law + machine learning and voilà, Godlike AI.

Even though many of the essential building blocks that would presumably be prerequisites for actual AI haven't really been developed yet. That and we are starting to run up against some hard limits in terms of how many transistors we can continue to pack onto silicon wafers without making some really significant advances.

It just seems like there is a belief that we can maintain a massive increase in technical and scientific knowledge despite the tendency for the human race to enter long periods of relative technological plateaus.
posted by vuron at 7:07 AM on April 26, 2016 [3 favorites]


And cat videos -- surely the greatest achievement of any modern technology.

Thanks for giving me the occasion to share Blaise Pascal's most prophetic pensée that attests to his brain already being in the cloud: "Who can be unaware that the sight of cats [...] can unhinge reason completely?"
posted by sapagan at 7:15 AM on April 26, 2016 [4 favorites]


I have three seconds to come up with something clever to say, and the 300 million modules in my neocortex won’t cut it. I need a billion modules for two seconds.

My mom worked for Kurzweil Music Systems back in the 1980s. She was a receptionist. Around our house he was known as Ray. Ray, as in "Ray can't run a business for shit."

One day at the front desk my mom told Ray a joke. Later that day, Ray was giving a company-wide talk and he told the same joke without giving my mom credit.

Ray, my friend, you don't need a billion more modules to come up with something clever to say, you just need a short little Irish woman with a quick wit from whom you can steal jokes and not give credit. Fucker.
posted by bondcliff at 7:28 AM on April 26, 2016 [75 favorites]


Therapy for grief and dealing with the death of a beloved parent might be helpful.
posted by infini at 7:31 AM on April 26, 2016 [2 favorites]


We'll know when the Singularity arrives as Ray Kurzweil will stop talking about it.
posted by Damienmce at 7:33 AM on April 26, 2016 [3 favorites]


These are Ray's predictions for 2019, from his 1999 book, the Age of Spiritual Machines:

• A $1,000 computing device (in 1999 dollars) is now approximately equal to the computational ability of the human brain.
• Computers are now largely invisible and are embedded everywhere: in walls, tables, chairs, desks, clothing, jewelry, and bodies.
• Three-dimensional virtual reality displays, embedded in glasses and contact lenses, as well as auditory “lenses,” are used routinely as primary interfaces for communication with other persons, computers, the Web, and virtual reality.
• Most interaction with computing is through gestures and two-way natural-language spoken communication.
• Nanoengineered machines are beginning to be applied to manufacturing and process-control applications.
• High-resolution, three-dimensional visual and auditory virtual reality and realistic all-encompassing tactile environments enable people to do virtually anything with anybody, regardless of physical proximity.
• Paper books or documents are rarely used and most learning is conducted through intelligent, simulated software-based teachers.
• Blind persons routinely use eyeglass-mounted reading-navigation systems. Deaf persons read what other people are saying through their lens displays. Paraplegic and some quadriplegic persons routinely walk and climb stairs through a combination of computer-controlled nerve stimulation and exoskeletal robotic devices.
• The vast majority of transactions include a simulated person.
• Automated driving systems are now installed in most roads.
• People are beginning to have relationships with automated personalities and use them as companions, teachers, caretakers, and lovers.
• Virtual artists, with their own reputations, are emerging in all of the arts.
• There are widespread reports of computers passing the Turing Test, although these tests do not meet the criteria established by knowledgeable observers.


Ray's full of shit and terrified of his own mortality. Why are we still listening to him?
posted by leotrotsky at 7:34 AM on April 26, 2016 [27 favorites]


Here are his predictions for 2009, by the way:

• Most books will be read on screens rather than paper.
• Most text will be created using speech recognition technology.
• Intelligent roads and driverless cars will be in use, mostly on highways.
• People use personal computers the size of rings, pins, credit cards and books.
• Personal worn computers provide monitoring of body functions, automated identity and directions for navigation.
• Cables are disappearing. Computer peripherals use wireless communication.
• People can talk to their computer to give commands.
• Computer displays built into eyeglasses for augmented reality are used.
• Computers can recognize their owner's face from a picture or video.
• Three-dimensional chips are commonly used.
• Sound producing speakers are being replaced with very small chip-based devices that can place high resolution sound anywhere in three-dimensional space.
• A $1,000 computer can perform a trillion calculations per second.
• There is increasing interest in massively parallel neural nets, genetic algorithms and other forms of "chaotic" or complexity theory computing.
• Research has been initiated on reverse engineering the brain through both destructive and non-invasive scans.
• Autonomous nanoengineered machines have been demonstrated and include their own computational controls.

posted by leotrotsky at 7:37 AM on April 26, 2016 [6 favorites]


I'm amazed that people take this clown seriously. He is the kind of guy this world has way too many of, who can say things about topics you don't know much about that seem to make sense, but whenever he strays into a topic you happen to have any expertise in, shows just how catastrophically naive and unjustifiably confident he is in his own lack of deep knowledge about anything. As a molecular microbiologist I can tell you that he sounds like a pathetically bullshit-prone undergrad whenever he talks about things like the infection risks of implants, or the underlying biology of cognition, or genetics, or 'nanobots' designed to interact with human systems. He also basically comprises the woo-woo bullshit end of synthetic biology that produces all of the smoke and none of the fire while basically existing to con people into believing that they're smarter than the real scientists doing real work.
"That’s another example of exponential growth: HIV took five years to sequence; SARS took 31 days. We can now do it in one day. So we can then very quickly create either an RNA-interference-based medication or an antigen-based vaccine and spread protection quickly if there were an outbreak. This is part of the protocol that emerged from the Asilomar Conference, which established guidelines and ethical standards for responsible practitioners, as well as a rapid-response system just in case."
He is using these big words like someone who doesn't actually know what they mean, or what the challenges actually are for public health responses to novel pathogens, but desperately wants to seem like he does. Besides, the reason why it's absurd to fear the potential for advances in recombinant biology to be applied to bioterrorism is that, for better or worse, both the United States and the Soviet Union perfected the destructiveness of bioweapon technology in the 70s, well past the horrific point where it could ever conceivably be 'improved', with a collection of terrifying bacteria and viruses. The techniques involved were largely all available at the turn of the last century. Since then the only work that has been done with bioweapons has been on countermeasures against them, which idealistically would be for defensive purposes and cynically would only make their deployment more useful.
posted by Blasdelb at 7:38 AM on April 26, 2016 [16 favorites]


"Here are his predictions for 2009, by the way:"

So, what you're saying is, is that he's 7 years off?
posted by I-baLL at 7:40 AM on April 26, 2016 [11 favorites]


"In the future AI's will become bullshit futurologists doing speaking engagements to management consultants"
posted by Damienmce at 7:41 AM on April 26, 2016 [3 favorites]


Exponential growth.
posted by bukvich at 7:42 AM on April 26, 2016 [3 favorites]


No, it's just been moved back to 2045. ~30 years from now, as ever.

Well, we'll have had nuclear fusion for 20 years by then, so that'll help.
posted by leotrotsky at 7:42 AM on April 26, 2016 [9 favorites]


Ray's full of shit and terrified of his own mortality. Why are we still listening to him?

Because he will go through that list and explain how he predicted e-readers, self-checkout at supermarkets, Google Glass, the Internet of Things, and MOOCs. Journalists wanting "big idea" stories will let him. If one does a mild criticism ("doesn't look like we'll have cured paralysis by 2019, Ray") he will explain that critics are too stupid to understand exponential growth. Exponential growth means any hint of something justifies Kurzweil being right in just a few years, what with doubling and all.

In the article he claims he's been "consistent on dates" for decades, but says the singularity is 2045. I could have sworn it was 2035.

He's such a con-man. Maybe he's like Harold Hill in The Music Man and forgets there's not really a band, but he's still peddling crap to convince other people. Why reasonable competent people, whether at google or generally solid shows like On The Media give him the time of day I don't know.
posted by mark k at 7:45 AM on April 26, 2016 [1 favorite]


The man is basically OMNI magazine incarnated in human form.
posted by thelonius at 7:46 AM on April 26, 2016 [15 favorites]


"Here are his predictions for 2009, by the way:"

So, what you're saying is, is that he's 7 years off?


Notice his 2009 predictions are of three types:

1. Prosaic and already in existence in 1999. People can talk to their computer to give commands.
2. Vague enough to be handwaved away. Research has been initiated on reverse engineering the brain through both destructive and non-invasive scans. (What exactly does that mean, "research initiated"?)
3. Wrong. Autonomous nanoengineered machines have been demonstrated and include their own computational controls.
posted by leotrotsky at 7:47 AM on April 26, 2016 [11 favorites]


The thing that's always puzzled me about this particular strain of belief, is that it treats technological development as being this process that takes place externally to people and the societies in which they exist. Which is weird because Kurzweil's actually invented things, and he presumably spent time and energy thinking about the problem he wanted to solve, and how to solve it.

I suppose it speaks either to his capacity for creating things or his self-regard. Probably both...
posted by phack at 7:49 AM on April 26, 2016


Kurzweil defends his predictions in this PDF and gives himself an (optimistic) score of 86%. Maybe he overestimated the longevity of Moore's Law ($1,000 buys you a 7 TFLOPS nVidia card) but we do have LTE, iPad, VoiceOver, SSDs, Oculus, Kindle and Fitbits.

Maybe he should lay off on the skeezing on young pop stars, though.
posted by RobotVoodooPower at 7:56 AM on April 26, 2016


There's been talk that Moore's "law" is ending. Probably true for regular PC CPU chips, but the cost/performance curves for other elements (memory, storage, GPUs) are all still looking exponential (yeah, yeah, probably an "S" curve, but getting steeper right now). And it is looking like lots of data may be more important than super clever AI algorithms.

For a number of years I had a rough yardstick for my own observation, the "cost of a gigabyte". I've skipped the next order of magnitude; TBs are not free yet but pretty darn cheap. So how much does a petabyte (1,000 TB) cost? It's a bit tricky at the moment, but probably less than a top-end sports car.

The buzz quote on the RMV crap-news headline banner was "a single web page today is larger than the Doom download". Bandwidth is exploding; I don't have a curve handy, but fiber tech is not standing still.

Recent headline: "chip in paralyzed woman allows her to move fingers"

When will the implant tech take hold? Who would implant for "fun" anyway? Who would pierce themselves for fun anyway? Who doesn't have a cell phone? The first minimally useful implant tech will explode so fast that within months of the first "p'lants", the slow adopters will just get pitying looks.

Will Kurzweil have gotten the details exact? Ha, but in 2030, in retrospect, it'll be more like: well, he tried to be outrageous but didn't have close to enough imagination.
posted by sammyo at 7:58 AM on April 26, 2016 [2 favorites]
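[The "cost of a gigabyte" yardstick above is just exponential decay; here is a minimal sketch, assuming an illustrative starting price and halving time. Both numbers are invented for the example, not real market data.]

```python
# Exponential cost decay: dollars per gigabyte if the price halves
# every `halving_years` years. The base year, base cost, and halving
# time are invented for illustration, not measured data.
def cost_per_gb(year, base_year=2000, base_cost=10.0, halving_years=2.0):
    return base_cost * 0.5 ** ((year - base_year) / halving_years)

for year in (2000, 2008, 2016):
    print(year, round(cost_per_gb(year), 4))
```

[A flat $69-for-1TB price across six years, as noted downthread, is exactly the kind of data point this curve can't absorb without stretching the halving time.]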


I don't know. My favorite story about not being able to come up with something to say is the one where Tom Hanks runs into Tom Selleck at the pisser and turns to him and says, "Looks like we're a couple of peeing Toms!" I don't think Tom Hanks would have needed nanotechnology or the cloud to access a simultaneously perfect and dumb one-liner like that, and I doubt that the cloud would have avoided Tom Selleck's angry silence after Hanks made the comment, either.
posted by blucevalo at 8:06 AM on April 26, 2016 [4 favorites]


There's been talk that Moore's "law" is ending. Probably true for regular PC CPU chips, but the cost/performance curves for other elements (memory, storage, GPUs) are all still looking exponential (yeah, yeah, probably an "S" curve, but getting steeper right now). And it is looking like lots of data may be more important than super clever AI algorithms.

Six years ago I bought a 1TB external USB drive from a shop in town for $69. I checked their web site today and a similar* 1TB external drive is going for...$69. It's not the only case where I'm seeing a distinct slowing down of Moore's law.

*It's USB 3.0, so somewhat better than the one I bought.
posted by rocket88 at 8:11 AM on April 26, 2016


One immediately notices his lustrous and almost plastic-looking skin—a byproduct of supplementing his diet with phosphatidylcholine.

My prediction for 2019: Ray Kurzweil will have shed his skin, which will then be repurposed to make iPhone 23 cases.
posted by aeshnid at 8:15 AM on April 26, 2016 [1 favorite]


There were a few articles about the end of Moore's law linked on the blue a while back- in my mind the most exciting thing about it is that it means there will be a burst of innovation in software/coding, because you won't be able to depend on cheap hardware to do the work for you. I think that will lead to some big advances. Other than that, we're waiting for quantum computing, aren't we?
posted by BuddhaInABucket at 8:15 AM on April 26, 2016


What's the difference between "the slowing of Moore's Law" and various schemes to defeat the rules of supply and demand to keep prices up? Because, you know, if we can get people to believe that $69 is the natural starting point for 1TB hard drives, there's nowhere to go but up.
posted by sneebler at 8:17 AM on April 26, 2016 [2 favorites]


Reading the title of this post went something like:

Hey well interesting concep... oh Kurzweil.
posted by Splunge at 8:20 AM on April 26, 2016 [7 favorites]


What's the difference between "the slowing of Moore's Law" and various schemes to defeat the rules of supply and demand to keep prices up?

See also: Entry-level storage in Macs and iOS devices.
posted by entropicamericana at 8:34 AM on April 26, 2016 [1 favorite]


Six years ago I bought a 1TB external USB drive from a shop in town for $69. I checked their web site today and a similar* 1TB external drive is going for...$69. It's not the only case where I'm seeing a distinct slowing down of Moore's law.

There are price floors on things, though. Now you can get 4TB drives for the former price of 2TB drives. Heck, I bought a 2TB drive for $89 over the weekend, which seems a lot cheaper than 6 years ago.
posted by GuyZero at 8:40 AM on April 26, 2016


Even though paper is obsolete, for some reason I still have my copy of his 1990 book The Age of Intelligent Machines, in which we learn:
- Once we have miniature computers more or less like iPads, children will enjoy "optimal education" because they are able to converse with realistic simulations of the Founding Fathers. The transformation of education should be complete by 2010.
- Heart disease and cancer are likely to be conquered early in this century.
- People will communicate over long distances using robotic imitators that mimic their movements and expressions. [???]
posted by mubba at 8:45 AM on April 26, 2016 [1 favorite]


It helps if you have Ray Kurzweil permanently confused with Ray Kroc, as I do, so every time there's a story where he's making predictions about the future, you go, "Why would I care what the dead CEO of McDonald's thinks about the singularity?" It makes him much less annoying.

(Plus I'm secretly pretty sure this would piss Kurzweil off like whoa.)
posted by Eyebrows McGee at 8:47 AM on April 26, 2016 [9 favorites]


I put this in the Alvin Toffler category.
posted by GallonOfAlan at 8:49 AM on April 26, 2016


The thing that's always puzzled me about this particular strain of belief, is that it treats technological development as being this process that takes place externally to people and the societies in which they exist.

This. I don't think the Singularity as a concept is inherently ridiculous, but technological change isn't something that just happens. You either need massive investment in basic research (usually governmental) or a growing economy whose consumers are willing to spend larger amounts of disposable income on new technologies, or can draw on small-s socialized systems to finance them (does your health insurance plan cover these nanotech T-cells?).

I've been reading The Rise and Fall of American Growth, which has a lot to say about how technological change and its societal effects have actually worked out from 1870 up to the present, and how radical changes in lifestyle due to technology were far more the province of 1870-1940 than they have been since. You saw similar "exponential growth" trends with the explosion of things like the radio and the automobile over two to three decades as we've seen with computers since the '80s, and with far greater effects on everyday life.

None of this stuff happens in a vacuum.
posted by AdamCSnider at 8:50 AM on April 26, 2016 [7 favorites]


Kurzweil, if you're reading this, try out: "Hey Larry! Working hard.... or HARDLY WORKING??"

He already stole that joke from Carlos Mencia.

Six years ago I bought a 1TB external USB drive from a shop in town for $69. I checked their web site today and a similar* 1TB external drive is going for...$69.

Not to get all Kurzweily, but Moore's Law is just one dimension to the general phenomenon of stuff getting faster, smaller, cheaper, more energy-efficient, etc. It's those other dimensions that are getting attention, now, it seems. But we don't have a catchy phrase for that other than "progress".
posted by a lungful of dragon at 8:52 AM on April 26, 2016


Tl;dr: one of these days! In the very near future! Nanobots will solve all my problems and make people like me and I won't be racked with worry about anything ever again.
posted by The Whelk at 8:53 AM on April 26, 2016 [1 favorite]


I think people are being massively overoptimistic about- Hey! What are these things crawling over me! Oh my God, they're crawling into my eyes! MY EYES! HELP M-

...My, the Whelk truly is an excellent fellow, yes? On that we can all agree!
posted by happyroach at 9:04 AM on April 26, 2016 [2 favorites]


It's a very popular received idea that technology is a value-neutral entity that can be employed for any purpose - it's just up to "us" to determine what to use it for. Up to whom? Isn't it kind of dangerous to just assume that powerful new breakthroughs (suppose, say, there's at least some sort of little singularity and something like ubiquitous computing, enhanced by AI, emerges) will be used only for extending the glory and freedom and power of the new digital amphibian individual? I wonder if Kurzweil has talked about the dystopian potential for supposedly exponentially-growing tech. Isn't it likely to be used by those who already control wealth and power, to consolidate their position?
posted by thelonius at 9:06 AM on April 26, 2016 [2 favorites]


Thing is, though, Kurzweil doesn't (or maybe he does, which I guess would reduce my regard for him even further) recognize or account for the fact that all his utopian woowoo is predicated on a globally oppressed underclass making all these devices cheaper and cheaper, and the massive energy demands for a bunch of head-in-the-clouds privileged rich guys to live incrementally longer are fucking up the planet longterm. Dude, just accept your fucking mortality already. The rest of us have no desire to pay for your bullshit utopia of theoretically digitized white dudes.
posted by Existential Dread at 9:08 AM on April 26, 2016 [11 favorites]


mubba: "Even though paper is obsolete, for some reason I still have my copy of his 1990 book The Age of Intelligent Machines"

There's something oddly satisfying about the fact that there is no electronic version of this book available anywhere, as far as I can tell.
posted by crazy with stars at 9:14 AM on April 26, 2016 [1 favorite]


Kurzweil working at Google is a sign of the Apocalypse. Luckily all his dog robot drones will fall over when crossing ditches.
posted by benzenedream at 9:17 AM on April 26, 2016


The chief problem with the singularity is that technology is a socioeconomic system, and you can't just cut the rate-limiting problems of human labor and economics out of the system. For example, exponential increases in computer memory did not make the labor of working with words exponentially more efficient. In fact, it often became less efficient because people using word processors took on the additional labor of more iterations and typesetting.

The industries that had the greatest gains from computerization did so by raw labor replacement, not labor enhancement. Telecom got rid of an entire segment of their technical workforce that had previously been feminized in order to reduce labor costs.
posted by CBrachyrhynchos at 9:20 AM on April 26, 2016 [2 favorites]


It's like " will any of this happen before the majority of the earth's landmass is uninhabitable by humans?" So I'm going with no.
posted by The Whelk at 9:25 AM on April 26, 2016 [2 favorites]


how many friedmans in a kurzweil?
posted by j_curiouser at 9:52 AM on April 26, 2016 [6 favorites]


What I really think about the singularity is that there's gonna be like three or four of 'em.
posted by newdaddy at 9:57 AM on April 26, 2016 [1 favorite]


Ah, but a man's reach should exceed his grasp, Or what's a heaven for?


You guys are downers.
posted by Artful Codger at 10:00 AM on April 26, 2016 [2 favorites]


Ah, but a man's reach should exceed his grasp, Or what's a heaven for?

Damn straight. I encourage you all to check out my upcoming book:

In Five Years We'll All Be Unicorns: No, It's Not a Metaphor, We'll Literally Be Unicorns
posted by leotrotsky at 10:09 AM on April 26, 2016 [10 favorites]


It helps if you have Ray Kurzweil permanently confused with Ray Kroc, as I do, so every time there's a story where he's making predictions about the future, you go, "Why would I care what the dead CEO of McDonald's thinks about the singularity?"

Ironically, Kroc was equally obsessed with finding the secret to immortality. He spent the last two decades of his life funneling millions of dollars into artificial life extension research. On January 14, 1984, mere seconds before his death, a team of McDonald's scientists were able to successfully upload his consciousness into a giant hamburger, creating a colossal, golem-like entity which they jokingly (if nervously) dubbed Mayor McCheese.

The Object lived for six days and seven nights, lumbering silently throughout the halls of the laboratory, clumsily knocking equipment to the floor while occasionally attempting to physically interact with objects, but not people. Never people. In fact, it appeared to be completely oblivious to the existence of any organic thing, whether animal, vegetable, or mineral.

On the last night, the thing went missing for three hours, causing a panic throughout the building. In the end, a night watchman discovered the carcass curled up in the back corner of a storage closet, its bun-like skin moldy and cold to the touch. On the floor, scrawled in ketchup, were the barely legible words "EMPTYE SPACE".
posted by Atom Eyes at 10:12 AM on April 26, 2016 [16 favorites]


Six years ago I bought a 1TB external USB drive from a shop in town for $69. I checked their web site today and a similar* 1TB external drive is going for...$69. It's not the only case where I'm seeing a distinct slowing down of Moore's law.

No, you're seeing a slowdown in optimization of obsolete technology. There's no money going into making a 1TB hard disk drive cheaper - it's going into making 1TB solid state drives faster, more reliable, and cheaper. SSDs are amazingly inexpensive, and getting more so. They will overtake spinning platters: silicon fabrication is a lot harder to set up initially, but much easier to crank out chip after chip after chip, very cheaply, once the initial investment is paid for.

More, there are knock-on effects for cheap, reliable and voluminous SSD storage - we're coming into an era where $400 Android flagship phones are the rule rather than the exception. This is Moore's Law in action.
posted by Slap*Happy at 10:15 AM on April 26, 2016 [3 favorites]


Thing is, though, Kurzweil doesn't (or maybe he does, which I guess would reduce my regard for him even further) recognize or account for the fact that all his utopian woowoo is predicated on a globally oppressed underclass making all these devices cheaper and cheaper, and the massive energy demands for a bunch of head-in-the-clouds privileged rich guys to live incrementally longer are fucking up the planet longterm.

Nah he just thinks machines are going to take over all the production also fixing the environment.
posted by atoxyl at 10:44 AM on April 26, 2016 [1 favorite]


Anyway, predictions of exponential improvement that follow directly from Moore's Law don't impress me that much. Predictions of exponential improvement that Ray you can't have an exponential when it's not quantitative you're just putting arbitrary milestones in human innovation on a graph and saying 'exponential'...
posted by atoxyl at 11:02 AM on April 26, 2016


What I really think about the singularity is that there's gonna be like three or four of 'em.

Well you clearly don't grasp the power of tar:

> tar -czf - /var/local/singularity | ssh million.vm.instances.com "cd / && tar -xzf - && cd singularity && ./awaken.exe"

rather silly, kludgy, fake, pretend example; will not run, will not be needed when she awakens, yes, the singularity is female
posted by sammyo at 11:08 AM on April 26, 2016 [1 favorite]


I actually mentioned transhumanism in a recent article. I called it "a dangerous New Age memetically transmitted infection common to billionaires who want to live forever."
http://gamemoir.com/opinions/ecgc-day-two-day-three-reflections-part-one/
posted by smashthegamestate at 11:15 AM on April 26, 2016 [1 favorite]


The humans will never know what hit them...
posted by littlejohnnyjewel at 11:20 AM on April 26, 2016


Transhumanism - is that where everyone uses the same bathroom?
posted by Artful Codger at 11:24 AM on April 26, 2016 [3 favorites]


leotrotsky, you seem like you're trying to read his predictions as prophecy instead of tracing the future of technology. he's been broadly correct on a lot of it, even if the specifics weren't quite predictable decades prior. he can be uncharitably read as a huckster, but in my view he's been mostly right so far.
posted by p3on at 11:38 AM on April 26, 2016 [1 favorite]


If we're talking about wrong visions of the future, I prefer David "Shingy" Shing's -- on the basis of hairstyle alone. That's a digital prophet, my friend. This Kurtzweil cat's just another unfunny clown with a Netflix distribution contract. He's the Ralphie May of digital prophecy.
posted by smashthegamestate at 11:50 AM on April 26, 2016


By the 2030s we will have nanobots that can go into a brain non-invasively through the capillaries, connect to our neocortex and basically connect it to a synthetic neocortex that works the same way in the cloud.

Who's this "we" he's talking about? I'm not going to have any nanobots connecting my brain to anything, thank you very much.

Sent from my iBrain
posted by Devils Rancher at 11:51 AM on April 26, 2016 [3 favorites]


"No other animal can keep a beat. No other animal can tell a joke."

Crickets and cicadas, primitive as they are, keep great rhythm; the calls of birds, trills and warbles, are all composed of rhythm and emphasis, tone, and key.

Many animals play and joke. I think even the great apes trained in speech or signing, joke.

His goals are pathetically mortal. His plans for us as a species also pathetic, and bound to his idealizations, his limited and typical desires. And, he is creepy, and the article is in Playboy.
posted by Oyéah at 12:35 PM on April 26, 2016 [2 favorites]


By the 2030s we will have nanobots that can go into a brain non-invasively through the capillaries, connect to our neocortex and basically connect it to a synthetic neocortex that works the same way in the cloud, with your choice of Google or Amazon Cloud services and their related EULA and terms of service, including an irrevocable, royalty-free, and non-exclusive right to use your likeness and replicated cognitive pathways for Google/Amazon's purposes, including but not limited to promotional and commercial use.
posted by Existential Dread at 1:47 PM on April 26, 2016 [2 favorites]


That’s another example of exponential growth: HIV took five years to sequence; SARS took 31 days. We can now do it in one day. So we can then very quickly create either an RNA-interference-based medication or an antigen-based vaccine and spread protection quickly if there were an outbreak. This is part of the protocol that emerged from the Asilomar Conference, which established guidelines and ethical standards for responsible practitioners, as well as a rapid-response system just in case.

This is pretty neat. I didn't know this and never thought about it.
posted by polymodus at 2:01 PM on April 26, 2016


My favorite Ray Kurzweil story, courtesy of one of my favorite writers:

"At a TEDx event i attended, Ray Kurzweil gave a terrible talk and halfway through the TEDx sign fell from where it was hanging above the stage and broke.

It was the best part of the talk."
posted by gramschmidt at 2:52 PM on April 26, 2016


Moore's Law refers specifically to the prediction/observation that the number of transistors in integrated circuits doubles/doubled every two years.

It's being... wildly misused in this thread. It has nothing to do with other rates of technological advancement.

Various types of technology have their own issues and pace. As GuyZero mentioned, for something like hard drives there's a kind of basic price floor (after all, things like the housing/etc are fairly uniform) but the top end has come down a lot. But at this point there's not much benefit for the consumer, as very few people have a use for multiple TB of local storage in a world where cloud storage is often preferred (not by all, but by many/most for the obvious gains in convenience). And cloud storage technologies have their own price/performance concerns (they have much higher disk utilization than a PC drive which spends a lot of time idle, etc).

Among other things, offloading computation from local devices makes it possible to do pretty neat things on phones/etc without having to make them dramatically more powerful (one of many ways to work around the end of Moore's Law CPU progression).

But... there's a difference between "technology will keep getting more powerful" and something like the Singularity. The latter requires more than simply ever-increasing technological "power" and many/most technology people I know (including myself) don't believe it will ever happen. I'm not convinced Strong AI is possible (and even less convinced that it is desirable), or "digitization" (which probably first relies on some kind of Strong AI). And in the unlikely scenario where we have a massively powerful Strong AI, I have no idea why it would care about humans at all.
posted by thefoxgod at 3:29 PM on April 26, 2016 [2 favorites]


The industries that had the greatest gains from computerization did so by raw labor replacement, not labor enhancement. Telecom got rid of an entire segment of their technical workforce that had previously been feminized in order to reduce labor costs.

For what it's worth, "computer" was originally a term applied to a category of feminized worker.


The capability of information technology doubles each year.
[...]
The difference between myself and my critics is that we’re looking at the same reality, but they apply their linear intuition about where we will go, and I’m thinking about it from the exponential perspective.
For someone who loves to talk about exponential growth, Kurzweil seems unwilling to accept that exponential growth must always have limits, instead preferring to extrapolate into the realms of total fantasy. To be fair, this is fun, which is why I like to do it too.

Let's play the chessboard game he is so fond of with the Top 500 List and his absurd annual doubling axiom. As of November 2015, the sum Rmax in PFLOPS was 421. For simplicity's sake, I take this as equivalent to 10^20 operations/second as defined by discrete quantum states for the purpose of the Margolus-Levitin Theorem, which places a limit for an ideal quantum computer of 6 × 10^33 operations per second per joule. If this looks suspiciously like the reciprocal of the Reduced Planck Constant, well, it should. This is basically an accounting of the maximum discrete state transitions for any given amount of time or energy. Assuming a device can perfectly control these states, this corresponds effectively to the number of computational operations.

Right now, the Top 500 List could theoretically be replaced by a perfect quantum computer that needed about a single X-ray photon's worth of energy to operate. We are suitably still very micro-scale here, which makes sense, because the Planck Scale is tiny compared to the world of even electrons and photons. But Kurzweil's crazy techno-optimism knows we don't have to settle. By 2045, a year he prognosticates about in this article, the Top 500 list could be replaced by automagical computronium controlling 100 erg worth of Planck-level distinct states. This is a pretty damn macroscopic amount of energy to control so precisely.

At the end of the chessboard, which is only the year 2080, the Top 500 list is hitting on the order of 10^40 operations/second, which corresponds to the kinetic energy of a Honda Civic at highway speeds. I think we are now safely into the realm of the absurdly macroscopic. I don't know what kind of machine would be needed to do this, but let's just play along and assume this perfect computer has a perfect entropy-reversomat.

But this isn't enough. That only gets us to 2200 or so before our fastest 500 supercomputers collectively perfectly control the total mass-energy of the sun. And then only another century or so until the total mass-energy of the visible universe, including all dark matter and dark energy, is wholly used up performing calculations. I/O is left as an exercise for the reader.
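The arithmetic above can be sketched quickly, taking the 10^20 ops/second baseline and strict annual doubling as given (the rounding of 421 PFLOPS up to 10^20 and the textbook values for the sun's mass and the Margolus-Levitin bound are the only assumptions):

```python
import math

# Assumptions from the argument above: Top 500 sum rounded to 1e20 ops/s
# in late 2015, strict annual doubling, and the Margolus-Levitin bound
# of ~6e33 operations per second per joule for an ideal quantum computer.
BASELINE_OPS = 1e20      # ops/second, November 2015
BASE_YEAR = 2015
ML_BOUND = 6e33          # ops per second per joule
C = 3e8                  # speed of light, m/s

def year_to_reach(target_ops_per_sec):
    """Year when annual doubling carries the baseline to the target rate."""
    return BASE_YEAR + math.log2(target_ops_per_sec / BASELINE_OPS)

# Energy needed each second to sustain the ~1e40 ops/s end-of-chessboard
# figure: about 1.7e6 joules, i.e. a Civic at highway speed.
joules_for_2080 = 1e40 / ML_BOUND

# Year the list perfectly computes with the sun's total mass-energy
# (E = mc^2, m ~ 2e30 kg): roughly 2218.
sun_ops = ML_BOUND * (2e30 * C**2)
print(round(year_to_reach(sun_ops)))
```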

So how does Kurzweil respond to this? Does he accept that physical limits will eventually bend the growth curve down? No. He bites the bullet and accepts the fantastical consequences of adhering to Moore's Law as if it were a physical theorem. He predicts that within a few hundred years, humanity will have turned the whole universe into computronium because wormholes and he would rather extrapolate an observation about advances in semiconductor lithography into an inviolable natural law because his fear of death has eaten the thinky parts.
posted by [expletive deleted] at 3:38 PM on April 26, 2016 [6 favorites]


Kurzweil defends his predictions in this PDF

I'm skimming through this and boy does he try to get himself off on some fun technicalities in places.


PREDICTION: Human musicians routinely jam with cybernetic musicians.
ACCURACY: Correct

DISCUSSION: There are many software packages that will accompany you with rhythm tracks that adjust to your playing, walking bass lines, and other accompaniments. (Left) Apple’s Magic GarageBand Jam (Image courtesy of Apple Inc.) Such “auto accompaniment” software is also built into home digital keyboards. Games such as Guitar Hero involve computer-generated music tracks in real time. Apple’s GarageBand software includes Magic GarageBand Jam, which lets you jam with a full-screen band.

posted by atoxyl at 4:53 PM on April 26, 2016 [1 favorite]


In Five Years We'll All Be Unicorns: No, It's Not a Metaphor, We'll Literally Be Unicorns


I think you mean Singularicorns
posted by museum of fire ants at 4:59 PM on April 26, 2016 [1 favorite]


PREDICTION: Displays will be built into our eyeglasses and contact lenses and images projected directly onto our retinas.
ACCURACY: Essentially correct
DISCUSSION: The wording of the prediction here is unclear as to whether it implies that this technology merely exists or is common or ubiquitous.


YES, IN THIS ONE INSTANCE, YOUR WORDING WAS UNCLEAR
posted by atoxyl at 5:03 PM on April 26, 2016 [1 favorite]


Prediction: computers are here to stay and will only grow in popularity.
posted by museum of fire ants at 5:04 PM on April 26, 2016 [2 favorites]


DISCUSSION: There are many software packages that will accompany you with rhythm tracks that adjust to your playing, walking bass lines, and other accompaniments


LOL, "Band-in-a-Box" has been on the market since the early 90's, doing exactly this, generating canned accompaniment in various styles for music students.
posted by thelonius at 6:17 PM on April 26, 2016


Let’s say I’m walking along and I see my boss at Google, Larry Page, approaching. I have three seconds to come up with something clever to say, and the 300 million modules in my neocortex won’t cut it. I need a billion modules for two seconds. I’ll be able to access that in the cloud just as we can access additional computation in the cloud for our mobile phones, and I’ll be able to say exactly the right thing.
I know how this ends…

"So I interface with the cloud, calling upon the collective brainpower of all humankind. Intelligent agents traverse the web, sweeping across everything from the World's Great Minds, the rich and famous (and the rich or famous), down to the most everyman of everymen. Everything - from imperfectly-recorded pre-history (as the days before the copying and uploading of intelligences were known), to the whole population present and past dating back to that day the little Sublimation of The Singularity became available to all regardless of wealth, status, or race; the day we finally became a true pan-global Humanity - is, for a single fleeting moment, completely open to them.

Within the eternity of a nanosecond, they sweep the whole corpus of human knowledge and experience, and, discarding almost every notable greeting, quip, salutation and tribute ever uttered, return direct to my mind the most perfect words our collective consciousness can recall or devise for this singular occasion.

And with 2.999999999 seconds remaining I have time to compose myself, turn slightly, smile, and, as secure as only a man who has not simply foreseen this future but striven to achieve it can be, I look Larry straight in the eye and say …




FUCK YOU, CLOWN!"

posted by Pinback at 7:21 PM on April 26, 2016 [10 favorites]


PREDICTION: Human musicians routinely jam with cybernetic musicians.
ACCURACY: Correct

DISCUSSION: There are many software packages that will accompany you with rhythm tracks that adjust to your playing, walking bass lines, and other accompaniments. (Left) Apple’s Magic GarageBand Jam (Image courtesy of Apple Inc.) Such “auto accompaniment” software is also built into home digital keyboards. Games such as Guitar Hero involve computer-generated music tracks in real time. Apple’s GarageBand software includes Magic GarageBand Jam, which lets you jam with a full-screen band.


If playing along to canned music subroutines married to tone and key sensing algorithms is "jamming with a cybernetic musician," then exchanging tweets with a Markov bot is having a conversation with an Artificial Intelligence.
posted by Existential Dread at 8:20 PM on April 26, 2016 [1 favorite]


PREDICTION: Human musicians routinely jam with cybernetic musicians.

I'm willing to give this one a mild "correct" given that I'm literally going to a concert next week where human musicians will be playing alongside a hologram of a fully computer generated singer (Hatsune Miku).

I suppose it depends on your definition of "jam", as to be fair Miku will not likely be improvising anything.
posted by thefoxgod at 10:05 PM on April 26, 2016


BuddhaInABucket: "No, it's just been moved back to 2045. ~30 years from now, as ever."

Actually from most estimates I've seen it's somewhere around 2060.

Which brings a big hearty ELL OH FUCKING ELL
posted by symbioid at 10:12 PM on April 26, 2016


I'm willing to give this one a mild "correct" given that I'm literally going to a concert next week where human musicians will be playing alongside a hologram of a fully computer generated singer (Hatsune Miku).

I suppose it depends on your definition of "jam", as to be fair Miku will not likely be improvising anything.


I think he had another prediction which covered that more directly, which is a fair enough hit - note that speech synthesis is an area he's actually worked in. I don't count that as "jamming with cyborg musicians" at all, though it's fairly impressive.
posted by atoxyl at 10:45 PM on April 26, 2016 [1 favorite]


Actually from most estimates I've seen it's somewhere around 2060.

I miss the guy who back in October of 2000 predicted that the self-improving algorithms would kick off the omnipotent AI singularity...within six months. Ah Usenet, how I miss thee...
posted by happyroach at 10:48 PM on April 26, 2016


If my employers allowed me to validate my own work I would be the programmer who always ships perfect bug-free code. According to me.
posted by ardgedee at 4:21 AM on April 27, 2016

