Embodied Cognition
January 19, 2015 6:24 PM

The Deep Mind of Demis Hassabis - "The big thing is what we call transfer learning. You've mastered one domain of things, how do you abstract that into something that's almost like a library of knowledge that you can now usefully apply in a new domain? That's the key to general knowledge. At the moment, we are good at processing perceptual information and then picking an action based on that. But when it goes to the next level, the concept level, nobody has been able to do that." (previously: 1,2) posted by kliuless (9 comments total) 42 users marked this as a favorite
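A concrete gloss on the pull quote, for the curious: one simple form of the "library of knowledge" idea is to learn a shared representation once and then reuse it across tasks, retraining only a small task-specific head. The NumPy sketch below illustrates that feature-reuse flavor of transfer learning; the toy tasks, layer sizes, and training details are all invented for illustration and are not DeepMind's actual method.

    import numpy as np

    rng = np.random.default_rng(0)

    def relu(x):
        return np.maximum(0.0, x)

    def fit_head(H, y, steps=2000, lr=0.1):
        # Gradient descent on a linear head w, minimizing mean squared error.
        w = np.zeros(H.shape[1])
        for _ in range(steps):
            w -= lr * H.T @ (H @ w - y) / len(y)
        return w

    d, hidden, n = 8, 32, 400
    X = rng.normal(size=(n, d))
    W1 = rng.normal(size=(d, hidden)) / np.sqrt(d)  # the shared "library" of features
    H = relu(X @ W1)                                # frozen representation

    y_a = X @ rng.normal(size=d)  # task A: one target function of the inputs
    y_b = X @ rng.normal(size=d)  # task B: a different target, same inputs

    w_a = fit_head(H, y_a)  # learn task A on the shared representation
    w_b = fit_head(H, y_b)  # "transfer": reuse the features, train only a new head

    print("task A train MSE:", np.mean((H @ w_a - y_a) ** 2))
    print("task B train MSE:", np.mean((H @ w_b - y_b) ** 2))

The point of the toy is only the mechanics: the expensive part (the representation) is built once, and adapting to a new task costs only a cheap linear fit. What nobody knows how to do, per the quote, is make that reuse work at the level of concepts rather than features.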
 
It's true that some of the hardest AI problems involve transfer, analogy, domain-crossing, relevance, fluid intelligence -- all of the stuff that tiny modular task-specific systems fail at. But there's not even a hint of how to solve that problem here, just gestures towards neural reinforcement (ancient news in both neural net theory and neuroscience) and the hope that maybe something will emerge if we make the system large-scale.

Much worse, though: "One constraint we do have... is that no technology coming out of Deep Mind will be used for military or intelligence purposes." Yeah, right. I'm not reassured by the idea that the most powerful, dangerous computational technology on the planet will be safely kept in the hands of our benevolent corporate overlords. Nor do I think it will be: real AI would immediately be stolen and copied by the state. That genie can't be kept in the bottle, even if it's carefully labeled "Google©".
posted by informavore at 6:45 PM on January 19, 2015 [4 favorites]
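For reference, the "neural reinforcement" being called ancient news bottoms out in the temporal-difference update from tabular Q-learning, which dates to the late 1980s. Here is a minimal sketch on a made-up five-state chain world with epsilon-greedy exploration; the environment and the hyperparameters are invented purely for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    # Made-up chain world: states 0..4, actions 0=left / 1=right.
    # Stepping into state 4 pays reward 1 and ends the episode.
    n_states, n_actions = 5, 2
    Q = np.zeros((n_states, n_actions))
    alpha, gamma, eps = 0.1, 0.9, 0.1

    def step(s, a):
        s2 = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
        done = (s2 == n_states - 1)
        return s2, (1.0 if done else 0.0), done

    def greedy(q_row):
        # argmax with random tie-breaking (matters early on, when Q is all zeros)
        return int(rng.choice(np.flatnonzero(q_row == q_row.max())))

    for _ in range(2000):
        s, done = 0, False
        for _ in range(100):  # cap episode length
            a = rng.integers(n_actions) if rng.random() < eps else greedy(Q[s])
            s2, r, done = step(s, a)
            # the temporal-difference update at the heart of Q-learning
            target = r if done else r + gamma * Q[s2].max()
            Q[s, a] += alpha * (target - Q[s, a])
            s = s2
            if done:
                break

    print(np.argmax(Q[:-1], axis=1))  # policy for states 0..3: should be all 1s (right)

DeepMind's Atari results amounted to swapping the table for a deep network approximating Q; the update rule itself is the old part.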


I've been told that the Wright brothers didn't want the airplane used for war. Look how that worked out.
posted by Sir Rinse at 7:18 PM on January 19, 2015 [3 favorites]


"One constraint we do have— that wasn’t part of a committee but part of the acquisition terms—is that no technology coming out of Deep Mind will be used for military or intelligence purposes."
I would be fascinated to know how that's enforceable.
posted by amtho at 7:24 PM on January 19, 2015


Much worse, though: "One constraint we do have... is that no technology coming out of Deep Mind will be used for military or intelligence purposes."

What's disturbing is the observation that the people working on, and even leading, these things would view it that way.
posted by polymodus at 7:27 PM on January 19, 2015


informavore: real AI would immediately be stolen and copied by the state itself.
posted by nfalkner at 7:32 PM on January 19, 2015 [2 favorites]


I was struggling to give the clip below a clear framing - the way this discussion is going, for some reason it immediately jumped to mind - but I think I'll just let it stand on its own.

There must have been a moment, at the beginning, where we could have said "No." But somehow we missed it.
posted by chambers at 8:37 PM on January 19, 2015 [2 favorites]


I've been told that the Wright brothers didn't want the airplane used for war. Look how that worked out.

Google said their prime directive is "Don't be Evil." Look how that worked out.

I want to make this guy read Weizenbaum's book Computer Power and Human Reason. I know he hasn't read it, because he's making the same errors of judgment that Weizenbaum criticized in 1976.
posted by charlie don't surf at 10:28 PM on January 19, 2015 [2 favorites]


Cicada 3301
posted by Oyéah at 12:02 AM on January 20, 2015


I would still rather Google supplied my military-industrial complex needs than the current bundle of unaccountable agencies. And if I don't like Google then I can just switch my allegiance to Bing or choose Yahoo to defend my freedoms. Open markets in exceptionalism.
posted by vicx at 6:58 AM on January 20, 2015 [1 favorite]




This thread has been archived and is closed to new comments