

We're one step closer to William Gibson's vision
June 11, 2000 9:33 AM   Subscribe

We're one step closer to William Gibson's vision as reported in today's NY Times Magazine article, "The Mind that Moves Objects."
posted by grumblebee (14 comments total)

 
Woah. From locked in to jacked in.

We may never see William Gibson's vision in total. I question whether people will install such implants to make a fashion statement or just to make it easier to operate a computer. As the doctor in the piece said, "I can operate a computer fine with my hands." However, human beings are pretty weird. Plastic surgery to 'correct' nature is commonplace today. We have laser surgery that corrects nearsightedness and other things which glasses or contacts are more than capable of handling. Superfluous and unnecessary, but easily tolerated behavior.

There was a time when the general consensus of humanity was, "if God meant for us to fly he'd a given us wings." And that mindset is laughable and archaic by today's standards.

That was less than a century ago, before horseless carriages became commonplace, and the works of Jules Verne still seemed outrageous and fanciful (now they just seem outdated). Rockets weren't even a gleam in Bradbury's eye, and Clarke's idea for a communications satellite had not yet been written, much less patented.

I often wonder what our children will do to shock us. Mom and dad used to complain I spent too much time in front of the tv when I should be outside getting sun. Maybe we'll be asking our children why they spend so much time inside the computer when there's a perfectly good sitcom playing.
posted by ZachsMind at 10:07 AM on June 11, 2000


oops. either GrumbleBee or myself forgot to close a tag. There we go..
posted by ZachsMind at 10:08 AM on June 11, 2000


Well that didn't work either. Sorry for the repeats. Hopefully this'll do it.
posted by ZachsMind at 10:09 AM on June 11, 2000


Should I close it now? Did that work?
posted by dogwelder at 10:10 AM on June 11, 2000


Of course, that doesn't help on the main page, but you can't always get what you want. Maybe the MeFi code should scan incoming messages and add missing closing tags to posts.
posted by dogwelder at 10:12 AM on June 11, 2000
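[The tag-scanning idea above is easy to sketch. Here's a minimal, hypothetical illustration — not MetaFilter's actual code — that tracks opened tags with a stack and appends any closers a post forgot. The regex and the void-tag list are assumptions for the sketch.]

```python
import re

# Tags that never take a closing tag, so they should not go on the stack.
VOID_TAGS = {"br", "img", "hr", "input", "meta", "link"}

def close_open_tags(post_html: str) -> str:
    """Append closing tags for any elements a post opened but never closed."""
    stack = []
    for match in re.finditer(r"</?([a-zA-Z][a-zA-Z0-9]*)[^>]*>", post_html):
        tag = match.group(1).lower()
        if tag in VOID_TAGS:
            continue
        if match.group(0).startswith("</"):
            # Pop up to and including the matching open tag, if there is one.
            if tag in stack:
                while stack and stack.pop() != tag:
                    pass
        else:
            stack.append(tag)
    # Close whatever is still open, innermost first.
    return post_html + "".join(f"</{t}>" for t in reversed(stack))
```

So a post like `<i>oops` would come out as `<i>oops</i>`, and a well-formed post passes through unchanged. A real filter would need to handle malformed markup more defensively, but the stack-based idea is the core of it.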


And make your morning coffee.
posted by EngineBeak at 12:01 PM on June 11, 2000


Oh, especially the coffee.

/me sips of the holy beverage of wakefulness

-Mars
posted by Mars Saxman at 12:19 PM on June 11, 2000


My bad, I think. Sorry!
posted by grumblebee at 1:04 PM on June 11, 2000


there was something like this a while back where a guy got a remote control implanted in his brain...of course it wasn't actually a remote control, and by that I mean it wasn't the object itself, but it acted like one: a coil that let him change channels...I need a drink
posted by starduck at 1:30 PM on June 11, 2000


This doesn't do much to get us closer to Gibson's vision as far as I can tell. Similar devices (albeit outside the brain) have worked in a similar manner for years (hooking pointing devices up to EEGs, etc.), but being able to hook up to the whole cortex and relay sensations, add "language modules" and so on is still about as sci-fi as one can get. The difference isn't just one of degree; it is a difference of kind: in one case you are intercepting or adding a detector to a biological activity (you could do the same thing to the arm of an able-bodied person rather than an axon bundle and get the same effect), and in the other case you are integrating mental experience into a living consciousness.
posted by sylloge at 3:37 PM on June 11, 2000


Good point, sylloge, although I would still argue that this does bring us ONE step closer to a world in which there is a completely fluid relationship between human mind data and computer data. Isn't that a large part of Gibson's vision?

I'm also not sure that "living consciousness" is fundamentally different from anything else that goes on in the brain. This, of course, is The Big Debate that currently runs through the neuroscience community.
posted by grumblebee at 5:37 PM on June 11, 2000


One of my main motivations for getting rich is to able to afford the neural interface when it gets here. I'm guessing I have 20 years to get rich.
posted by y6y6y6 at 6:26 PM on June 11, 2000


I don't know -- I've spent almost 10 years thinking about it with varying degrees of devotion, but the question of whether consciousness goes on in the brain is still too hard for me. (Lately, I'm feeling that it goes on all over the place, a la the 21-year-old lecture by Stephen Toulmin which I just posted on my site.)

It's clear that it is fundamentally different from any particular activity going on in any particular part of the brain at any particular time. "Consciousness" is not just one thing among many going on in the brain.

Visual processing is (relatively speaking) very well understood. You can trace the action as it passes from the rods and cones to the H cells, etc., down the optic nerve, to the chiasm, magnocellular and parvocellular still split at the thalamus, to the primary visual cortex, on to the striate cortex, all seven levels . . .

And you can come to an understanding of what individual simple, complex and hyper-complex cells are responding to, you can even map arrays of cells responding to different sorts of movement, or edges at different angles, and so on. And from there, the "signals" diffuse and carry on into the other lobes, but you can't say "this is where it becomes consciousness" or "this is what the subject is seeing" (pointing to a set of cells).

Even if you've read a couple of textbooks, neuroanatomy on the microscopic level is far too complex to grok, and since, at that level, it might as well be random (differing from individual to individual), we are never going to be in a position to develop a computer-brain interface which -- even if we could model the neural activity with sufficient accuracy -- would be able to relay that activity/information to the brain.

The classic analogy is this: aliens get a bunch of computers and have no idea how they work. They shoot each one in a different place with a laser and try to deduce the function of the parts from what happens when they do. Pretty hopeless, in other words.

Course, you never know . . .
posted by sylloge at 6:29 PM on June 11, 2000


there's a world of difference between being able to operate a primitive digital interface and being able to absorb data from a digital source. I recall watching an episode of Nova on tv a long, long time ago, and they had wired a toy truck to an amputee's remaining primary musculo-skeletal nerve that had been responsible for gross movement, and the fella was able to drive the truck without much difficulty at all. However, the only method he had for avoiding obstacles was watching the truck with his eyes.
I'd love to be able to go into work, plug in, and just think my way to all the different hosts on my networks without having to {gasp! the labor involved!} open up terminals and actually ssh into them. But until they can get the data from the remote host back into me, I might as well do it by hand, since I need to stare at the screen anyway.
What would be really cool is if they could generate algorithms to represent the data [taking us back to Gibson again] and feed the signal either straight into the visual cortex [ideally] or into some kind of imaging apparatus to show the multidimensional, dynamic nature of a relational database, or a network, or any complex system. It would be really cool to be able to work with biochemistry, particle physics, etc. in real time. Eventually, I'm sure.
posted by katchomko at 12:38 AM on June 12, 2000




This thread has been archived and is closed to new comments