Et tu, C3PO?
July 21, 2004 7:10 AM   Subscribe

The news this morning included a small blurb about today being the anniversary of the first human killed by a robot. No doubt this is considered noteworthy because of the release of I, Robot, but it appears to be incorrect. The first incident I can find was on January 25, 1979. Since then OSHA has recorded at least 10 more deaths in US factories alone. Japan saw its first in 1981, and as a result its Ministry of Labor requested a 20% budget increase for its robot-related activities. All these incidents can be classified as accidents, but it does make me wonder how dangerous robots will be when AI advances further. Should we mandate the Three Laws of Robotics?
posted by jwells (22 comments total)
 
Should we mandate the Three Laws of Robotics?

We should all just get Old Glory Insurance.
posted by Mayor Curley at 7:13 AM on July 21, 2004


It's a nice idea, but it's probably impossible. Can anyone define "human being", or "harm" in a way that can be programmed?
posted by aeschenkarnos at 7:23 AM on July 21, 2004
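For what it's worth, aeschenkarnos's objection is easy to make concrete. Below is a minimal, purely illustrative Python sketch of what a First Law check would have to look like; `is_human`, `would_harm`, and `first_law_permits` are hypothetical names invented for this sketch, and the point is precisely that nobody knows how to fill in the two predicates, so the check can only raise NotImplementedError.

```python
# Illustrative skeleton of the First Law. The control flow is trivial;
# the hard part is that is_human() and would_harm() cannot actually be
# specified precisely enough to implement.

def is_human(entity) -> bool:
    """Decide whether `entity` is a human being."""
    raise NotImplementedError("no programmable definition of 'human being'")

def would_harm(action, entity) -> bool:
    """Decide whether `action` injures `entity`, or allows injury through inaction."""
    raise NotImplementedError("no programmable definition of 'harm'")

def first_law_permits(action, nearby_entities) -> bool:
    """A robot may not injure a human being or, through inaction,
    allow a human being to come to harm."""
    return not any(is_human(e) and would_harm(action, e) for e in nearby_entities)
```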


Nice post.

Meanwhile, it looks like the movie is crap and an insult to Asimov and all he stood for. Even if some people just don't get it.

Violence is the last refuge of the incompetent.—Hari Seldon (Foundation, Isaac Asimov)

Corollary: Hollywood is full of incompetents.

posted by rushmc at 7:40 AM on July 21, 2004


the anniversary of the first human killed by a robot...
Robots don't kill people. Guns kill people.
posted by seanyboy at 7:50 AM on July 21, 2004


There's a point to that joke somewhere. We don't need the three laws until a robot is capable of making the kill/don't kill decision. The fact that somebody died whilst in the way of programmed behaviour has nothing to do with this decision.

As soon as a machine can be designed to detect that it's doing harm to other people, it can be programmed to shut down or reverse when it detects that harm. There's no need for any fancy shmancy "laws".
posted by seanyboy at 7:58 AM on July 21, 2004
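The interlock seanyboy describes is essentially an emergency stop rather than a moral framework. Here's a rough Python sketch under that reading; `read_distance_m`, `emergency_stop`, and the 0.5 m clearance are made-up stand-ins for whatever safety I/O and risk figures a real controller would actually use.

```python
import time
from typing import Callable

SAFE_DISTANCE_M = 0.5  # assumed clearance; a real figure comes from a risk assessment

def safety_loop(read_distance_m: Callable[[], float],
                emergency_stop: Callable[[], None]) -> None:
    """If a person gets too close, stop the machine. No 'laws' involved."""
    while True:
        if read_distance_m() < SAFE_DISTANCE_M:
            emergency_stop()  # cut actuator power and latch until manually reset
            return
        time.sleep(0.01)  # poll the sensor at roughly 100 Hz
```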


Violence is the last refuge of the incompetent. Hari Seldon Salvor Hardin (Foundation, Isaac Asimov)

Also on K5, where Cory Doctorow's article is discussed.
posted by MzB at 8:01 AM on July 21, 2004


Working in a factory can be dangerous work. A robot is just one more dangerous piece of machinery. The most dangerous job I ever worked was while paying my way through university, at a Ford Motor Company research plant. They made a deal with the union: stay out of the plant and we won't move production of some parts to Mexico. The union stayed out, and most safety measures were cast aside in favour of getting things done.

So one of my jobs was to sweep dross off of molten aluminum while perched on a narrow I-beam about 4 feet above this vat of molten aluminum. My safety equipment was jeans and a t-shirt (admittedly somebody did loop their finger through a belt loop the first time I did it just in case I wasn't up to the task, I suppose that if I weren't up to the task they'd at least be able to say "We tried to save him but his belt loop broke. Damn you Levi Strauss, Damn you to Hell!").

That was a dangerous job. The most dangerous aspect was one particular very senior person at this plant who believed in speed much more than safety. He nearly gored my head with a forklift before our Christmas party, and I had numerous other close calls. The final straw was when he actually ran over my foot with a forklift I was unloading because he needed it for a "quick job". Fortunately I had very expensive workboots on or my foot would've been paste. He was made safety officer less than an hour after this :P I quit.

Anyway, maybe people have died to robots, but if you look at any given industry that involves manual labour you're going to find death and maiming. Go process fish or meat for a few months. Go work in a press bay for a few months. Go work in a clothing factory for a while. Now pretend that whatever machinery is involved in the injury is somehow special: that it's new, novel, or a sign of the future. That piece of machinery will stick out, because you'll be able to say "In the past 10 years a dozen people have been maimed or killed by the flamsteelingator. Is it time to consider laws against this machinery or increased scrutiny of its operation?" while ignoring the statistics associated with other pieces of machinery.
posted by substrate at 8:14 AM on July 21, 2004


Actually, I saw the previews and told myself I wasn't going to see the movie. Then, I checked Rottentomatoes.com and was shocked to see that it had a 66% rating. That piqued my curiosity just enough and it turns out the movie wasn't as bad as I thought it would be. In some ways, I was entranced.

The movie started off absolutely terrible with the most gross display of embedded advertising I've EVER seen. I was almost ready to walk out of the theater, in fact. I think they got it out of the way early on, though, because after that I didn't notice any more grossly overt marketing (just the standard fare).

Anyhow, the movie got much better as it went on, except for some dorky slo-mo action sequences at the end. I thought the most interesting part was the robot Sonny and the issues about consciousness and what it means to be alive. It was quite thought-provoking toward the end, and I think it will be effective in making much of the population think about an issue that has been fairly sci-fi until now but is slowly becoming a real-world issue.
posted by PigAlien at 8:43 AM on July 21, 2004


Damn, aeschenkarnos got there first.
posted by troutfishing at 8:45 AM on July 21, 2004


NED LUDD WAS RIGHT!
The machine IS the enemy.
Smash it without mercy!
Don't tell me technology is neutral. Every day I wander this city, and every day machines flash lights trying to tell me what to do. Huge tarmac pathways cross my way, upon which gigantic, speeding metal machines move, machines capable of killing me if I cross their path and already slowly suffocating me with their toxic fumes which fill the air.
WHY SHOULD I TOLERATE THIS INSANITY?
NED LUDD WAS RIGHT!
posted by Shane at 8:53 AM on July 21, 2004


Robots would be better programmed to Isaac Hayes' Three Laws. Can you dig it?
posted by brownpau at 9:19 AM on July 21, 2004


this is crap. define 'robot'. workers have been getting mangled in machinery since before the cotton gin. soldiers were killed by defective catapults in the dark ages. what's wrong with you people? how can you possibly believe this 'anniversary' thing? you let hollywood say what history is? no wonder the bushies are finding it so easy to walk you down the path of ignorance!
posted by quonsar at 10:25 AM on July 21, 2004


Galley Slave, an early Asimov short story set within the context of the Three Laws of Robotics, had a robot harm a person for the greater good of a corporation.

Not to mention that Asimov actually developed a fourth law, the 'zeroth' law of robotics, which allows a robot to kill in the interest of 'the good of humanity'. What level of programming will define subjective concepts like good and humanity?
posted by page404 at 11:01 AM on July 21, 2004


Please go stand by the stairs
So I can protect you

posted by Kwantsar at 12:46 PM on July 21, 2004


What quonsar said. Modern industrial robots are only incremental improvements on designs that existed hundreds of years ago. Just instead of using cumbersome cams and followers, we're using flexible electronics.

The problem with I, Robot is that it is about as faithful to the book as Starship Troopers was.
posted by Mitheral at 2:36 PM on July 21, 2004


I don't care about the three laws -- when can I get my personal army of Alan Tudyks?
posted by Katemonkey at 2:40 PM on July 21, 2004


Hari Seldon Salvor Hardin

Is it? Been 20 years since I've read it and I don't have access to my copy atm to check, but if you say so... :)
posted by rushmc at 3:23 PM on July 21, 2004


I think the whole point of the three laws of robotics, as Asimov wrote them, was to explore their deeper meaning. What is the definition of good, bad, human and obey? Where do you draw the line between individual humans and humanity? These issues were explored in the books, for those who haven't read them. And, they were explored in the movie.
posted by PigAlien at 4:24 PM on July 21, 2004


I have to say this caught my eye:

Foundation, when it was published in Arabic in 1952, was translated as Al-Qaida.
From the Slate review rushmc linked above.
posted by inpHilltr8r at 4:59 PM on July 21, 2004


And, they were explored in the movie.

Also, running around blowing off robots' heads was explored. At length, from what I understand.
posted by rushmc at 7:13 PM on July 21, 2004


Got an email from a nonmember:

"Galley Slave, an early Asimov short story within the context of the Three Laws of Robotics had a robot harm a person for the better good of a corporation."

This is not true. The robot in "Galley Slave" was preparing to lie to protect the person who was accusing it (the robot) of changing the text of a book.
posted by jwells at 6:37 PM on July 22, 2004




This thread has been archived and is closed to new comments