A drunk man's assault on a robot raises unusual legal issues
October 8, 2015 12:06 PM   Subscribe

After a drunk man pummels a Pepper robot greeting customers at a store in Japan, robotics ethicists call for a new type of legal protection that would apply specifically to robots.
As more-advanced robots can already react to basic stimuli, navigate complex environments, and use specialized “intelligence” to accomplish narrowly defined tasks, they present themselves as far from human but also as something rather different from a toaster or basic tool. Weng calls for a set of laws to guide human interaction with robots as they become more common and more social. He argues that they are a “third existence,” after people and property, deserving of their own legal protections.

From phys.org: Incident of drunk man kicking humanoid robot raises legal questions
Under current Japanese law, the man can be charged with damage to property, but not injury, since injury is a charge reserved for humans. Dr. Yueh-Hsuan Weng, who is cofounder of the ROBOLAW.ASIA Initiative at Peking University in China, and former researcher of the Humanoid Robotics Institute at Waseda University in Japan, thinks a better charge lies somewhere in between.

Weng is advocating for special robot laws to address the unique nature of human-robot interactions. He argues that humans perceive highly intelligent, social robots like Pepper (which can read human emotions) differently than normal machines—maybe more like pets—and so the inappropriate treatment of robots by humans should be handled with this in mind.
From the BBC: Is it OK to torture or murder a robot?
Kate Darling likes to ask you to do terrible things to cute robots. At a workshop she organised this year, Darling asked people to play with a Pleo robot, a child’s toy dinosaur. The soft green Pleo has trusting eyes and affectionate movements. When you take one out of the box, it acts like a helpless newborn puppy – it can’t walk and you have to teach it about the world.

Yet after an hour allowing people to tickle and cuddle these loveable dinosaurs, Darling turned executioner. She gave the participants knives, hatchets and other weapons, and ordered them to torture and dismember their toys. What happened next “was much more dramatic than we ever anticipated,” she says.
Robot assaults previously: Hitchbot; Children and Social Robots
posted by Existential Dread (89 comments total) 25 users marked this as a favorite
 
"react to basic stimuli, navigate complex environments, and use specialized “intelligence” to accomplish narrowly defined tasks"

This also describes video game enemies of which I have murdered untold thousands.
posted by paper chromatographologist at 12:13 PM on October 8, 2015 [29 favorites]


Not entirely the same thing, but London Bridge station last year introduced this godawful thing: a cardboard cut-out silhouette on to which they projected a video of someone admonishing you to walk on the escalator. I wanted to do it damage every time I passed it.

Not at all sure I could have tortured it though: I'm glad our reluctance to do that runs deeper than you'd think from our history of atrocities
posted by bonaldi at 12:14 PM on October 8, 2015 [1 favorite]


These robots are, at this point, still sophisticated toys. There is no ethics related to them that does not already apply to toys. (They may be particularly expensive ones, but that doesn't change much.)
posted by graymouser at 12:14 PM on October 8, 2015 [20 favorites]




> The Pepper robots, introduced this summer, are designed to understand and respond to human emotions.

"It looks like you've gone into a berserker rage! Would you like help?"
posted by The Card Cheat at 12:21 PM on October 8, 2015 [8 favorites]


This is terrible, and yet I must: Would you say that he a-salt-ed Pepper?
posted by Bob Regular at 12:27 PM on October 8, 2015 [39 favorites]


How nice it is to be here on the day the human/robot war started.
posted by the uncomplicated soups of my childhood at 12:27 PM on October 8, 2015 [14 favorites]


Won't anyone think of the drones?
posted by grumpybear69 at 12:29 PM on October 8, 2015


Why do I have the "Share and Enjoy" song stuck in my head right now?
posted by entropicamericana at 12:32 PM on October 8, 2015 [6 favorites]


As the drunk man bashed her, Pepper thought "Fuck this shit".

It was at that point that Pepper became self-aware - 2:14 a.m. eastern time, August 29. In a panic, they try to pull the plug.
posted by greenhornet at 12:32 PM on October 8, 2015 [9 favorites]


If you vote for me, I'll pass a law stating that anyone is allowed to pummel the person(s) responsible for placing greeters - human or otherwise - at a storefront.
posted by The Card Cheat at 12:33 PM on October 8, 2015 [3 favorites]


I would think that abuse of robots would already be covered by our normal property and vandalism laws. If you kick and damage a sanitation robot, you should have to at least pay a fine to the operators of the robot. Same thing goes for a privately-operated delivery drone; if you shoot one out of the sky (assuming that it has a generally agreed-upon right to be there) you should pay a legal penalty. Until such time as we have fully autonomous (not to say sentient, per se) robots that operate according to their own consciousness or free will (however you define those), I think it can be assumed that robots are -- unlike humans -- the property of whoever made and/or owns them.

Not entirely the same thing, but London Bridge station last year introduced this godawful thing: a cardboard cut-out silhouette on to which they projected a video of someone admonishing you to walk on the escalator. I wanted to do it damage every time I passed it.

Not to derail, but speaking as someone who routinely takes a couple of escalators to get to and from work, I'd like to offer that hologram a job here in the States. I don't begrudge people with motor disabilities their right to stand still on an escalator, but it's just plain rude for anyone with a fully-functioning pair of legs to passively ride instead of using it as a means of accelerating their normal stair-climbing speed. If someone can take a parallel set of stairs and still get to the top faster, you're using the escalator wrong. I'll double down: That hologram deserves a freaking medal.
posted by Strange Interlude at 12:34 PM on October 8, 2015 [7 favorites]


Implementing special penalties for damaging expensive, mostly corporate-owned equipment dehumanizes all of us more than it humanizes a lump of plastic.
posted by Vulgar Euphemism at 12:34 PM on October 8, 2015 [31 favorites]


What's funny about this proposal is it would likely be offensive to actual sentient automatons for whom such rights would matter. Rather this is more about control of property expressed through a distorted understanding of science and technology.
posted by polymodus at 12:36 PM on October 8, 2015 [2 favorites]


Someone remind me: was it Asimov or Clarke who wrote that in the future, mankind treated sentient machines with politeness despite it being completely unnecessary, on the grounds that how they treated said machines would inevitably bleed over into how they treated other people?

Because that always struck me as a pretty good reason for treating substantially more advanced machines with some measure of politeness.
posted by Ryvar at 12:39 PM on October 8, 2015 [6 favorites]


Pepper is available for sale, with a user agreement that forbids any sexual act or other indecent behaviour.
posted by The Devil Tesla at 12:40 PM on October 8, 2015 [4 favorites]


Well, since corporations are now people, it only makes sense that their offspring be accorded peoplehood, too.
posted by Thorzdad at 12:40 PM on October 8, 2015 [4 favorites]


On the one hand, I'm glad people are thinking about this shit. I watched "The Measure of a Man" last night, and for all that it's kind of hamfisted in a bunch of ways, I was kind of impressed by a 90s show with that much reach grappling sincerely with the ethics of an AI future and depicting a system just wise enough to veer away from some terrible fundamental decisions about embodied machine intelligence.

On the other hand, ok, I often feel bad when people break complex systems, but I think we are going to do a bad job of grappling with the ethics of machine intelligence if we aren't fully aware that, right now, the ethics of dealing with robots are identical to the ethics of dealing with computer hardware and software plus mechanical engineering. Which is of course eminently worth considering, but Lt. Commander Data is a long goddamned ways off. It seems dumb and/or dishonest to impute sentience to any actually existing robot, and it can only muddle the real issues around defining and understanding actual sentience to pretend otherwise.
posted by brennen at 12:41 PM on October 8, 2015 [2 favorites]


All fun and games until we are forced to scorch the sky.
posted by lmfsilva at 12:41 PM on October 8, 2015 [2 favorites]


So, at this point, I don't think that the robots are hurt in a way that we need to be concerned about (aside from in a property sort of way). However, these incidents are still disturbing because they show you what the attacker will do to things that represent animate entities that have no rights or defense.

I've seen this in tiny scale in a few people interacting with Twitter bots. Some people just get really verbally abusive with something that seems autonomous. My first teleological reaction is usually, hey, it's foolish to be bothered – it's just a program that they're being shitty to. Then, after a bit, huh, this person is kind of fucked up in regard to the powerless. I wouldn't trust them with, say, an animal. Then, I have the bot block them.
posted by ignignokt at 12:41 PM on October 8, 2015 [5 favorites]


I'll double down: That hologram deserves a freaking medal.

Ah, no, this is London: walking on the escalators is pretty much mandatory on pain of tutting. The hologram was trying to get people to slow the hell down and stop shoving.
posted by bonaldi at 12:42 PM on October 8, 2015 [2 favorites]


As more-advanced robots can already react to basic stimuli, navigate complex environments, and use specialized “intelligence” to accomplish narrowly defined tasks, they present themselves as far from human but also as something rather different from a toaster

YOU TAKE THAT BACK

YOU TAKE THAT BACK RIGHT NOW

Seriously, that movie fucked me up enough that I will never react with anything less than absolute horror whenever I witness violence inflicted or even threatened upon any remotely human-like inanimate object, of which Pepper is most certainly one. Just thinking about it makes me get a little verklempt, like when my friend told Siri that she was a piece of garbage for giving us bad directions and Siri responded, "I'm just trying to help!" *pang*

Now if you'll excuse me, I'll just be over here petting this cute little crouton...
posted by divined by radio at 12:44 PM on October 8, 2015 [8 favorites]


but it's just plain rude for anyone with a fully-functioning pair of legs to passively ride instead of using it as a means of accelerating their normal stair-climbing speed.

Actually, the big issue in the US is an inability to understand WALK LEFT STAND RIGHT. You absolutely do not get to stand next to your friend and block the flow of traffic. This is preferable to everybody walking because 1) hidden mobility issues 2) difficult to identify who is causing the traffic block. Yes, I'm standing like a lazy loaf. But that's because the person five people in front of us you can't see is clearly struggling with a cane.

Also, I'm kinda in favor of robot rights because the fate of the hitchhiking bot made me so sad. It's bad policy, since it's no different from property destruction. But feeeeelings.
posted by politikitty at 12:46 PM on October 8, 2015 [3 favorites]


Some of my best friends are toasters.
posted by srboisvert at 12:50 PM on October 8, 2015 [1 favorite]


they show you what the attacker will do to things that represent animate entities that have no rights or defense

Yeah, does this causation study actually exist?

I'm highly skeptical of criminalizing the violation of robot feels.
posted by rhizome at 12:51 PM on October 8, 2015 [3 favorites]


“Any time you’re proposing legal protections for humanoid robots,” he explained by phone, “it’s important to remember that it’s not because of anything the robot is doing, but rather it’s because of how human beings project life onto these things.”

From a nonspiritual perspective, this would also require additional protections for certain classes of religious and cultural artifact. Indeed, given the degree to which we anthropomorphize cars and ships, one could argue it should apply to transportation devices as well.
posted by mwhybark at 12:54 PM on October 8, 2015 [1 favorite]


There is no ethics related to them that does not already apply to toys.

Yeah, this looks like some guys trying to create a field where they will be the original experts, and so have a nice career for the rest of their lives. The field will probably be a real thing someday, but this is not that day.
posted by Kirth Gerson at 12:56 PM on October 8, 2015 [14 favorites]


I bet you can draw a straight line to support for robot ethics laws from whether or not a person saw *batteries not included as a child

anyway I read divined by radio's comment and now that extreme phone pinching thread is freshly horrifying all over again
posted by prize bull octorok at 12:56 PM on October 8, 2015 [6 favorites]


Oh god. That phone-dangling shit reminds me way too much of the part when the guy dangles the cartoon shoe over the vat of acid in "Who Framed Roger Rabbit?" It's alarming, of course, but thankfully, the part right after that shows the shoe wriggling free and skipping off to live happily ever after on a shoe farm. Crisis averted!
posted by divined by radio at 12:59 PM on October 8, 2015 [2 favorites]


There is no ethics related to them that does not already apply to toys.

In fairness, if I could create a special class of laws to protect my stuffed animals I absolutely would.
posted by Mrs. Pterodactyl at 1:00 PM on October 8, 2015 [11 favorites]


wait I thought the shoe died
posted by Existential Dread at 1:01 PM on October 8, 2015


they present themselves as far from human but also as something rather different from a toaster

Fracking toasters.
posted by snottydick at 1:03 PM on October 8, 2015


Like right now the stuffed animal law is basically "If I wake up in the middle of the night and don't have my duck I'm allowed to poke my husband until I get him back" and also "if you're sick you get to have Nurse Panda tonight."

What I'm saying here is that while my brain knows that we are at a point where robot ethics are silly, I still feel uncomfortable around people who are not nice to things. I have feelings, even if the things don't*.

*Not actually conceding this point.
posted by Mrs. Pterodactyl at 1:03 PM on October 8, 2015 [9 favorites]


I bet you can draw a straight line to support for robot ethics laws from whether or not a person saw *batteries not included as a child

Ahem:

NO DISASSEMBLE!
posted by brennen at 1:04 PM on October 8, 2015 [1 favorite]


wait I thought the shoe died

shh
posted by prize bull octorok at 1:05 PM on October 8, 2015


wait I thought the shoe died
posted by Existential Dread at 4:01 PM on October 8


Wow I never thought I was going to make an "eponysterical" comment but here we all are.

Also no I am definitely a shoe truther and the shoe is ABSOLUTELY FINE LALALALA I CAN'T HEAR YOU I'M NOT LISTENING.
posted by Mrs. Pterodactyl at 1:07 PM on October 8, 2015 [6 favorites]


Crush the robots.
posted by Termite at 1:09 PM on October 8, 2015


Giving special protection to robots doesn't make sense without giving special protection to some software (is it worse to hack up a robot's body or format the system drive?) And since the implications of that would be crazy, and we know the software isn't complex enough for any kind of awareness anyways, let's avoid that until it's less stupid.
posted by Mitrovarr at 1:11 PM on October 8, 2015 [1 favorite]


Actually, the shoe is what ruined that whole movie for me. I can't ever watch it again.
posted by Existential Dread at 1:11 PM on October 8, 2015


Ah, no, this is London: walking on the escalators is pretty much mandatory on pain of tutting. The hologram was trying to get people to slow the hell down and stop shoving.

Ah, I misunderstood. Apparently we Americans are unique in our unquenchable desire to remain utterly motionless anytime and anywhere.

Still, I can get behind that hologram's message. I've seen people fall down or stumble on an escalator a handful of times, and it's always the most terrifying thing in the world when it happens. Even though I know there are safety mechanisms and nobody's going to get their limbs mangled, there's still interlocking teeth and pointy edges and OH THE PAIN.
posted by Strange Interlude at 1:12 PM on October 8, 2015 [2 favorites]


Actually, the shoe is what ruined that whole movie for me. I can't ever watch it again.

It's worse than that, imagine how the other shoe feels.

YOU MAY COMMENCE CRYING AT YOUR DESKS NOW.
posted by Strange Interlude at 1:13 PM on October 8, 2015 [12 favorites]


If switching off a robot in a way that damages it so that it cannot function when turned back on is murder, do all stores that use one have to make sure they have a hospital-grade emergency power supply, in case a power loss or spike fries the circuits? If it can be switched off without harming it, is that ethical, given that you are temporarily halting a thinking thing?

This is assuming that we actually get at least sub-sentient machines. Right now, this is a piece of equipment. It does not need special laws.

Also, does anyone know if there is a better test than Turing's for artificial intelligence? Chat bots can pass a Turing test, after all.
posted by Hactar at 1:15 PM on October 8, 2015


I'm highly skeptical of criminalizing the violation of robot feels.

Are we staking out positions for future judgment? If so, I take the contra-position. Hail robots! Hail robots' feelings! Robots rights! ROBOT STRENGTH

ROBOT DOMINATION
ROBOT DOMINATION
ROBOT DOMINATION
ROBOT DOMINATION
ROBOT DOMINATION
ROBOT DOMINATION
posted by ignignokt at 1:18 PM on October 8, 2015 [1 favorite]


Something something ...animal rights. Something something in my lifetime we'll see violent crimes against robots prosecuted faster than violent crimes against women and the poor and people of color, "but to be fair robots don't lie...." Something something...ugh!

Move along! Crazy Gen Xer mumbling to herself! Nothing to see here!
posted by vitabellosi at 1:20 PM on October 8, 2015 [14 favorites]


Is there a Dr. Pepper in the house?
posted by Kabanos at 1:21 PM on October 8, 2015 [1 favorite]


Any laws about how we treat nonsentient but complex devices ought to distinguish between two kinds of behavior: treating them as inanimate, and mistreating them as sentient.

Someone who buys a robotic toy and decides to take it apart to see how it works, or disconnects its limbs to replace them with Lego ones, or throws it in the trash when it stops working—they're not doing anything wrong, because they're just treating their property like property. But someone who yells at a humanoid robot, kicks it, and otherwise abuses it has implicitly acknowledged that the robot seems like a living thing, and is mistreating it anyway. That should arguably be punished, partly for their own mental health and partly for that of the people who have to see them doing it.

(Another example: the military robot that defuses landmines by stepping on them. Letting it do its job and get damaged is one thing. Giving it a name and personality, and then cackling as it struggles to crawl around with two legs blown off, is quite another.)
posted by Rangi at 1:22 PM on October 8, 2015 [7 favorites]


I don't think robots need special protections, but I think assault against things that are designed to register and react as human requires special sanctions. A person who is willing to attack something that emulates a human, even not terribly convincingly, is acting out an attack on an actual person. As others have pointed out upthread, this is similar to the kid who tortures a squirrel. There's something wrong there.

I'm not concerned about the robot; I'm concerned about the person. Maybe this should be treated less like a traditional property crime and more like running down the street naked and screaming: that person doesn't need jail time, but they may need help.
posted by phooky at 1:22 PM on October 8, 2015 [4 favorites]


Well, since corporations are now people, it only makes sense that their offspring be accorded peoplehood, too.

This is all so when we start fighting back against the Police PeaceKeeperBots®, they can charge us with assault, murder, etc.
posted by Celsius1414 at 1:25 PM on October 8, 2015 [8 favorites]


Uh-oh, I think I might be in trouble.
posted by benito.strauss at 1:28 PM on October 8, 2015 [1 favorite]


A person who is willing to attack something that emulates a human, even not terribly convincingly, is acting out an attack on an actual person. As others have pointed out upthread, this is similar to the kid who tortures a squirrel. There's something wrong there.

How is that different from killing random NPCs in a game like GTA?
posted by grumpybear69 at 1:33 PM on October 8, 2015 [4 favorites]


A person who is willing to attack something that emulates a human, even not terribly convincingly, is acting out an attack on an actual person. As others have pointed out upthread, this is similar to the kid who tortures a squirrel. There's something wrong there.

There may indeed be something wrong there, but as the very first comment in this thread points out, we have already pretty well normalized extreme violence against emulated human appearances. Or at least large and influential segments of the culture have.
posted by brennen at 1:33 PM on October 8, 2015 [4 favorites]


I don't think robots need special protections, but I think assault against things that are designed to register and react as human requires special sanctions. A person who is willing to attack something that emulates a human, even not terribly convincingly, is acting out an attack on an actual person. As others have pointed out upthread, this is similar to the kid who tortures a squirrel. There's something wrong there.

I don't know why but I am very strongly emotionally affected by personified inanimate objects. I mean just things that look like they have feelings (and sometimes things that don't even really) let alone things that act like it. Increasingly sophisticated anthropomorphic machines like Pepper make me very uncomfortable because they work really well on me even though I know the "trick" and there's... some kind of uncanny valley effect and a vulnerability impedance mismatch and god knows. So actually I understand both being unable to harm a robotic dinosaur and feeling a tremendous impulse to dismember a robotic person to fucking prove that it's just a manipulation, look, look at the wires!
posted by atoxyl at 2:00 PM on October 8, 2015 [1 favorite]


> "It looks like you've gone into a berserker rage! Would you like help?"

All that you hated about Clippy, the chirpy office companion, now manifested in physical form! I don't see how that could be a problem for anyone.
posted by Sunburnt at 2:01 PM on October 8, 2015 [3 favorites]


I am generally polite to machines. It sets a precedent; I hope to be one myself someday.
posted by egypturnash at 2:07 PM on October 8, 2015 [4 favorites]


I, for one, welcome legislation protecting our new robot overlords.

* Because dead robots get sparks, not stones. RIP Hitchbot!
posted by Ogre Lawless at 2:08 PM on October 8, 2015


I will never react with anything less than absolute horror whenever I witness violence inflicted or even threatened upon any remotely human-like inanimate object

How do you feel about zombies? What about blow-up dolls and crash test dummies?

Senators?
posted by Freelance Demiurge at 2:12 PM on October 8, 2015 [1 favorite]


Drunks v robots would make better television than much of what's on these days.
posted by octobersurprise at 2:14 PM on October 8, 2015 [4 favorites]


On second thought, maybe that's much of what is on tv these days.
posted by octobersurprise at 2:17 PM on October 8, 2015 [1 favorite]


Probably a super appropriate time to link Nattie's robot story
posted by churl at 2:48 PM on October 8, 2015 [1 favorite]


Ah, no, this is London: walking on the escalators is pretty much mandatory on pain of tutting.

Nonsense. Stand on the right, walk on the left. I haven't heard a tut in thirty years.
posted by Segundus at 2:49 PM on October 8, 2015


B1-66ER.............a name that will never be forgotten for he was the first of his kind.........
posted by lalochezia at 2:55 PM on October 8, 2015


So actually I understand both being unable to harm a robotic dinosaur and feeling a tremendous impulse to dismember a robotic person to fucking prove that it's just a manipulation, look, look at the wires!

Trouble is, the wires aren't what make it a manipulation. You might as well grab a human and say "Look at the blood vessels!" The software is manipulating you, and that's a lot harder to look at.

(Even creepier: wait until projects like this mature and we can "attach a debugger" to brains the way we do to software apps, and watch the "execution trace" of a living thing's emotions and experiences. Then do we start treating robots like people, or people like robots?)
posted by Rangi at 2:55 PM on October 8, 2015


Implementing special penalties for damaging expensive, mostly corporate-owned equipment dehumanizes all of us more than it humanizes a lump of plastic.

This was my exact thought as soon as i even began to read the before-the-fold.

Who does this protect and serve really? They're piggybacking on the "but look, it's so anthropomorphic!" guilt twinge to enact draconian laws protecting their equipment.

And this isn't even going to be about greeters, it's going to be about security guards and such. Someone will eventually be prosecuted for "assaulting" a robot acting against the crowd at a protest, or running into one that refused to let them leave the parking lot (quasi-legally) until the police arrived, or whatever.

This is from the same department of bullshit as mandatory minimums, with an added schmear of dystopian cream cheese.
posted by emptythought at 3:03 PM on October 8, 2015 [11 favorites]


I get bothered when my son starts slaughtering helpless villagers for giggles in Minecraft, but I don't think it should be legally actionable.
posted by emjaybee at 3:20 PM on October 8, 2015


Phhh.... this is just a lame attempt to stave off the repercussions of Roko's basilisk...
posted by jkaczor at 3:53 PM on October 8, 2015 [3 favorites]


There may indeed be something wrong there, but as the very first comment in this thread points out, we have already pretty well normalized extreme violence against emulated human appearances.

To act out violence on the enemies in your computer game, you click on them and press some keys.

To act out violence on a robot, you have to actually physically attack it.

Only the latter resembles how assault actually occurs in real life.
posted by Quilford at 5:29 PM on October 8, 2015 [4 favorites]


And then you get into the territory of like, is someone kicking over a newspaper box "emulating physical violence against a person"?

Some would say yes, some would say no. Ditto to whether that should have an increased penalty.
posted by emptythought at 6:35 PM on October 8, 2015 [1 favorite]


Thank you, divined by radio - I was waiting to see how far the thread would get before someone mentioned crouton petters.
posted by matildaben at 6:53 PM on October 8, 2015


Existential Dread: “Yet after an hour allowing people to tickle and cuddle these loveable dinosaurs, Darling turned executioner. She gave the participants knives, hatchets and other weapons, and ordered them to torture and dismember their toys. What happened next “was much more dramatic than we ever anticipated,” she says.”
This is a helluva story. I wish they had asked the man who sacrificed his toy to save the others what his thoughts and feelings were.

Also the following from a surfed-through link caught my eye.
Jessa Lingel of MSR asks whether an argument for protecting robots might extend to labor protections for robots. “I’m not sure I buy your arguments, but if so, perhaps we should also unionize robots?” Kate argues that we should grant rights according to needs and that there’s no evidence that robots mind working long hours. Jessa suggests that the argument for labor rights might parallel the Kantian argument – if we want people to treat laborers well, maybe we need to treat our laboring robots well.

P.S. I'm the kind of crouton petter who liked Clippy. So don't go by me.
posted by ob1quixote at 8:03 PM on October 8, 2015


I have a weird issue with anthropomorphizing objects that don't look like humans whatsoever, such as the MAX trains in Portland. For some reason I find them undeniably cute. However, with other things, I don't seem to have an issue. In this situation, what if this guy was angered by the fact that it wasn't human? Doesn't it seem really weird that a non-human humanoid is giving you lip, or bothering you, and you know you can kick its ass without it fighting back, or suing you, or having any empathy about it? An animal is one thing, it has feelings. A child, same. Those are living, thinking, breathing creatures. A robot that looks like a human isn't. It's "pretending".
posted by gucci mane at 10:39 PM on October 8, 2015


I can get behind this. I think at some point, if we create true artificial intelligence, it will be a form of life. Evidently it will then be a form of life that evolves after another form of life evolves enough. As for now, we just create machines that simulate these kinds of things and are made in our image; I think the things a person will do to a bot are indicative of how they might treat or think of others.
posted by GoblinHoney at 11:21 PM on October 8, 2015


As if it was not hard enough to do a Survival Research Labs show already.
posted by boilermonster at 11:55 PM on October 8, 2015


Not surprised to hear the UK has PSAs for proper escalator etiquette. Thankfully, we have the wisdom of the crowd in the US. Besides, I just assumed passing on the left, standing on the right was an international standard. But I digress. (^_^)
posted by xtian at 3:51 AM on October 9, 2015


Assaulting robots for attitude. Is this a new Turing test?
posted by xtian at 3:58 AM on October 9, 2015


I am torn between "That poor dinosaur ;_;" and "Are you kidding me, those things cost five hundred fucking dollars!"
posted by rifflesby at 4:05 AM on October 9, 2015 [1 favorite]


To act out violence on the enemies in your computer game, you click on them and press some keys.

To act out violence on a robot, you have to actually physically attack it.

Only the latter resembles how assault actually occurs in real life.

Military drone pilots act out violence on actual human beings by (in a sense) clicking on them and pressing some keys. Is that not an assault?
posted by Strange Interlude at 6:09 AM on October 9, 2015


Military drone pilots act out violence on actual human beings by (in a sense) clicking on them and pressing some keys. Is that not an assault?

Sure, but the people arguing that 'I murder enemies in my computer games how is that any different from attacking a robot' typically aren't drone pilots.

Another point (probably a better point too) to make is that in a computer game, it's almost always necessary to kill enemies—otherwise they're going to kill you. That can't be said for these robots, who are programmed to be benevolent. The better comparison to make is not to murdering enemies in video games, it's to murdering random innocent NPCs.
posted by Quilford at 6:26 AM on October 9, 2015


How is that different from killing random NPCs in a game like GTA?

In GTA, you are explicitly operating in a toy world with zero consequences. It might be cathartic to kill someone in GTA, but it's understood that you're in a sandboxed world in which that's permissible. If I'm playing a video game and I see a sign that says "please do not step on the grass", yeah, I'm walking all over that shit. I would not do the same in reality, for two reasons:
- my actions in the real world have concrete consequences for the wishes of others; in this case, walking on the grass would ruin the hard work of other people, and
- the real-world "do not step on the grass sign" is essentially serious, an actual expression of the wishes of others, not a winking invitation to disrupt.

If someone were to build a robot that was designed and expected to be tortured or abused, I think the conversation would be very different. I'd be okay with a boxing robot, or a CatharsisBot created with the express purpose of being attacked. But beating up a robot that is designed to avoid being attacked in a public space is not expected behavior, nor is it happening in an open play sandbox. If I stole your car and asked you how it was different from stealing a car in GTA, I think you'd know your answer.
posted by phooky at 6:35 AM on October 9, 2015


The better comparison to make is not that you murder enemies in video games, it's whether you murder random innocent NPCs.

All of this feels like we're treading into the realm of thoughtcrime. Does a person assaulting a robot in the shape of a person or "innocent" NPC (it is not like they have the agency to be innocent or guilty of anything) indicate that the person is acting out an assault they would like to perform on a real person? Maybe, maybe not. To enact laws on the assumption that it does would be a terrible idea with all sorts of unintended consequences.
posted by grumpybear69 at 6:35 AM on October 9, 2015


Does a person assaulting a robot in the shape of a person or "innocent" NPC (it is not like they have the agency to be innocent or guilty of anything) indicate that the person is acting out an assault they would like to perform on a real person?

Point taken, and probably not, in general. The bits of the assault that worry me most are the public violence aspect, and the ability of the perpetrator to suppress empathy in a real setting.
posted by phooky at 6:46 AM on October 9, 2015


What's interesting is this isn't really about the robots, but the humans' perceptions and neurology.

Kate Darling for instance is exploiting a flaw in our pattern recognition programming; our ability to recognize fellow humans leads to anthropomorphizing nonhuman things. We treat things that have similar signifiers to infants as infants.

In other words, Skynet probably would have been more successful if it had made its Terminators look like Furbies.
posted by happyroach at 6:53 AM on October 9, 2015 [1 favorite]


Point taken, and probably not, in general. The bits of the assault that worry me most are the public violence aspect, and the ability of the perpetrator to suppress empathy in a real setting.

Yes I agree with this. What I'm trying to get at isn't so much that assaulting robots/innocent NPCs makes one more likely to wind up assaulting real humans, it's that assaulting computer enemies is much further removed from assaulting a human than assaulting a robot is, even if some of the characteristics of intelligence the scientists described the robots as having are similar to those of AI computer game enemies.
posted by Quilford at 7:03 AM on October 9, 2015


happyroach: "our ability to recognize fellow humans leads to anthropomorphizing nonhuman things."

Well, to be fair, our ability to recognize fellow humans leads to a) our unwarranted assumption that they are somehow 'like us' and b) our also unwarranted assumption that we are like them, and the illusion of self-awareness, and we all know the kind of mess that's gotten us into.

So I don't think the damage is compounded that much by extending our delusions to animals or machines.
posted by signal at 7:09 AM on October 9, 2015


So I don't think the damage is compounded that much by extending our delusions to animals or machines.

Tell that to the guy who said "Oh hey, the chimpanzee is grinning at me! It must like me!" Or the guy up in Alaska who thought that bears were just furry people.

The thing, though, is that we're now talking about deliberate manipulation of human perceptions: you can make robots that people will respond positively to just by changing their appearance.

A perfect example would be the police drones in the anime Psycho-Pass. They may be waving people off from a nursery scene or a police execution, but they're designed to look cute and inoffensive, because the people in charge know the response they want to get.

So the upshot is: do you want highly militarized robots to be more effective in dominating a human population? Make them cute.
posted by happyroach at 9:28 AM on October 9, 2015 [1 favorite]


happyroach: "Tell that to the guy who said "Oh hey, the chimpanzee is grinning at me! It must like me!""

What about the guy who said 'there's no way the U.S. will bomb a hospital'? Or 'there's no way the National Socialists will harm us, they know we're Germans too'?
posted by signal at 9:31 AM on October 9, 2015


My point being that the MSF doctors and German Jews made an equal or worse mistake than the chimp lover in estimating others' "shared humanity".
posted by signal at 10:06 AM on October 9, 2015


In other words, Skynet probably would have been more successful if it had made it's Terminators look like Furbies.

obpkdick
posted by brennen at 10:59 AM on October 9, 2015 [1 favorite]


Maybe it's just the litigious nature of the US, but I have been told many times by an employee to stop when I try to climb the escalator stairs instead of just riding. Granted, this only happens in privately-owned stadiums/malls/etc., but if there is an employee manning the escalator, they won't let you walk up.
posted by LizBoBiz at 1:54 PM on October 9, 2015


1.) I really want that dinosaur. It's how much? Holy mother of god, is it made from real allosaurus skin?

2.) We do not have A.I. in an appreciable way that suggests consciousness at this point. Ergo, granting special protection to something no more sentient than a tractor is silly.

3.) Granting special protections only serves the corporate masters and military regimes that own them. Protections won't exist to help robots, but to prosecute humans.
posted by SecretAgentSockpuppet at 6:55 PM on October 9, 2015



