Stop the killer robots before it's too late!
April 27, 2013 11:50 AM   Subscribe

Nobel laureate's campaign calls for pre-emptive ban on autonomous weapons. As our technology advances, it becomes more and more feasible to give more and more autonomy to our drones. A new campaign led by 1997 Nobel laureate Jody Williams calls for an international ban on the design of autonomous weaponized drones.

The future often arrives from directions you didn't expect. Some sci-fi writers of the past expected Moon colonies and manned missions to Mars by now, but had no inkling that the world would be revolutionized by a global network of devices thousands of times more powerful than the room-sized computers of the day. We've both disappointed and exceeded expectations. Asimov's robots never came to be, but today flying robots that are (in different ways) both less and more ambitious are in the early stages of becoming a part of everyday life, mostly for imaging and surveillance.

The FAA currently bans commercial use of drones in the US, but that will soon change:
In five years, experts predict, more than 10,000 drones will be working overhead for American businesses. Some say the number might soar as high as 30,000. That’s a lot of cameras staring down, some with infrared imaging, swiveling to see ever more.

Every day advancements are made in the technology. As the machines become more weather-proof, with longer battery life, lighter, smaller, even bug-sized, the list of possible uses — and concerns — grows.
Drones have become an essential part of America's way of war.
an overwhelming reliance on killing terrorism suspects, which began in the administration of George W. Bush, has defined the Obama years. Since Mr. Obama took office, the C.I.A. and military have killed about 3,000 people in counterterrorist strikes in Pakistan, Yemen and Somalia, mostly using drones. Only a handful have been caught and brought to this country; an unknown number have been imprisoned by other countries with intelligence and other support from the United States.


(A visualization of the drone war in Pakistan, previously linked on the Blue)

This is all scary-sounding, but drones have many sensible and peaceful uses. Mesa County, Colorado recently cut the cost of an annual landfill survey from $10,000 to $200 by using a drone instead of a piloted craft.

We're not yet at a stage where automated weapons are feasible, but we will be within a couple decades. Should we ban them pre-emptively? Or is that just Luddite thinking?
posted by Sleeper (122 comments total) 17 users marked this as a favorite
 
Oh yes, and another peaceful but possibly not sensible use of drone technology has already been attempted: In 2011 the TacoCopter would have delivered fast food via drone if the FAA ban hadn't been in place.
posted by Sleeper at 11:57 AM on April 27, 2013 [8 favorites]


The post conflates autonomous killing machines and drones, which are emphatically not the same thing. Drones are advanced remote-controlled airplanes; they are by no means autonomous. Humans do all the thinking for them.

In terms of banning autonomous killing machines, I'd really like to see a definition of what that term means. The article mentions "machines with the ability to attack targets without any human intervention", which seems much too vague to be able to get any traction. That definition would seem to encompass such things as land mines and even hand grenades rigged with a trip wire; we'd need to have a better definition in place in order to implement a ban.

Once it's been better defined, it's worth considering, certainly. I can, however, predict that it's going to be a difficult sell. If a country can send mechanized units in place of soldiers, won't the population put pressure on them to do just that, in the name of the sanctity of human life? Isn't it reasonable to ask that the military build robots, who nobody will mourn, to go out and do the fighting?

It would suck for the people the robots were attacking, of course, but... is it really better to be killed by a human being than by a robot? Robots will make mistakes in the field and kill civilians, of course, but human beings do that all the time too. We can't program human beings to avoid those mistakes in the future, but we can upgrade the robots' software to take circumstances into account. They don't get tired after sixteen-hour patrols and stop thinking, they don't get homesick and sloppy, they don't get enraged after seeing their fellow robots blown up, so maybe they'll even make fewer mistakes...

If a robot kills a person, there's a person dead. If a human soldier kills a person, there's a person dead, and a person who has to live with having killed someone.

And robots aren't hard to build. Your cell phone probably has enough computing power to make a decent turret. Even if we ban robots, insurgencies and rebel states throughout the world are going to use them to supplement their manpower wherever possible. Look at the amazing improvised weapons in Syria and Libya; do you really think that a talented roboticist wouldn't have found a useful niche in their organizations?

The arguments are easy enough that it's impossible for me to believe that we won't see autonomous robot warriors in the next decade. We're going to have to figure out how to live in a future that involves robots making war. A ban is a lovely gesture towards a simpler future, but it's almost impossible to imagine it succeeding.
posted by MrVisible at 12:12 PM on April 27, 2013 [17 favorites]


(addendum to my previous comment: "And also if TacoCopter hadn't been just a perfectly silly idea they weren't serious about.")
posted by Sleeper at 12:12 PM on April 27, 2013


I mean, I'm biased, because I'm a robotics nerd myself and I work for a company that serves the defense industry with autonomous (underwater, unarmed) vehicles, though I feel just as uncomfortable with the idea of an unmanned army of killer robots raining death and destruction from the skies as the next guy. But I guess from the engineering/design standpoint, I have to ask: what precisely is the goal of "banning autonomous weapons"?

Is it to keep warfare from being too "easy" for a government to wage?

Is it to protect the welfare of citizens from the tyranny of their own government?

Is it to make a statement about the liability of a decision to kill, or the responsibility for wrongful death?

Is it to reduce the likelihood of the death of innocents?

Is it to ensure that enemy combatants are dealt with under particular rules of engagement, or given due process if applicable, rather than a Hellfire missile without trial?

Because I don't think the solution to any of these problems lies in technology (or the lack thereof).
posted by olinerd at 12:12 PM on April 27, 2013 [4 favorites]


If you're smart, like me, you'll buy yourself an Old Glory insurance policy.
posted by Ghostride The Whip at 12:15 PM on April 27, 2013 [9 favorites]


what precisely is the goal of "banning autonomous weapons"?

avoiding the dark future of Terminator?
posted by neuromodulator at 12:19 PM on April 27, 2013 [7 favorites]


what precisely is the goal of "banning autonomous weapons"?

avoiding the dark future of Terminator?


The plot of Terminator, as well as most "kill all humans" sci-fi, usually involves the robots/intelligent machines/intelligent software having a reason to kill the humans. Like, we'd been enslaving them, or we're harmful to the Earth, or something like that. So yet again, a problem that doesn't need a technical solution -- it needs humans as a race to stop being dicks.
posted by olinerd at 12:21 PM on April 27, 2013 [3 favorites]


... and possible addendum: maybe not programming our own dickishness and megalomania into our AIs. That may have been something of the Terminator problem as well. So maybe I lied. Maybe there is a small technical contribution to avoidance of the robot apocalypse.
posted by olinerd at 12:22 PM on April 27, 2013 [2 favorites]


If a robot kills a person, there's a person dead. If a human soldier kills a person, there's a person dead, and a person who has to live with having killed someone.

There is, if you'll notice, a large difference there that I would argue is incredibly important. The costs of warfare and killing are not and never have been purely economic, both on a macro and micro level. Decreasing the cost of war's more unpleasant aspects increases the ease of its prosecution and eliminates the variability of its experience. A robot probably isn't programmed for mercy, nor will it sometimes shoot warning shots, or complain to its commander afterward. A robot doesn't return home with a greater understanding of the value of human lives, or agitate against the horrors of war, or even live out its life in a community that every day must now face the reality of warfare and its aftermaths rather than digest and forget the constructed narratives of its justifications.
posted by Chipmazing at 12:23 PM on April 27, 2013 [16 favorites]


At some point, I'd like to believe that humanity will take a step back and say "OK, let's pull back military and private technology, and let the civilian sector drive the advances and ensure they're put to peaceful and beneficial uses."

Problem is (and there's a term or phrase that describes this... I can't think of it, right now) there's this fear that "If we don't keep up, the bad guys will pursue this, and subject us to its horrors."

But at what point do civilizations at a large scale start saying "Let's stop risking humanity to answer the whims of certain humans..."? Is it possible that resources have become so centralized that the misers want this automated "defense" mechanism because it's becoming increasingly difficult to trust subordinates with their protection?
posted by Bathtub Bobsled at 12:26 PM on April 27, 2013 [1 favorite]


it needs humans as a race to stop being dicks.

Good luck with that.
posted by EndsOfInvention at 12:28 PM on April 27, 2013 [3 favorites]


So, Chipmazing, are you saying there's value to putting a human being through the ordeal of killing someone? That if someone dies, a human being has to pay the emotional toll of causing that death? Even if the person is put into the role of killer involuntarily, by being drafted? You're saying that if a country decides that it needs people dead, the soldiers of that country need to pay the terrible price of having killed a fellow human being, or killing people isn't right?

Because that just seems all kinds of wrong to me. I'd rather spare the soldiers the incredible psychological damage that goes with having killed a human being.
posted by MrVisible at 12:29 PM on April 27, 2013 [1 favorite]


We're not yet at a stage where automated weapons are feasible

We've been there for decades. Consider air-to-air missiles. A heat-seeking missile is an autonomous flying robot that decides on a flight path based on sensor input. Defining "killer robot" in a way that excludes them is hard.
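
To make that concrete, here's a toy sketch, in Python, of the classic guidance law behind heat-seekers: proportional navigation. All the names and numbers are invented for illustration, but the loop is real -- measure how fast the bearing to the heat source is rotating, and turn in proportion to that. There's no human anywhere in it.

    import math

    N = 4.0      # navigation constant, typically 3 to 5
    DT = 0.01    # seconds per guidance tick

    def run_intercept(mx, my, heading, speed, tx, ty, tvx, tvy, steps=20000):
        """Toy 2-D pursuit: True if the 'missile' closes to within 15 m."""
        los_prev = math.atan2(ty - my, tx - mx)
        for _ in range(steps):
            # advance missile and target positions
            mx += speed * math.cos(heading) * DT
            my += speed * math.sin(heading) * DT
            tx += tvx * DT
            ty += tvy * DT
            # how fast is the bearing to the hot spot rotating?
            los = math.atan2(ty - my, tx - mx)
            los_rate = (los - los_prev) / DT
            los_prev = los
            # proportional navigation: turn at N times the bearing rate
            heading += N * los_rate * DT
            if math.hypot(tx - mx, ty - my) < 15.0:   # fuse radius
                return True
        return False

    # A target crossing at 200 m/s against a 600 m/s pursuer: every course
    # correction is the machine's own, a hundred times a second.
    print(run_intercept(0, 0, 0.0, 600.0, 4000.0, 1000.0, -200.0, 0.0))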
posted by justsomebodythatyouusedtoknow at 12:29 PM on April 27, 2013 [4 favorites]




"OK, let's pull back military and private technology, and let's allow the civilian sector drive the advances and ensure they're put to peaceful and beneficial uses."

Post-WWII USA, this was called the automotive industry: massive capital expenditures in an extremely wide variety of peripheral tech. It was not ideal.
posted by klarck at 12:34 PM on April 27, 2013


DirtyOldTown, Ghostride The Whip beat you to it.
posted by Sleeper at 12:35 PM on April 27, 2013


A robot probably isn't programmed for mercy, nor will it sometimes shoot warning shots, or complain to its commander afterward.

Unless you program them for that.

A robot doesn't return home with a greater understanding of the value of human lives, or agitate against the horrors of war, or even live out its life in a community that every day must now face the reality of warfare and its aftermaths rather than digest and forget the constructed narratives of its justifications.

I think that's a terrible thing to ask of people who've already been through a terrible ordeal. Relying on our veterans to talk us out of future wars hasn't worked out very well so far. And given the choice between having someone killed by a robot, and having to have a community deal with the aftereffects of its sons having killed and been killed in a war, it seems like any humane community would choose the former.
posted by MrVisible at 12:37 PM on April 27, 2013 [1 favorite]


Ghostride The Whip beat you to it.

Damn. Sure did.
posted by DirtyOldTown at 12:39 PM on April 27, 2013


And given the choice between having someone killed by a robot, and having to have a community deal with the aftereffects of its sons having killed and been killed in a war, it seems like any humane community would choose the former.

So, yeah, let's just automate the killing and keep on friending neighbors on Facebook! All nice and neat. No reason for a society waging war on others to, you know, have any fucking clue what horrors are actually being perpetrated in their name. They might want to stop it or something.

Any humane community wouldn't be waging war in the first place.
posted by Thorzdad at 12:42 PM on April 27, 2013 [6 favorites]


MrVisible, I'm saying that, if states are expressions of a People's collective will (which I don't completely agree with, but for the sake of argument) and it decides it needs to kill other people for a reason, there needs to be costs and constraints on that. There is a value, in non-conscription situations, in maintaining humans as an integral part of the destructive process. The emotional toll is just that, a toll -- a charge paid for the right to usage -- a toll that someone decided is worth it.

Drafts are awful, and the existence of the poverty draft for our professional military is a travesty, but the weight of killing cannot keep shrinking in the minds of those doing the killing. The more abstracted warfare becomes, the further we get from its realities, reducing it to casual numbers in a spreadsheet or words-stripped-of-context like "children killed at a wedding". We begin to traffic in sad phrases instead of confronting the savagery or cruelty of our actions. The horror of automated distance killing is that there is no real, empathetic witness beyond grainy green camera footage. The way to spare soldiers psychological damage is to decrease the atrocities of war, not to increase the atrocities while decreasing the number of survivors who have seen them.
posted by Chipmazing at 12:43 PM on April 27, 2013 [2 favorites]


I think it's worth keeping in mind that war crimes and war criminals are condemned for their inhumanity. Literally making war easier and more inhuman seems like a step or ten backwards.
posted by Chipmazing at 12:46 PM on April 27, 2013 [8 favorites]


...
Is it to reduce the likelihood of the death of innocents?
...
Because I don't think the solution to any of these problems lies in technology (or the lack thereof).


Well, actually it's a near certainty that at some point an autonomous weapon will have buggy software and let loose on the wrong people. Much in the way automated manufacturing machines wound/kill a certain number of people a year.

So yeah, keeping humans in the loop will probably reduce the likelihood of death to innocents.
posted by Tell Me No Lies at 12:48 PM on April 27, 2013 [2 favorites]


Unless you program them for that.

Out of pure curiosity, open question, is this currently a possibility? What is the mercy algorithm? Does benevolence become a stochastic process where x amount of targets trigger a life-sparing protocol? I don't have the knowledge to parse how one would design compassion, identification, or clemency acts into a killing instrument, but would be mollified greatly if this was a thinkable/doable possibility.
posted by Chipmazing at 12:52 PM on April 27, 2013 [2 favorites]


Seems like pretty woolly thinking to me. If I use a claymore mine to guard my camp's perimeter, it explodes without my agency when someone trips the wire. A claymore isn't a buried landmine; it's a directional mine that gets strapped to a tree with a tripwire, so it will explode and warn the camp / kill an intruder.

How can you make a rule that says a weapon is acting autonomously when controlled by electricity, but not when controlled by string?
posted by jenkinsEar at 12:55 PM on April 27, 2013 [5 favorites]


I'm saying that, if states are expressions of a People's collective will (which I don't completely agree with, but for the sake of argument) and it decides it needs to kill other people for a reason, there needs to be costs and constraints on that.

My problem with the current system is that the people paying those costs are the poor, the disenfranchised, the children of the powerless. The powerful people who make the decision to go to war don't go to war, and don't send their children.

If the horrors of war were able to end war, or even curtail it, then we would have ceased and desisted after World War I. Remember, it was the War to End All Wars.

So, why inflict PTSD, and worse, on our soldiers, simply for carrying out their orders? If there's a way to avoid that additional horrendous consequence of war, shouldn't we consider it?
posted by MrVisible at 1:00 PM on April 27, 2013 [2 favorites]


If there's a way to avoid that additional horrendous consequence of war, shouldn't we consider it?

Actually, no. Once there are no societal consequences of war, there is no longer any societal imperative to avoid war. War becomes invisible. The fact that past wars have not stopped all war is not a reason to give up trying to stop war and instead make it all nice and clean and automated so the civilians don't complain.

A society that wages war deserves to live with the messy, ugly and inconvenient consequences.
posted by Thorzdad at 1:09 PM on April 27, 2013 [1 favorite]


Out of pure curiosity, open question, is this currently a possibility? What is the mercy algorithm? Does benevolence become a stochastic process where x amount of targets trigger a life-sparing protocol? I don't have the knowledge to parse how one would design compassion, identification, or clemency acts into a killing instrument, but would be mollified greatly if this was a thinkable/doable possibility.

Rules of engagement are a part of every armed conflict. We have facial recognition algorithms, algorithms which can tell adults from children; we can program a robot to yell a warning, in fluent (insert language here), more easily than we can program it to shoot. We can program in exclusion zones for religious buildings, hospitals, etcetera. And if you think those are too complex for a robot to integrate, remember that we ask that of our soldiers as well. And we expect them to keep all of this in mind while exhausted, exasperated, over-tired and over-worked, and under fire.
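
Just to show the shape of the thing, here's a back-of-the-envelope sketch in Python. Every name in it is invented, and the genuinely hard parts (the classifiers feeding it) are waved away as inputs; the point is only that rules of engagement can be ordinary, auditable code.

    import math
    from dataclasses import dataclass

    @dataclass
    class Zone:                      # e.g. a hospital or a mosque
        lat: float
        lon: float
        radius_m: float
        def contains(self, lat, lon):
            # crude flat-earth distance, ~111 km per degree; fine for a sketch
            return math.hypot(lat - self.lat, lon - self.lon) * 111_000 < self.radius_m

    EXCLUSION_ZONES = [Zone(34.5201, 69.1959, 500.0)]   # hypothetical hospital

    @dataclass
    class Contact:
        lat: float
        lon: float
        is_adult: bool               # from a classifier, with its own error rate
        armed_confidence: float      # 0.0 to 1.0, ditto
        warned: bool                 # warning already yelled, in the local language?

    def decide(c: Contact) -> str:
        if any(z.contains(c.lat, c.lon) for z in EXCLUSION_ZONES):
            return "HOLD"            # exclusion zones are absolute
        if not c.is_adult:
            return "HOLD"
        if not c.warned:
            return "WARN"            # yelling is cheaper than shooting
        if c.armed_confidence < 0.95:
            return "HOLD"
        return "REFER_TO_HUMAN"      # even then, a person signs off

    print(decide(Contact(34.5201, 69.1959, True, 0.99, True)))   # HOLD: inside the zone

An exhausted human has to hold all of that in their head; the robot's version can at least be reviewed line by line.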

We don't expect our soldiers to make moral judgments in the field. We expect them to follow orders.
posted by MrVisible at 1:10 PM on April 27, 2013 [1 favorite]


My problem with the current system is that the people paying those costs are the poor, the disenfranchised, the children of the powerless. The powerful people who make the decision to go to war don't go to war, and don't send their children.

Some people make the argument that one benefit of the draft was that it forced the middle class at least to face some real danger when the nation waged war (the upper class, of course, had strings to pull), thus leading to broad-based antiwar movements and so on. I don't know to what extent this argument holds water, but the idea of war being a purely economic decision, with no one in the aggressor country having to put themselves at risk or even be inconvenienced except insofar as they are uncomfortable with a war being waged in their name, is pretty chilling. Because human beings can be comfortable with some awful, awful things being done in their name, as long as they don't have to see the results up close.
posted by No-sword at 1:10 PM on April 27, 2013 [2 favorites]


A society that wages war deserves to live with the messy, ugly and inconvenient consequences.

Unfortunately, the messy, ugly and inconvenient consequences never seem to affect the people who decide whether we go to war. Unless we can solve that, the ugliness of war doesn't serve as a deterrent.

Plus... has there never been a necessary war? Have all wars been waged unfairly, by all parties? If there are necessary wars, wars that need to be fought for good and just purposes in the future, why should the people who wage those wars justly have to pay this additional bill of suffering?
posted by MrVisible at 1:13 PM on April 27, 2013 [1 favorite]


avoiding the dark future of Terminator?

Actually, the average Screamers (1995) and its lousy sequel would be the movies with the autonomous drones.

(If only all of the shitty sequels that Lance Henriksen has been in could add up to one good movie...)

(Us Peter Weller and Henriksen fans have been crying into our beer for years.)
posted by vhsiv at 1:14 PM on April 27, 2013 [1 favorite]


Unfortunately, the messy, ugly and inconvenient consequences never seem to affect the people who decide whether we go to war. Unless we can solve that, the ugliness of war doesn't serve as a deterrent.

Then solve it, citizen. Lobbying for an automated, sanitized hole in the sand to bury your head isn't solving anything. All you're doing is trying to make sure you don't have to witness what's being done in your name, while your nation's robots are perpetrating exactly the sorts of horrors that you wish to avoid seeing on others.
posted by Thorzdad at 1:22 PM on April 27, 2013 [3 favorites]


Some people make the argument that one benefit of the draft was that it forced the middle class at least to face some real danger when the nation waged war (the upper class, of course, had strings to pull), thus leading to broad-based antiwar movements and so on.

There were US soldiers in Vietnam from 1965 to 1975. The draft was in place for the entire duration, as was the protest movement. With over 58,000 US troops dead, and well over a million troops dead in total, it doesn't seem to me that the draft really goes a long way to preventing war.

The Iraq war was protested en masse, world wide. All that got us was George Bush reminding us to treasure our freedom of speech.

The people who aren't in power don't seem to have the means to prevent the government from waging wars. Why should we have to pay the price for it? Let them send their robots, not our kids.
posted by MrVisible at 1:26 PM on April 27, 2013 [2 favorites]


We don't expect our soldiers to make moral judgments in the field. We expect them to follow orders.

We expect them to make legal judgments, which are essentially the same thing, given that virtually every moral choice in wartime has been addressed (to some extent) by the law.

Let them send their robots, not our kids.

It's a nice idea, but robot wars will only be a preface to the same kinds of wars we fight now, because the only way a war is ever won is with a person on the ground saying "This piece of ground is mine, you can't have it, and my country/state/nation/organization makes the rules on it." Whether that "person" is metal or meat makes no difference, but no country will stop at metal if it doesn't have to.
posted by Etrigan at 1:29 PM on April 27, 2013 [1 favorite]


Then solve it, citizen. Lobbying for an automated, sanitized hole in the sand to bury your head isn't solving anything. All you're doing is trying to make sure you don't have to witness what's being done in your name, while your nation's robots are perpetrating exactly the sorts of horrors that you wish to avoid seeing on others.

I refer you again to the Iraq war protests. If there's a way to make the will of the people clearer to our leaders, I don't have any idea what it might be. We don't make these decisions; why should we have to live with the consequences?
posted by MrVisible at 1:30 PM on April 27, 2013 [2 favorites]


I refer you again to the Iraq war protests. If there's a way to make the will of the people clearer to our leaders, I don't have any idea what it might be.

Voting them out of office at the next opportunity?
posted by Etrigan at 1:32 PM on April 27, 2013


We expect them to make legal judgments, which are essentially the same thing, given that virtually every moral choice in wartime has been addressed (to some extent) by the law.

Great. We can program for that. Probably a lot easier than we can teach a new recruit how to navigate the intricacies of international conflict law while two days short on sleep and under enemy fire.

It's a nice idea, but robot wars will only be a preface to the same kinds of wars we fight now, because the only way a war is ever won is with a person on the ground saying "This piece of ground is mine, you can't have it, and my country/state/nation/organization makes the rules on it." Whether that "person" is metal or meat makes no difference, but no country will stop at metal if it doesn't have to.

That preface is currently still being fought, by humans. Let's reduce the number of humans killed, even if we can't eliminate all human casualties.
posted by MrVisible at 1:33 PM on April 27, 2013


Voting them out of office at the next opportunity?

Yeah, we couldn't do that either.
posted by MrVisible at 1:34 PM on April 27, 2013 [2 favorites]


If a country can send mechanized units in place of soldiers, won't the population put pressure on them to do just that, in the name of the sanctity of human life?

I just wanted to point out the INCREDIBLE INTERNAL CONTRADICTION OF THIS SENTENCE.
posted by JHarris at 1:42 PM on April 27, 2013 [10 favorites]


Voting them out of office at the next opportunity?

Yeah, we couldn't do that either.


Must not have been important enough to enough people, then.

It's a nice idea, but robot wars will only be a preface to the same kinds of wars we fight now, because the only way a war is ever won is with a person on the ground saying "This piece of ground is mine, you can't have it, and my country/state/nation/organization makes the rules on it." Whether that "person" is metal or meat makes no difference, but no country will stop at metal if it doesn't have to.

That preface is currently still being fought, by humans. Let's reduce the number of humans killed, even if we can't eliminate all human casualties.


Every advance in military technology ever has been claimed to save human lives. A Pope allegedly said that the crossbow was such a terrible weapon that it would end war. And yet, we still have people dying in wars, centuries after the crossbow was made obsolete by even more terrible weapons. Why don't these advances save lives? Because people will still kill each other, and sometimes that is the only thing that stops them. No one who is currently willing to kill another human being for a political point is going to balk at destroying a robot on the way to killing that human being.
posted by Etrigan at 1:43 PM on April 27, 2013 [2 favorites]


You're assuming that no advance in military technology has saved lives. It's a pretty strong assumption.
posted by LogicalDash at 1:47 PM on April 27, 2013 [1 favorite]


Until congress and their children are leading the charge into battle, US pacifism will continue to fail. Automation just streamlines the war process--you can command robots to do things human soldiers could not be convinced to do. If robot soldiers were only allowed to attack politicians and other robots, I'd be totally on board.
posted by perhapsolutely at 1:47 PM on April 27, 2013 [1 favorite]


I just wanted to point out the INCREDIBLE INTERNAL CONTRADICTION OF THIS SENTENCE.

JHarris, it works much better in context:
If a country can send mechanized units in place of soldiers, won't the population put pressure on them to do just that, in the name of the sanctity of human life? Isn't it reasonable to ask that the military build robots, who nobody will mourn, to go out and do the fighting?
posted by MrVisible at 1:48 PM on April 27, 2013


I am in favor of robots for the same reason I am in favor of the full metal jacket around every bullet, as mandated by the Hague Convention. Not because it's *good enough* but because it's better than the alternative.
posted by LogicalDash at 1:49 PM on April 27, 2013 [2 favorites]


Etrigan: Must not have been important enough to enough people, then.

Or maybe there was a massive number of other things on the table, and elections are never, ever about single issues, and every four years we end up making a Faustian bargain for the things we need against the things we hate.
posted by JHarris at 1:51 PM on April 27, 2013 [1 favorite]


You're assuming that no advance in military technology has saved lives. It's a pretty strong assumption.

Yeah, I had a big thing about how medical technology has done so, but it wasn't germane.

Military technology saves lives on the winning side. But very nearly the same numbers of people would die in wars if we started them with robots, because after the robots killed each other, they'd still have to kill humans. If you're not scared enough of an Apache gunship or an Abrams tank not to bother fighting the U.S., the Mark VII Destructinatorbot isn't going to keep you from trying to blow it up either.
posted by Etrigan at 1:51 PM on April 27, 2013


No one who is currently willing to kill another human being for a political point is going to balk at destroying a robot on the way to killing that human being.

Well, I'm hoping that if we send robots instead of some of the human beings, maybe fewer of the human beings we send will get killed.
posted by MrVisible at 1:52 PM on April 27, 2013


Must not have been important enough to enough people, then.

Or maybe there was a massive number of other things on the table, and elections are never, ever about single issues, and every four years we end up making a Faustian bargain for the things we need against the things we hate.


I think you just said the same thing I said, but with more words.
posted by Etrigan at 1:52 PM on April 27, 2013


But very nearly the same numbers of people would die in wars if we started them with robots, because after the robots killed each other, they'd still have to kill humans. If you're not scared enough of an Apache gunship or an Abrams tank not to bother fighting the U.S., the Mark VII Destructinatorbot isn't going to keep you from trying to blow it up either.

I don't grant that premise. If you've got robots holding a position instead of humans, and the enemy takes the position, they've killed robots and not humans. Of course your opponents will try to destroy the robots; I'd rather have them destroy robots than spend the same amount of resources and time to destroy soldiers.
posted by MrVisible at 1:57 PM on April 27, 2013


I find it disturbing that some of the objections here are based on the fact that it might also ban landmines, when landmines are fucking evil, disgusting things that plague any place they're planted for decades to come, and Security Council members such as the US and Russia actively hinder attempts to get rid of them for good.
posted by mobunited at 1:57 PM on April 27, 2013 [13 favorites]




Of course your opponents will try to destroy the robots; I'd rather have them destroy robots than spend the same amount of resources and time to destroy soldiers.

And then we'll give up, because the robots couldn't get the job done? Of course not.

I'm not saying that no human lives will be saved. I'm saying that it's not a panacea, that your idea of "Let them send their robots, not our kids," won't ever actually happen. "Let them send their robots and fewer of our kids," sure. But the horrors of war somehow never manage to keep us from inflicting them, however much we try to make them less horrible, on the next generation.
posted by Etrigan at 2:03 PM on April 27, 2013


Ban the earworm now -- while it's still only an analog sound-based threat.

Eventually it's going to be instantiated in drone hardware---small enough to actually fly around, search and find you, crawl in to your ear and take over your mind.

"A-rum pum pum pum" y'all.
posted by hank at 2:05 PM on April 27, 2013


Must not have been important enough to enough people, then.

Woodrow Wilson won re-election based substantially on the slogan "He Kept Us Out Of War" in 1916. In 1917, America was sending troops to fight in WWI. Are you really prepared to argue that the American people really have a say in what wars we participate in?

In the case of George Bush, there was no indication when he was elected that he was going to engage in the Iraq war; he didn't run on that platform. By 2004, the decision had been made, and was irrevocable. And his opponent's position on the war was, shall we say, difficult to parse. Meanwhile, Bush was acting as if the war was just about over, that we just needed to do an eensy bit more clean-up; he'd already declared the mission accomplished, after all.

The American people were never asked if we wanted to go to war; our expressions of opposition to the war were ignored.

Or do you want to be the one to tell a parent that you're sending their son or daughter into combat because you think it's only fair; after all, the country voted for a leader who, as it turns out, wanted to go to war, even though there was no way to know that at the time of the election?
posted by MrVisible at 2:14 PM on April 27, 2013 [1 favorite]


I'm not saying that no human lives will be saved. I'm saying that it's not a panacea, that your idea of "Let them send their robots, not our kids," won't ever actually happen. "Let them send their robots and fewer of our kids," sure.

I'm okay with that. I know there aren't any 100% solutions when it comes to war.
posted by MrVisible at 2:18 PM on April 27, 2013


I'm all for a defensive web of drones protecting the country and drones delivering pizza and Amazon packages to my apartment balcony, but what good is it doing killing people on the other side of the world with drones? How is that improving our security? It is such obvious empire behaviour.
posted by Joe Chip at 2:26 PM on April 27, 2013 [2 favorites]


"I think we are already there. If you asked me to go and make an autonomous killer robot today, I could do it. I could have you one here in a few days," he told reporters.

Way to exaggerate your position for rhetorical value, AI researcher dude. The "autonomous killer robot" that you could deliver in a few days would be, essentially, the sentry gun from Aliens. You can make a gun that shoots at movement, and you can put it on a car or an aircraft. Big deal. A machine that shoots at everything is not very useful to the military, and is never going to be built or deployed for that reason.
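
And to be clear about how low that bar is: "shoots at movement" is a weekend of OpenCV plus some servo hardware. A rough sketch in Python (the camera index, thresholds and turret stub are all made up):

    import cv2

    def aim_turret(cx, cy):
        # stub: real servo control would go here
        print(f"tracking blob at {cx},{cy}")

    cap = cv2.VideoCapture(0)        # whatever camera is on hand
    prev = None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)
        if prev is None:
            prev = gray
            continue
        # anything that changed since the last frame counts as "movement"
        delta = cv2.threshold(cv2.absdiff(prev, gray), 25, 255, cv2.THRESH_BINARY)[1]
        prev = gray
        contours, _ = cv2.findContours(delta, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)   # OpenCV 4.x API
        movers = [c for c in contours if cv2.contourArea(c) > 500]
        if movers:
            x, y, w, h = cv2.boundingRect(max(movers, key=cv2.contourArea))
            aim_turret(x + w // 2, y + h // 2)   # point at the biggest mover

Note what's missing: any idea of what it's pointing at. A dog, a child, a plastic bag in the wind all read the same. That's the gap between this and anything militarily useful.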

AI is decades away from delivering anything close to the bogeymen imagined by these activists --- robots that are able to perform complex military missions without any human guidance.

Meanwhile, everything about high-tech weapons development is moving towards increased precision with decreased collateral damage. Assassination is an ugly thing, but if military conflicts of the future can be reduced to a series of high-profile assassinations, it's a net win for civilians.
posted by qxntpqbbbqxl at 2:26 PM on April 27, 2013 [3 favorites]


I find it disturbing that some of the objections here are based on the fact that it might also ban landmines, when landmines are fucking evil, disgusting things that plague any place they're planted for decades to come, and Security Council members such as the US and Russia actively hinder attempts to get rid of them for good.

Actually, it's not the objections that are based on that; it's the estimations of the likelihood of such a ban succeeding. If we haven't been able to ban even the simplest of autonomous weapons from the battlefield, then how likely does it seem that an international ban on more, shall we say, discriminatory killers will be put in place?

We can't ban landmines, so how can we expect to ban robots which can be programmed to behave according to the rules of engagement?
posted by MrVisible at 2:27 PM on April 27, 2013 [1 favorite]


Are you really prepared to argue that the American people really have a say in what wars we participate in?

The American people were never asked if we wanted to go to war; our expressions of opposition to the war were ignored.


For one thing, public opinion polling -- despite the millions of people who took to the streets -- was running much closer to 50/50 than people want to remember. The legislators who voted for it were doing so because of public opinion (or their perception of it), not because they were beholden to arms manufacturers or the sheer magnitude of the Presidency.

The American people do have a say -- the fact that it's not the say that you personally like doesn't mean it's not there.
posted by Etrigan at 2:30 PM on April 27, 2013 [1 favorite]


"Consistent with the anti-war sentiment of the protests, in the months leading up to the Iraq War, American public opinion heavily favored a diplomatic solution over immediate military intervention. A January 2003 CBS News/New York Times poll found that 63% of Americans wanted President Bush to find a diplomatic solution to the Iraq situation, compared with 31% who favored immediate military intervention."

Can you think of any way that the American people could have prevented Bush from waging the Iraq war at that point?

And my question still stands; does the way a nation votes justify sending its soldiers to war, if there are viable alternatives? Do we send soldiers instead of robots, because you believe that the way the nation votes means that the soldiers should suffer for it? You want the communities in America to have to deal with the devastation of loss, caring for the wounded, dealing with the PTSD, because... why? What purpose does it serve, if it can be, at least partially, avoided, to have these citizens suffer?
posted by MrVisible at 2:40 PM on April 27, 2013 [1 favorite]


But if you correlate 'autonomous robots' with land mines you maybe come closer to her point. The impact has been much, much more far-reaching than intended at the time of implementation. I will mine this field and stop my enemies. Ten years later that field can't be used because it's got unknown mines in it. Extrapolate that with robots - and that could be a series of motion-activated turrets fed from a central magazine - and suddenly you have a similar scenario: introduce flying robots or zeppelin-based weapons systems and...

Quibble about the definition of robot all you like, but letting a device out into the wild without thinking at all about its long-range ramifications seems like a reasonable thing to call attention to.
posted by From Bklyn at 2:45 PM on April 27, 2013 [4 favorites]


From the same secondary source:
Days before the March 20 invasion, a USA TODAY/CNN/Gallup Poll found support for the war was related to UN approval. Nearly six in 10 said they were ready for such an invasion "in the next week or two." But that support dropped off if the U.N. backing was not first obtained. If the U.N. Security Council were to reject a resolution paving the way for military action, 54% of Americans favored a U.S. invasion. And if the Bush administration did not seek a final Security Council vote, support for a war dropped to 47%.

An ABC News/Washington Post poll taken after the beginning of the war showed a 62% support for the war, lower than the 79% in favor at the beginning of the Persian Gulf War.
Can you think of any way that the American people could have prevented Bush from waging the Iraq war at that point?

I can think of many, but you're not really asking that, you're just saying, "I don't think it could have happened." We shall have to agree to disagree, I suppose.

And my question still stands; does the way a nation votes justify sending its soldiers to war, if there are viable alternatives?

I really don't get what you're asking here -- if a nation votes to send humans to war instead of robots, then yes, that justifies it. It would be stupid, but war is generally pretty stupid anyway.

You want the communities in America to have to deal with the devastation of loss, caring for the wounded, dealing with the PTSD, because... why?

You're going to want to check the ground before you start suggesting that I don't care about what happens to American service members in particular.
posted by Etrigan at 2:48 PM on April 27, 2013


I can think of many, but you're not really asking that, you're just saying, "I don't think it could have happened." We shall have to agree to disagree, I suppose.

No, I'm asking. If you have a way for a nation's people to stop their country from entering a war they disagree with, please share.

And, since we both care about the wellbeing of our servicepeople, then why not send robots in their place whenever possible? What does it matter how the nation voted, and whether our ideas on war were reflected by the actions of our governing body? What's wrong with sparing our soldiers the horrors of war whenever possible?
posted by MrVisible at 2:55 PM on April 27, 2013


It's interesting that almost every commenter so far thinks of themselves as the pitcher rather than the catcher.
posted by goat at 3:01 PM on April 27, 2013 [15 favorites]


I'm not saying I'm against the use of robots; I'm just saying that it's not going to make war any nicer nor much less bloody.
posted by Etrigan at 3:03 PM on April 27, 2013


I'll settle for slightly less bloody, gladly, any day.

It's interesting that almost every commenter so far thinks of themselves as the pitcher rather than the catcher.

Good point. For my part, I'd rather have to fight off a wave of invading robots than human beings. I could feel a lot better about disabling a robot than I would about killing a person. Even if they were invading my country, I'd still have to think about a person's family, their lovers, their friends learning about their death at my hands. If I managed to defeat a robot, my victory dance would be pure joy, unadulterated by moral concerns.

Being killed by a robot or a human being wouldn't make much difference to me.
posted by MrVisible at 3:12 PM on April 27, 2013 [1 favorite]


The Founders, each one of them, personally intended and expected the Constitution to protect every American's individual right to own Lazy Guns.
posted by snuffleupagus at 3:15 PM on April 27, 2013 [3 favorites]


What purpose does it serve, if it can be, at least partially, avoided, to have these citizens suffer?

Have you heard the theory that (American) football helmets contributed to the plague of brain damage in American footballers? According to this theory the problem is not the helmet per se, the problem is that having the safety features of the helmet gives people a cue to engage in riskier behavior.

Roughly the same model would apply here. It becomes a lot easier to casually justify a military presence somewhere (cough cough drones cough) when the downsides to the person on the street can only be discussed in very abstract terms.

If you have a way for a nation's people to stop their country from entering a war they disagree with, please share.

I'd be happy, for the moment, to prevent people from having the tools to absentmindedly start more wars than they already do (cough cough drones cough). And preventing occupations from having what I would imagine would be incredibly effective tools for suppression.

Robots shed culpability. If a robot shoots someone you can say "they knew it was past curfew, we told them that we'd programmed robots to shoot protestors, it's their own damn fault". If a robot malfunctions you say "well, it's a robot, they do that. Maybe we'll fire a programmer". Once you decide to deploy a robot, there's no (easily indicated) human culpable besides victims.
posted by tychotesla at 3:20 PM on April 27, 2013 [4 favorites]


If a robot shoots someone you can say "they knew it was past curfew, we told them that we'd programmed robots to shoot protestors, it's their own damn fault".

That situation happens now, with human beings on both ends of things. The soldier has orders to shoot; if they get caught disobeying orders, it's a court-martial. The civilian ends up dead anyway. Having a human being do the shooting seems inhumane under those circumstances, doesn't it?

Robots shed the unjust culpability of the soldiers on the ground, not the people who ordered them to be there.
posted by MrVisible at 3:25 PM on April 27, 2013


Roughly the same model would apply here. It becomes a lot easier to casually justify a military presence somewhere (cough cough drones cough) when the downsides to the person on the street can only be discussed in very abstract terms.

I'm really uncomfortable sending people to die to symbolize the difficulty of conflict to the general public. In fact, I'd bet that would be pretty morally reprehensible by any measure.
posted by MrVisible at 3:28 PM on April 27, 2013


On preview, Mr. Visible beat me to the punch, but I'll add:

If we're going by the law of war, the comparison would be with a soldier for whose conduct their superiors are answerable. Is the situation really that different with excesses and atrocities in the past few major wars? We still try and explain away accidental shootings by pointing out the dead failed to abide by publicized curfews, denied zones, etc. The chain of command still tends to atomize the blame. Maybe someone falls on their sword along the way. The dead are still dead.

If we're going by civil law (lowercase civil, i.e. torts under the common law) then at least on the conceptual level there would be good arguments for strict liability under products liability or else as an ultrahazard (like demolitions).

Mr. Visible's comment is perhaps darkly eponysterical, however, because what drones offer (beyond lower risk) that is harder to achieve with even a platoon of men is deniability.
posted by snuffleupagus at 3:32 PM on April 27, 2013 [1 favorite]




We don't expect our soldiers to make moral judgments in the field. We expect them to follow orders.

This is factually false. See Nuremberg trials.

In addition, today's American soldiers are, at least on paper, not obligated to follow an illegal order.
posted by drjimmy11 at 4:40 PM on April 27, 2013 [1 favorite]


Another side effect of the removal of human soldiers is: it eases the execution of military actions by people other than the military.

You will note most of Obama's drone war is executed by the CIA and takes place in Pakistan and Yemen, countries we are not at war with and that are in fact our allies. The more we labor under the myth of robotic murdering machines being "surgical," the more the line between war and political assassination blurs.

Even someone as amoral as Obama would probably not send a CIA agent to Yemen to personally shoot someone without trial, and tell him to not worry if he hits a few innocent women and children standing around too. But he'll send a drone and be proud of it.
posted by drjimmy11 at 4:46 PM on April 27, 2013 [6 favorites]


They've been kicking around the idea of replacing the UN peace-keeping troops on the Cote d'Ivoire-Liberian border with drones, when the troops are drawn down; apparently they've OKed them for monitoring the border between DRC and Rwanda.
posted by ChuraChura at 4:49 PM on April 27, 2013 [1 favorite]


Roombas are autonomous. I'm sure DARPA has an assault version.
posted by Mr. Yuck at 4:55 PM on April 27, 2013 [1 favorite]


I guess the hot new industry for any country that might find itself on the business end of a robot army is going to be robot-communications-hacking. Imagine if you could redirect a whole group of them to start shooting their own side, for example.

It does feel very strange to watch people argue about the reduced loss of life on our side thanks to the vastly increased efficiency of killing we'll see on the other side. Of all the things to be happy about...what am I supposed to do, cheer?

And of course there's the assumption that our government would hesitate to use this on unruly citizens. There is a hope, however small, that at least some cops or national guardsmen would hesitate to mow down a crowd of peaceful protesters. None whatsoever that an armed drone would.
posted by emjaybee at 5:19 PM on April 27, 2013 [6 favorites]


Man they are going to be pissed when they find out we tried to ban them before they even existed. This may even make them want to kill us.
posted by Ad hominem at 5:40 PM on April 27, 2013 [1 favorite]


Has anyone posted the Campaign's web site yet? StopKillerRobots.org

Amusingly, it has the Google site description: "A description for this result is not available because of this site's robots.txt – learn more."

They need some work on the philosophy. For instance they write:

"Allowing life or death decisions to be made by machines crosses a fundamental moral line. Autonomous robots would lack human judgment and the ability to understand context."

But there really isn't any such thing as an autonomous machine. It doesn't "decide"; it follows the instructions it was given. If this, then do that.

As is pointed out upthread, all kinds of things are already autonomous killing machines. Artillery shells, heat-seeking missiles, etc. etc.
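
The whole "decision", spelled out (names invented, obviously):

    def fire():
        print("bang")             # stand-in for the effector

    def sentry_tick(beam_broken: bool):
        if beam_broken:           # if this
            fire()                # then do that

    sentry_tick(True)

Everything fancier is just more elaborate conditions on the same if.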
posted by Jahaza at 5:41 PM on April 27, 2013 [3 favorites]


He should tackle anti-time travel next; the last thing we need is some nut giving Hitler the A-Bomb or telling Squanto his people would be MUCH better off if they slaughter every white face on sight.
posted by Renoroc at 6:25 PM on April 27, 2013


As is pointed out upthread, all kinds of things are already autonomous killing machines. Artillery shells, heat-seeking missiles, etc. etc.

Maybe it is a sliding scale, but there are certainly lines being crossed right now whose implications important people aren't giving much thought to. The biggest one of all is that any state that becomes the target of unmanned drones is immediately going to take up a huge grudge against you, and we are rapidly approaching a world where we need more friends, not fewer.
posted by JHarris at 6:45 PM on April 27, 2013 [1 favorite]


How can you make a rule that says a weapon is acting autonomously when controlled by electricity, but not when controlled by string?

It sure is a toughie, but I believe I actually have the ability to discern a hair's bit of difference between a piece of string attached to a bomb and a weaponized robot autonomously analyzing its environment, choosing targets, and mercilessly hunting them down using digital sensors, tens of millions of lines of code and billions of clock cycles per second.
posted by crayz at 7:08 PM on April 27, 2013


Amusingly, it has the Google site description: "A description for this result is not available because of this site's robots.txt – learn more."

They hate all robots, not just killers.
posted by crayz at 7:10 PM on April 27, 2013 [1 favorite]


This entire thread is likely the most depressing thing I've read all year.
posted by tapesonthefloor at 7:24 PM on April 27, 2013 [1 favorite]


How many lines of code are in a light beam sensor?
posted by Etrigan at 7:37 PM on April 27, 2013


Seems like a better way to save lives would be to ban the use of humans in warfare.
posted by condour75 at 7:45 PM on April 27, 2013 [7 favorites]


I think it's telling that this thread continually conflates drones (entirely human-controlled) with autonomous robotic killing machines.

Because we haven't been able to stop anyone from developing drones either, no matter what the arguments against them are. They exist; they're too much a part of modern warfare for any nation to agree to give them up. And robots will be the same way.

Clinging to the delusion that we could stop robots from becoming part of warfare, when we're already as far down the road as we are (have you seen what Boston Dynamics has been up to lately?), is incredibly futile, and if the forces opposed to the inhumane use of robots on the battlefield get hung up on trying to ban them, they're just going to fail. And they'll have missed this narrow window of opportunity to do anything about them.

Right now, we can put together a movement to specify the constraints that should be put on any artificial intelligences that are fielded in combat; what recordings need to be made, in what format, and how available they should be. What protocols, beyond the rules of warfare, should be programmed into them. What responsibilities their commanders have for their actions, and how they should be held accountable. What responses the government should take to the deaths of civilians in robotic warfare.

If we argue to ban robots from war, we sound like idiots. We're arguing to bring the moon down from the sky, we're voting to bell the cat. This thread has easily demonstrated how strong the arguments for robotic warfare can be, and how simple it is to bring it all back to protecting our troops, which is almost impossible to defeat. Human history, also, has shown that once a new war technology is possible and effective, it will be implemented. If we let the people who want to ban robots from combat lead this charge, it's doomed.

Instead, let's accept that it's going to happen, and that we have a unique opportunity to shape what the future is like once it does. Let's talk about the constraints that would make sense to impose on these robots; the surveillance that would be needed for accountability, and who'll be responsible for the inevitable tragedies once they're deployed. And, of course, what the rules should be for robots deployed domestically.

We could make it happen in a way that minimizes the loss of human life, and that makes combat robots less of the nightmare that's foreseen here.

Or we can try to ban them altogether, and be cast as hysterical ninnies who'd sacrifice our soldiers' lives for an indefensible principle. And we'd lose.

If you're going to fight, fight smart.
posted by MrVisible at 8:24 PM on April 27, 2013 [4 favorites]


Autonomy is a spectrum, not a binary value. Drones are semi-autonomous now, some more than others--with the Global Hawk, "The user merely hits the button for ‘take off’ and for ‘land’, while the UAV gets directions via GPS and reports back with a live feed." Without some pressure against it, it is almost inevitable that they will become more autonomous, and it's good to consider these questions now.

Tacocopter is a serious project--not about delivering tacos, but about making people think about the possibility of ubiquitous flying robots that could, among other things, possibly deliver tacos.
posted by jjwiseman at 10:06 PM on April 27, 2013 [1 favorite]


I think it's telling that this thread continually conflates drones (entirely human-controlled) with autonomous robotic killing machines.

Drones are not entirely human controlled. They are not remote-controlled planes like you build from a kit. Human operators give them high-level orders: Fly to this location, take pictures, return to base, land. The drone's software figures out how to make all of that happen, down to minute movements of control surfaces.
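
In sketch form, the operator's whole vocabulary can be those four verbs, with everything underneath them left to software. (All the names here are invented; a real autopilot stack is enormously bigger.)

    class Autopilot:
        """Stand-in for the inner control loops (PID on pitch/roll/yaw, etc.)."""
        home = (36.236, -115.034, 100)      # hypothetical home waypoint
        def set_waypoint(self, lat, lon, alt_m):
            print(f"navigating to {lat},{lon} at {alt_m} m")
        def trigger_camera(self):
            print("click")
        def engage_landing_sequence(self):
            print("flaring, touching down")

    class OperatorConsole:
        """The 'merely hits the button' layer."""
        def __init__(self, ap):
            self.ap = ap
        def fly_to(self, lat, lon, alt_m):
            self.ap.set_waypoint(lat, lon, alt_m)
        def take_pictures(self, n):
            for _ in range(n):
                self.ap.trigger_camera()
        def return_to_base(self):
            self.ap.set_waypoint(*self.ap.home)
        def land(self):
            self.ap.engage_landing_sequence()

    console = OperatorConsole(Autopilot())
    console.fly_to(34.5, 69.2, 6000)        # the entire "piloting" workload
    console.take_pictures(3)
    console.return_to_base()
    console.land()

Every control-surface movement beneath those four calls is already the machine's decision, not the pilot's.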
posted by qxntpqbbbqxl at 10:09 PM on April 27, 2013


Okay, substitute the word 'primarily' for 'entirely' in my post above.
posted by MrVisible at 10:17 PM on April 27, 2013


let's accept that it's going to happen, and that we have a unique opportunity to shape what the future is like once it does.
posted by flabdablet at 11:29 PM on April 27, 2013


Reading this is like listening to an alcoholic justify his drinking. In fact, all the discussions we have here about weapons and the military sound like addicts obsessing on their addictions.

America has zero moral authority to have any more weapons of any type. American foreign policy is simply one war crime after another. America started a war based on lies that killed hundreds of thousands of people for nothing - and what was the result? Nothing. No soul-searching - no one went to jail - no truth and reconciliation - no one was disgraced - no one even lost their job - indeed, America seems to be starting up exactly the same shit again with Iran, under a new administration, having clearly learned not one tiny fucking thing from killing vast numbers of people and setting a nation of millions back thirty years.

After that colossal clusterfuck I don't think the United States and its rulers should be trusted with a butter knife.

America doesn't need any new weapons because you are the ones who are the danger to everyone else. You have more than half the weapons in the entire world, and yet it's never enough - and whenever this is discussed, we always get a choice - "boots on the ground" or drones - the third choice, not killing people who have offered us no harm, is simply not an option.

Again, it's just like a drug addict - "Should I prostitute myself to buy cocaine or should I commit armed robbery?" The third option, "I should not buy cocaine," is simply never discussed because it's too impossible to even bring up.

And in fact, we aren't even getting that choice everyone's pretending we have. If the United States were really going to start to decommission their missiles and bombers, mothball their battleships, stop recruiting new soldiers - if drones were really going to start to replace conventional warfare, that would be something I could be enthusiastic about.

But that's not at all what we're getting. We're getting the drones and we're getting more conventional warfare too.

I should also add that the United States government is neither honest enough nor responsible enough to be allowed to use drone weapons. Even in the short time drones have been a real force, the Administration has simply lied through their teeth time and again to the American public about their use of drones (source) - and it's impossible for reporters to follow drones, so it's extremely hard for us to get any unbiased information. A country that brags about its complete lack of respect for international law and even for its own Constitution should not be allowed any new weapons, particularly weapons that allow it to kill suddenly and secretly half a world away - frankly, it's a terrible shame the United States is allowed to keep the weapons it has.

This very short story says it more clearly than I ever could.
posted by lupus_yonderboy at 11:44 PM on April 27, 2013 [3 favorites]


That's all very well, but we can't stop voting for the lizards; the wrong lizard might get in.
posted by flabdablet at 11:53 PM on April 27, 2013 [1 favorite]


Every necessary murder should be conducted by the person who decided it was necessary, using his thumbs, pressing slowly through the eyes of the victim while he begs for mercy and pleads for the sake of his wife and children. Every step away from that makes what is arguably a necessary evil more and more evil. Assigning others to kill and taking away their autonomy and ability to exercise mercy by threatening their lives or jobs or liberty makes war murder more evil. Removing yourself from the scene, so that you are insulated from the cries of pain and the repercussions -- through drones, missiles, or just the chain of command itself -- makes war murder more evil. Rigging up mines or trip wires that cannot distinguish children from soldiers makes war murder more evil. And designing robots that run programs which, depending on the outcome, result in a human dying or not, makes war murder more evil. It's all bad. Perhaps it's still necessary, who knows. I'll leave that debate to those who like thinking about what kind of murder or torture works best, and which circumstances call for which kind of death. But regardless of necessity, every decision we make to remove ourselves further from necessary evil makes that evil more so.
posted by chortly at 12:07 AM on April 28, 2013 [4 favorites]


It's also funny that some of those who are arguing "but what about land mines" seem unaware of what Jody Williams won her Nobel for.
posted by chortly at 12:10 AM on April 28, 2013 [2 favorites]


Imagine that America uses its insane defense budget to replace all soldiers with autonomous robot troops instead. I'd bet that other countries could be persuaded, as the cost falls, to replace their forces with robot soldiers as well. What's the point of sacrificing your people in a war where the opposing country has nothing but parts and bits to lose? Without the threat of human loss, would engaging in war against that country carry any real weight?

Imagine a future where all countries can afford armies composed just of autonomous killing machines. If the rules of war are followed, where only soldiers, and not civilians, are targeted, what real pressure could war carry in convincing a government to change its policies or concede land? It seems to me that war would become a situation similar to children kicking down each other's sand castles at the beach; no real harm is done, you can just build another.

In a world where wars are fought only with autonomous weapons, I can imagine two scenarios: either the use of fighting as a political tool ends, or the rules of war change to allow targeting civilians, so that violence can maintain its threat.
posted by Theiform at 1:04 AM on April 28, 2013 [1 favorite]


If the rules of war are followed, where only soldiers, and not civilians, are targeted

If I had some bread I could make a cheese sandwich if I had some cheese.
posted by flabdablet at 2:29 AM on April 28, 2013 [4 favorites]


If the rules of war are followed, where only soldiers, and not civilians, are targeted

No one has ever, in the history of humanity, fought a war only with, or only because of, soldiers. Wars are fought because Leader A wants to do something in Territory A. That may be "exploit the resources" or "tell the local populace what to do" or "keep Leader B from doing something in Territory A." Leader A wants to do whatever he wants to do so much that he is willing to kill for it. If Leader A is willing to do that, then Leader B has to do the same thing, or else he gives up. That is war. It will always be that way.

Ever seen Robot Jox? It's a bad movie, with many flaws, but its central one is the premise that humanity could ever "outlaw war" and use some kind of proxy mechanism (e.g., giant robots) instead. If that happened, "proxy war" would be so cheap that it would become common. Imagine that you get Saudi Arabia and all its oil, and all you have to do is say that you want it and win some sort of contest. Any rational Leader A would make territorial claim after territorial claim, over and over, because eventually he'll get lucky. Eventually, Leader B would say, "No, fuck that, you're being a dick. I'm not betting Saudi Arabia and another robot for the seventh time this year." Leader A says, "Well, then I'm just going to take it."

And then you have real war again.

Are autonomous robots good or bad? There are many arguments to be made. But the argument that they'll keep people out of harm's way is, at best, only half true.
posted by Etrigan at 5:33 AM on April 28, 2013 [3 favorites]


Ever seen Robot Jox?
Yes, and curse you for reminding me. I had managed to purge this from my memory almost entirely.

It's a bad movie, with many flaws
That's possibly a generous evaluation.

the possibility of ubiquitous flying robots that could, among other things, possibly deliver tacos.
Make this commonplace and, at least in the US context, the expectation of privacy against government intrusion into, and warrantless surveillance from, the airspace over your yard or your neighbor's yard may be seriously eroded, thanks to shifting expectations of privacy (or lack thereof) brought on by "general public use." If yard-buzzing taco robots are in general use, the public may have to accept similar cop robots (simplifying here, but that's the rub; here's NPR's take from this March). On the other hand, some of the recent search & seizure cases that have gone to the Supreme Court have produced different results and may offer different doctrinal framing (the oddball strict-construction approach to GPS in Jones, or this year's drug dog/curtilage case).

That's a subtext for the FAA rulemaking that threatens to swallow the aviation- and safety-focused aspects of the rulemaking itself. The ACLU is making an effort to get out in front of it.
posted by snuffleupagus at 6:55 AM on April 28, 2013


A robot probably isn't programmed for mercy, nor will it sometimes shoot warning shots, or complain to its commander afterward.

Unless you program them for that.


That always goes well.

DROP YOUR WEAPON. YOU HAVE TEN SECONDS TO COMPLY
posted by brundlefly at 9:08 AM on April 28, 2013 [2 favorites]


It's interesting that almost every commenter so far thinks of themselves as the pitcher rather than the catcher.

It kinda bemuses and depresses me that this is such a given. That, and the fact that only a few rich countries would have these, and they'd likely get treated as WMDs in some way, where only the elite few were officially allowed to have them. And then we'd be using our robots to blow up robot factories in have-not countries we didn't approve of.

Let's get real with this, though. You can buy an aerial drone on Alibaba for like 10 grand; I think you can even order it with the ability to carry weapons. It's not going to take very many years, once these things are out, for someone to make a robot that looks like a dog or some other kind of animal, that's also a bomb, and walk it into some city center.

Then what the fuck happens?

That's kinda something that's been skipped over here. Fuck fancy targeting and such; as soon as there's a good-enough-looking robot that can follow GPS and maybe track its position to stay on the sidewalk, a robot is the ultimate "suicide" bomber. A sort of walking cruise missile.

Shit, I could see this being the next 9/11 type situation
posted by emptythought at 11:39 AM on April 28, 2013 [1 favorite]


What makes us think autonomous weapons won't kill the "wrong" people, just as human-directed guns (and other weapons) already do? How do we prevent the "wrong" people/entities from acquiring, programming, and deploying assault rifles, I mean drones, I mean autonomous weapons? What if WE are the "wrong" people, or we are using the "wrong" weapons for the "wrong" reasons?

Trying to effect more ethical hardware, programming, or deployment requirements for ANY new weapon is much more effective early on than once they're in use.
posted by Dreidl at 5:17 PM on April 28, 2013 [2 favorites]


Dreidl, that's something I actually think about quite a lot when I'm absentmindedly daydreaming.

For instance, if you make it only smart enough to essentially work like units in an RTS game, where you tell it "go over here taking this route, and attack these things like this; defend yourself along the way but keep moving towards X" or "escort $THING from A to B, keeping XYZ distance and looking specifically for Y," then of course they'll make the wrong calls, unless they're given more specific information than we could probably ever have and are far better at recognizing things than the average person.
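To make that concrete, here's a hypothetical sketch of what such an order might look like (every field name here is invented, not from any real system):

# Hypothetical sketch of the "RTS unit" command style described above
# (all names invented). Everything the machine may do has to be spelled
# out in advance; everything else is a judgment call it is structurally
# unable to make.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class EngagementRules:
    may_return_fire: bool = True
    valid_target_classes: List[str] = field(
        default_factory=lambda: ["armored_vehicle"])
    max_route_deviation_m: float = 50.0  # "defend yourself, but keep moving"

@dataclass
class Order:
    verb: str                         # "MOVE_TO", "ESCORT", "PATROL"
    route: List[Tuple[float, float]]  # waypoints picked by the operator
    objective: str
    roe: EngagementRules = field(default_factory=EngagementRules)

# "Defend yourself along the way but keep moving towards X":
order = Order(
    verb="MOVE_TO",
    route=[(34.5, 69.2), (34.6, 69.1)],
    objective="reach checkpoint X",
)
print(order)  # anything not expressible in these fields is off-limits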

Now, if they're smart enough to make their own calls to some degree, then I don't see how they would be any less likely to make the wrong calls either.

Where this could get really interesting and sci-fi-ish is when a robot is programmed to make its own calls in a lot of situations, based on lists of rules and requirements. I'm talking about a machine that has about as much agency, in a practical sense, as a 1st grader: we tell it to go over here, do XYZ, and check in occasionally or if it runs into any trouble.
What if it's a really grey area, and its judgement call is different from what a specific human (or many humans) in charge, or just the general populace, would agree with?

This is all going to happen, and this is going to be seriously contentious shit over the next 100 years or so. Fuck, looking at technology 50 years ago vs. tech now... it wouldn't surprise me if in 50 years I'm having an argument about this with a robot.
posted by emptythought at 7:18 PM on April 28, 2013 [2 favorites]


The Day of the Drones
posted by homunculus at 12:00 PM on April 29, 2013


how drones date online
posted by homunculus at 12:02 PM on April 29, 2013


Poor hydrogen-drone! I'll take one, for free.

The Atlantic: Reports Of Congressional Drone Oversight Are Greatly Exaggerated
posted by the man of twists and turns at 1:04 PM on May 1, 2013 [1 favorite]


Poor hydrogen-drone! I'll take one, for free.

You couldn't ask for a better friend.
posted by homunculus at 1:35 PM on May 1, 2013


Lawfare: Autonomous Weapons Systems: Recent Events and a Response and A Policy Paper on Autonomous Weapon Systems. The paper is at the Hoover Institution: Law and Ethics for Autonomous Weapon Systems: Why a Ban Won’t Work and How the Laws of War Can
The incremental development and deployment of autonomous weapon systems is inevitable, and any attempt at a global ban will be ineffective in stopping their use by the states whose acquisition of such weaponry would be most dangerous. Autonomous weapon systems are not inherently unlawful or unethical. Existing legal norms are sufficiently robust to enable us to address the new challenges raised by robotic systems. The best way to adapt existing norms to deal with these new technologies is a combined and international-national dialogue designed to foster common standards and spread best practices. …

Some view the emergence of automated and autonomous weapon systems as a crisis for the law and ethics of war. To the contrary, provided we start now to incorporate legal and ethical norms adapted to weapons that incorporate emerging technologies of automation, the incremental movement from automation to machine autonomy can be both regulated and made to serve the ends of law on the battlefield.
posted by the man of twists and turns at 2:21 PM on May 6, 2013


That DroneShield thing is pretty silly. It's just really unlikely that it's going to be able to pick up the sound of some little electric motors over background noise at a distance of more than some tens of feet.

And DIY droners often build their own platforms for fun, using dozens of different types of electric motors in a market that changes quickly, so I don't see the acoustic identification aspect working too well, either.
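For the curious, here's the shape of the detection problem in toy form (made-up numbers and thresholds; this is not DroneShield's actual method):

# Toy acoustic-signature detector (invented numbers; not DroneShield's
# real algorithm). A small electric motor shows up as a narrow spectral
# peak at its rotation/blade-pass frequency; the detector has to find
# that peak above the noise floor. Every doubling of distance costs
# roughly 6 dB of signal while the city noise floor stays put.

import numpy as np

RATE = 44100      # samples per second
MOTOR_HZ = 180.0  # hypothetical tone from one motor

def looks_like_drone(samples: np.ndarray, snr_needed: float = 4.0) -> bool:
    mags = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), 1.0 / RATE)
    band = (freqs > MOTOR_HZ - 5) & (freqs < MOTOR_HZ + 5)
    return mags[band].max() / np.median(mags) > snr_needed

t = np.arange(RATE) / RATE
noise = np.random.default_rng(0).normal(0.0, 1.0, RATE)  # street noise
for amp in (0.05, 0.01):  # "nearby" vs. a few distance-doublings away
    tone = amp * np.sin(2 * np.pi * MOTOR_HZ * t)
    print(amp, looks_like_drone(tone + noise))
# With these numbers the louder tone should clear the threshold while
# the quieter one drowns in the noise floor -- and that's before
# accounting for motor-to-motor variation across DIY platforms.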
posted by jjwiseman at 4:52 PM on May 10, 2013 [1 favorite]


Robot Guns, Combat Facebook: The Tech of the Army’s ‘Last’ Afghanistan Brigade with special guest COSFPS "Kraken", a "mostly autonomous" wired collection of sensors and remote-actuated machine guns.
posted by the man of twists and turns at 3:32 PM on May 20, 2013


With the new developments, I think there needs to be a new post. But I used mine up!
posted by the man of twists and turns at 11:15 PM on May 25, 2013


This thread has been archived and is closed to new comments