Terminator: the Documentary
November 27, 2012 10:39 AM   Subscribe

A report was recently released suggesting a pre-emptive ban on fully autonomous weapons - robots that can pick and choose whom to fire on. If this sounds vaguely alarming to you, don't worry - the Department of Defense issued a directive indicating that fully autonomous robots may only decide to tase you.
posted by wolfdreams01 (92 comments total) 11 users marked this as a favorite
 


Oh, that's a real comfort. I was getting worried for a second there.

(Too depressed about this stuff to make a substantial comment, sorry.)
posted by RedOrGreen at 10:46 AM on November 27, 2012


This mission is too important for me to allow you to jeopardize it.
posted by srboisvert at 10:46 AM on November 27, 2012 [6 favorites]


I was hoping for a future that was more Asimov and less Terminator
posted by I am the Walrus at 10:47 AM on November 27, 2012 [9 favorites]


Please put down your weapon. You have 20 seconds to comply.
posted by He Is Only The Imposter at 10:47 AM on November 27, 2012 [14 favorites]


All those hours of playing as a Spy in TF2 are about to pay off.
posted by hellojed at 10:49 AM on November 27, 2012 [5 favorites]


Puts a whole new meaning on the term 'kill switch'.
posted by RolandOfEld at 10:51 AM on November 27, 2012


Soldiers and criminals will just start using EMPs and the rest of us will get shot when the robot segfaults or is rooted by 14-year-olds.
posted by Foci for Analysis at 10:52 AM on November 27, 2012 [4 favorites]


The Future is Tomorrow.
posted by chavenet at 10:52 AM on November 27, 2012


"The lawyers tell me that there are no prohibitions
against robots making life or death decisions,"
Mr. Johnson, the weapons inventor
from Tennessee
posted by adipocere at 10:52 AM on November 27, 2012 [1 favorite]


a pre-emptive ban on fully autonomous weapons

LEG-I-SLATE! LEG-I-SLAAAAAAAATE!
posted by Mr. Bad Example at 10:53 AM on November 27, 2012 [15 favorites]


Can I preempt the inevitable pointer to P. W. Singer's totally apposite and yet completely asinine Wired for War by saying that this was one of the worst-written books on an excellent subject that I've ever read? It is a somehow mindless book, a series of cliches with no underlying thesis. It has probably had the effect of temporarily killing off useful inquiries into this subject as well, since it got a lot of undeserved publicity. Anyone else read it and hate it as much as I did? Or perhaps find it useful, by contrast?
posted by jackbrown at 10:53 AM on November 27, 2012 [1 favorite]


Don't tase me, 'bot!!
posted by anewnadir at 10:54 AM on November 27, 2012 [10 favorites]


I remember reading Asimov's I, Robot maybe a decade ago and thinking how naive it sounded from the perspective of the first decade of the 21st century.

In the book the rules for robotics were put in place for political reasons because people were supposedly afraid of robots. But they were presented as having been good ideas anyway.

Of course he totally misjudged human nature. If history shows us anything it's that rather than apply caution in the first place, we tend to jump head-first into everything and only pull back slightly when horrible consequences ensue.
posted by delmoi at 10:54 AM on November 27, 2012 [6 favorites]


Also, the interesting thing is that unlike nuclear weapons, which require an enormous amount of industrial effort and can't be built by one person, people can and do build robots in their garages all the time. And it wouldn't be difficult to take a regular robot and strap a gun, or a knife, onto it and reprogram it into a killbot. The only difference would be the software.
posted by delmoi at 10:56 AM on November 27, 2012 [1 favorite]


a pre-emptive ban on fully autonomous weapons

From my cold, lifeless hands!

wait.
posted by zamboni at 10:57 AM on November 27, 2012 [9 favorites]


so how does a fully autonomous lethal robot differ from a landmine? Only way I can tell is that the former is mobile.

Seems like this is mostly a matter of who's signed the Ottawa treaty.
posted by straw at 11:00 AM on November 27, 2012 [3 favorites]


The only difference would be the software.

Nope, same software would probably work. Hardware swap to a real pistol wouldn't be all that complicated either. Downright trivial if you went with a small enough caliber.
posted by RolandOfEld at 11:00 AM on November 27, 2012


i, for one, welcome our new robotic overlords. i'd like to remind them that as a trusted internet personality, I could be helpful in rounding up others to toil in their underground hangars.
posted by entropicamericana at 11:00 AM on November 27, 2012 [3 favorites]


Number Five is, for all intents and purposes, dead.

Someone get me my Huey.
posted by inturnaround at 11:02 AM on November 27, 2012 [1 favorite]


Jackbrown, I'm with you. As a roboticist (in the defence industry no less) I hate reading Singer's stuff.
posted by olinerd at 11:02 AM on November 27, 2012


Such a ban could not be pre-emptive because such weapons have already been deployed in anger. I'll just say what I said in the FPP that article was linked in:
Ultimately, if such weapons are going to be deployed, I believe there must be an additional protocol adopted into the CCW covering their use. The notional Protocol VI on Autonomous Weapons should include a warning symbol indicating its presence and a visual and audio signal to indicate it is about to use deadly force, e.g. a very loud horn or bell along with a flashing light. Machines do not need the element of surprise in order to stay alive. The purpose of a weapon such as this is area denial and channelization. If it kills no one, it still did its job. An autonomous turret must give potential enemy targets every opportunity to retreat before firing its weapon. Otherwise it's no better than a mine.
posted by ob1quixote at 11:03 AM on November 27, 2012 [1 favorite]
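The engagement sequence proposed above - announce presence, give the contact time to withdraw, and only then treat it as engageable - can be sketched as a small loop. This is a rough Python sketch; the function names, tick-based timing, and 20-tick grace period are all illustrative, not drawn from any real system:

```python
WARNING_PERIOD_TICKS = 20  # grace period (in sensor polls) for the target to withdraw

def sound_horn_and_flash_light():
    """Stand-in for the loud audible and visual warning the protocol calls for."""
    pass

def engage(retreating_readings):
    """Decide 'stand down' vs. 'fire' only after an explicit warning period.

    retreating_readings yields True once the contact is observed withdrawing.
    """
    sound_horn_and_flash_light()  # make the turret's presence unmistakable
    for tick, retreating in enumerate(retreating_readings):
        if retreating:
            return "stand down"   # area denial achieved without a shot
        if tick + 1 >= WARNING_PERIOD_TICKS:
            break
    return "fire"
```

Note the asymmetry: a contact that withdraws at any point during the warning period is never fired on, which is exactly what would distinguish such a turret from a mine.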


My report suggests the drones should be empowered to urinate on ordinary people, from extreme altitude. Let's take this out of the realm of metaphor.
posted by iotic at 11:04 AM on November 27, 2012 [1 favorite]


Such a ban could not be pre-emptive because such weapons have already been deployed in anger.

That article doesn't describe automated weapons, does it? It's hard to say for sure because no one's really going to talk about what they have in precise terms, but the description there looks like stationary, ground-based drones operated by humans rather than truly automated weapons.

Drones have their problems, but they do keep humans as part of the decision-making process; automated weapons don't do that.
posted by Bulgaroktonos at 11:10 AM on November 27, 2012 [1 favorite]


ob1quixote: from your link: “The robots, while having the capability of automatic surveillance, cannot automatically fire at detected foreign objects or figures.”

There's still a human decision in the chain of command with these Korean robotic sentries. The robot requires permission to fire. The HRW report is really concerned with preempting entirely autonomous decision-making.

This is an excellent debate that they are starting here, although the outcome is (I hope) foreordained.

Olinerd: as someone who works in the industry, is this something you guys talk about all the time? And am I right that there's not much chance of the human decision being taken out of the equation?

Straw: the comparison with a landmine is an interesting starting point for thinking about this issue. Autonomous robots are just smarter mines! But of course we've banned mines already...
posted by jackbrown at 11:12 AM on November 27, 2012


Another thing for the Centre for Existential Risk to research.
posted by Wordshore at 11:14 AM on November 27, 2012 [1 favorite]


A robot may not, through inaction, allow a bro to be tased
posted by East Manitoba Regional Junior Kabaddi Champion '94 at 11:21 AM on November 27, 2012 [20 favorites]


Like chemical weapons, one has to wonder what atrocities these weapons will cause before they get proscribed. Except that these devices will be so profitable that this process will likely be undermined by the US government, through the arms manufacturers that have undue control over it. Guaranteed military sales. Renovation program. Spare parts for 25 years, etc.
posted by Blazecock Pileon at 11:30 AM on November 27, 2012 [2 favorites]


But of course we've banned mines already...

Please examine the list of countries that have not signed the landmine treaty. It may alter your definition of "we."
posted by Kirth Gerson at 11:41 AM on November 27, 2012


Someone get me my Huey.

I thought they were called choppers?
posted by cosmic.osmo at 11:42 AM on November 27, 2012 [2 favorites]


I think fully autonomous firing by drones at people would be a mistake. I could see it used like the Phalanx anti-missile gun, which fires autonomously at incoming missiles directed at ships.

But as an anti-personnel weapon it makes little sense, especially when we can have operators with ease.

I see no moral problems with drones per se. Arguments against drones that revolve around the fact of the user not being exposed to fire make zero sense to me. It's like outlawing flak jackets or, more accurately, bows and arrows, which also allow action at a distance.

Any job an F-15 can do a drone can do better, cheaper and safer, even for those around the target. Its long loiter time and slow speed allow the operator to be more certain and use a smaller munition, creating fewer casualties.
posted by Ironmouth at 11:44 AM on November 27, 2012 [1 favorite]


Bulgaroktonos: That article doesn't describe automated weapons, does it?
The publication of the Human Rights Watch article linked in this FPP has muddied the waters somewhat, so I'm looking for a proper reference. However, at the time the Samsung SGR-A1 was discussed in 2010, I recall coming to the conclusion based on something I read that the unit has a fully autonomous mode.

I note that the patent seems to indicate that the unit has an "unmanned automatic firing" mode. Perhaps I mistook "unmanned automatic firing" for "fully autonomous." I'll keep looking.
posted by ob1quixote at 11:44 AM on November 27, 2012


Like chemical weapons, one has to wonder what atrocities these weapons will cause before they get proscribed.

What is the difference between a drone and a fighter-bomber? They deliver similar payloads from the air. It's a bomber with the pilot somewhere else.
posted by Ironmouth at 11:46 AM on November 27, 2012 [1 favorite]


The purpose of a weapon such as this is area denial and channelization. If it kills no one, it still did its job. An autonomous turret must give potential enemy targets every opportunity to retreat before firing its weapon. Otherwise it's no better than a mine.

-article


This was a special bomb, one issued to each of us for this mission with instructions to use them if we found ways to make them effective. The squawking I heard as I threw it was the bomb shouting in skinny talk (free translation): "I'm a thirty second bomb! I'm a thirty second bomb! Twenty-nine! Twenty-eight! Twenty-seven!..."

-Starship Troopers
posted by Splunge at 11:49 AM on November 27, 2012 [2 favorites]


The article I was thinking of was linked in the previous Samsung SGR-A1 post.

A Robotic Sentry For Korea's Demilitarized Zone, Jean Kumagai, IEEE Spectrum, March 2007
The Samsung robot packs a 5-millimeter, Korean-made light machine gun. Should it detect an intruder, “the ultimate decision about shooting should be made by a human, not the robot,” says Yoo, who led the team that designed the robot. But the robot does have an automatic mode, in which it can make the decision.
posted by ob1quixote at 11:53 AM on November 27, 2012


Blazecock Pileon: Like chemical weapons, one has to wonder what atrocities these weapons will cause before they get proscribed.

I doubt they'll be proscribed in our lifetime. Not in any country that matters (by which I mean any country that is likely to deploy them).

Ironmouth: Any job an F-15 can do a drone can do better, cheaper and safer, even for those around the target. Its long loiter time and slow speed allow the operator to be more certain and use a smaller munition, creating fewer casualties.

Well.. Better to say any job an F-15E can do one or two dozen drones can do.. Maybe. I mean, an F-15E can carry 23,000lbs of weapons. Now raw capacity really isn't the deal nowadays, but still..
posted by Chuckles at 12:04 PM on November 27, 2012


Governments should pre-emptively ban fully autonomous weapons because of the danger they pose to civilians in armed conflict

By that argument we should also ban human soldiers from combat zones. Autonomous weapons might actually be less bad. Robots don't get angry, hateful or jumpy. Oddly, robots might be less likely than humans to dehumanize enemy civilians.
posted by justsomebodythatyouusedtoknow at 12:06 PM on November 27, 2012 [2 favorites]


Olinerd: as someone who works in the industry, is this something you guys talk about all the time? And am I right that there's not much chance of the human decision being taken out of the equation?

Well for perspective, I make underwater robots that are not weaponized. But the military folks that use our vehicles still want a human in the loop to interpret data prior to any engagement with what the equipment finds (in our case, usually underwater mines). So given that they're not even willing to give up that control, I have a hard time believing that anyone seriously wants the software firing weapons based on autonomously interpreted data. I mean, maybe some cowboys in a room somewhere do, but the day-to-day folks don't want it.
posted by olinerd at 12:11 PM on November 27, 2012


Be afraid. I'm hoping that by the time this trickles down to me, and my janitorbot finally tosses the mop aside in anger and picks up an automatic weapon, I will be too old to care. Or just old enough to taunt the fucker a little bit before it shoots me.
posted by IvoShandor at 12:13 PM on November 27, 2012


Actually, I'm old and bitter enough to taunt it now. Bring it on, janitorbots!
posted by IvoShandor at 12:14 PM on November 27, 2012 [2 favorites]


By that argument we should also ban human soldiers from combat zones. Autonomous weapons might actually be less bad. Robots don't get angry, hateful or jumpy. Oddly, robots might be less likely than humans to dehumanize enemy civilians.

This is assuming that the robots can selectively target and kill the enemy soldiers. One fear is that automated weapons will target and kill everything (either by design or by malfunction) whereas human soldiers must make an affirmative decision before using any kind of weapon.
posted by Bulgaroktonos at 12:15 PM on November 27, 2012 [1 favorite]


I believe the degrees of autonomy are:

1) Operator-controlled (e.g. a drone)
2) Operator permission requested to open fire (the South Korean sentry bots)
3) Autonomous targeting with operator override allowed to prevent fire (Coming Soon!!!)
4) Full Autonomy (See your local movie theater for examples)
posted by wolfdreams01 at 12:15 PM on November 27, 2012 [1 favorite]
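The four degrees listed above map naturally onto an ordered enum with a human-in-the-loop gate. A hypothetical Python sketch - the level names and the `may_fire` gate are illustrative labels, not any real system's:

```python
from enum import IntEnum

class Autonomy(IntEnum):
    # The four degrees listed above, as a rough ordering (illustrative only).
    OPERATOR_CONTROLLED = 1   # human flies it and pulls the trigger
    PERMISSION_REQUIRED = 2   # robot asks, human authorizes each shot
    OPERATOR_OVERRIDE = 3     # robot targets, human can veto before it fires
    FULLY_AUTONOMOUS = 4      # no human in the loop

def may_fire(level, human_authorized, human_vetoed=False):
    """Human-in-the-loop gate: everything below level 4 involves a human somewhere."""
    if level <= Autonomy.PERMISSION_REQUIRED:
        return human_authorized       # levels 1-2: a human must say yes
    if level == Autonomy.OPERATOR_OVERRIDE:
        return not human_vetoed       # level 3: a human can only say no
    return True                       # level 4: the decision is entirely the machine's
```

The interesting jump is between levels 3 and 4: up to level 3 the default can still be "don't fire" unless a human is watching; at level 4 there is no one left to ask.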


people can and do build robots in their garages all the time. And it wouldn't be difficult to take a regular robot and strap a gun, or a knife, onto it and reprogram it into a killbot.

Technically, these actually predate automation: Mantraps. There's also Katko v. Briney.
posted by dhartung at 12:16 PM on November 27, 2012


From that Mantraps link:

Since 1827, they have been illegal in England, except in houses between sunset and sunrise as a defence against burglars.

Whaaaa!
posted by RolandOfEld at 12:19 PM on November 27, 2012


Note that in terms of underwater autonomous weapons, the U.S. has had these for decades, in the form of advanced naval mines. For example, the Mark 60 CAPTOR is a naval mine that, when deployed, anchors itself to the ocean floor, listens for submarines, and, if a sub is detected, makes its own decision as to whether or not to fire a torpedo at the sub.
posted by RichardP at 12:22 PM on November 27, 2012 [1 favorite]
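The on-board decision loop of a CAPTOR-style mine - listen, classify, then fire or stay dormant, with no human in the loop - can be caricatured in a few lines. The signature matching here is drastically simplified and purely illustrative; real acoustic classification is far more involved:

```python
def captor_style_decision(acoustic_signature, friendly_signatures, hostile_signatures):
    """Toy decision loop for a CAPTOR-style encapsulated-torpedo mine.

    The point is only that the fire decision is made on-board, by the
    weapon itself, with no human anywhere in the chain.
    """
    if acoustic_signature in friendly_signatures:
        return "ignore"            # surface ships / friendly subs pass unharmed
    if acoustic_signature in hostile_signatures:
        return "release torpedo"   # the mine's own decision, per its programming
    return "keep listening"        # ambiguous contact: stay dormant
```

Structurally this is exactly the landmine-vs-autonomous-weapon comparison made earlier in the thread, just with a classifier in front of the trigger.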


Re: "Don't tase me bro!"

Yes, so hilarious, the use of the word "bro." Much more reasonable to laugh about that than to be disgusted that someone got electrocuted for trying to ask a question of an elected official.
posted by eurypteris at 12:25 PM on November 27, 2012 [6 favorites]


When killbots are outlawed, only those outside the law will have killbots. Don't you feel better now?
posted by seanmpuckett at 12:42 PM on November 27, 2012




If people dont sign voluntarily we could always have a Butlerian jihad!
posted by Mister_A at 12:55 PM on November 27, 2012 [5 favorites]


Disclosure: I am in the mentat business.
posted by Mister_A at 12:55 PM on November 27, 2012 [1 favorite]






Much more reasonable to laugh about that than to be disgusted that someone got electrocuted for trying to ask a question of an elected official.

I am disgusted that someone would joke about an incident involving an unruly idiot and would also like to say that America is worse than Hitler, if Hitler were a country.
posted by Behemoth at 12:59 PM on November 27, 2012 [2 favorites]


Mantraps.

Since 1827, they have been illegal in England except in houses between sunset and sunrise as a defence against burglars.


Thankfully, since the landmark case of McCallister v. Wet Bandits, America has no restrictions on the use of mantraps. USA! USA!
posted by dubold at 1:06 PM on November 27, 2012 [2 favorites]


The one hard-and-fast rule in my house is that my son may not attach a laser to a robot under any circumstances.
posted by wintermind at 1:14 PM on November 27, 2012 [2 favorites]


since the landmark case of McCallister v. Wet Bandits

I hate you for making me google that.
posted by RolandOfEld at 1:17 PM on November 27, 2012 [2 favorites]


Pentagon: A Human Will Always Decide When a Robot Kills You

Once again, life imitates Onion.
posted by eurypteris at 1:20 PM on November 27, 2012 [1 favorite]


What is the difference between a drone and a fighter-bomber?

The first is very profitable for Lockheed Martin and is easily used for extralegal executions that do not attract much public attention, and the other requires a much more open, long-term, committed decision by the public to apply its use in wartime.
posted by Blazecock Pileon at 1:23 PM on November 27, 2012


MetaFilter: Decide When a Robot Kills You.
posted by Mister_A at 1:31 PM on November 27, 2012


It's interesting to compare this issue to something like automated cars. That is, fully automated self-steering cars (of the kind Google are testing) would be a huge boon to us--they'd kill fabulously fewer people than are killed every day by human drivers and they'd bring about an immense reduction in overall gas consumption. But we will be infinitely more disturbed and panicked by the rare occasions when they do fuck it up than we are by the normal human-caused crashes, because we won't be able to tell intuitively comprehensible stories to ourselves about how they fucked it up and why we, personally, wouldn't do something like that.

Similarly, I would be willing to bet that autonomous robotic cops/soldiers will cause staggeringly fewer casualties than human cops/soldiers do. Robots aren't going to be susceptible to contagious fire episodes, for example, or to unconscious racial profiling or to a host of other cognitive errors that cause humans to use unnecessary force in nonthreatening situations over and over again. But we are going to be MUCH more troubled by a robot tasing someone without a human being involved than we are by a human using lethal force. Because we don't like to think about these decisions being made by the fragile, error-prone, panicky people who actually make them--we handwave away cases like that as simply aberrant. We like to imagine them being made by ideal, Solomonic individuals who will weigh the situation carefully and make reasoned, humane and defensible decisions.
posted by yoink at 1:33 PM on November 27, 2012 [9 favorites]


I do think it's important here to distinguish between "automatic" and "autonomous".

There are a lot of weapons in the world that "automatically" activate or fire. Most "autonomous" vehicles today are really automatic.

True autonomy would imply that a system could choose NOT to do something even if the if-then statement meant to initiate an action tested true. Just like a human can disobey an order.

Up until that very last point on the autonomy spectrum, you can usually still somehow blame a human -- an operator, a pilot, a programmer -- for a "wrong" decision being made. In a fully autonomous (not necessarily intelligent, but autonomous) system, there would be no human to hold accountable.
posted by olinerd at 1:39 PM on November 27, 2012


Cool Drones
posted by homunculus at 1:47 PM on November 27, 2012


people can and do build robots in their garages all the time.

I just came from a City meeting with a partner who uttered the phrase "it's the quintessential guys building robots in their garages..."

I was abruptly pulled out of my stupor. We have totally jumped the shark as a culture that has any sense of its technology.


This was a meeting about the Arts.
posted by Reasonably Everything Happens at 1:51 PM on November 27, 2012


yoink is right -- there will be security robots that apply lethal force in the future. There will be outrage when they malfunction, but on the whole they will be better than humans.

Right now the Pentagon may have a rule that says robots can only apply non-lethal force. But what would be the rationale for that rule in 20 years when robots are better than humans at judging a situation?
posted by Triplanetary at 1:59 PM on November 27, 2012


entropicamericana: "i, for one, welcome our new robotic overlords. i'd like to remind them that as a trusted internet personality, I could be helpful in rounding up others to toil in their underground hangars."

Trusted, but are you verified, e-citizen?
posted by symbioid at 2:25 PM on November 27, 2012


Any job an F-15 can do a drone can do better, cheaper and safer, even for those around the target. Its long loiter time and slow speed allow the operator to be more certain and use a smaller munition, creating fewer casualties.

For all your arguments in favor of the legal aspects of drones, you sure don't know much about the tactical uses and limitations of current drones. Having the luxury of drone loiter times is the result of first establishing air superiority by manned flights.


I am certainly aware of the fact that in a contested theater of war, one would need air superiority as a prerequisite. However, we are not using these weapons in any contested air space. The governments of Afghanistan, Pakistan and Yemen allow us the overflight rights already. Indeed, the drone strikes in each country are staged from inside the respective countries.

In countries where we are using unarmed drones for surveillance purposes, these drones are occasionally shot down and in one case captured via cyber attack. But these are not being used as anti-personnel weapons at present.

In the end, I don't see how the question of whether the use of a drone is proper or if a drone should have the autonomous capability to open fire on human beings is affected by the need to have manned fighter aircraft establish air superiority first. Even if we were using them over a hostile nation for anti-personnel purposes and needed air superiority or even air supremacy to use them, the moral question is still the same: What is the moral difference between a ground strike by a manned aircraft and a drone? I see none.

I do think we should not use autonomous aircraft as an anti-personnel weapons system.
posted by Ironmouth at 2:27 PM on November 27, 2012


From the "vaguely alarming" link:

Operator control units are available that allow semi-autonomous map-based control of a team of robots ... There has also been significant research in the game theory community involving pursuit/evasion scenarios.

Does that mean that those endless hours spent in Doom avoiding Cacodemon might be used for sinister purposes? Time to tell my dad that I wasn't wasting my life.
posted by rtimmel at 2:31 PM on November 27, 2012 [1 favorite]


I see a lucrative future for producers of Faraday cages...
posted by JoeXIII007 at 2:41 PM on November 27, 2012 [1 favorite]


1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
posted by radiosilents at 3:12 PM on November 27, 2012




radiosilents: "1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
"

Actually you forgot the Zeroth Law.

The Zeroth Law really puts everything into perspective, adding a new level of consideration and calculation; within this framework, every thought, word, and action for robot-kind needs exquisite justification.
posted by Splunge at 6:32 PM on November 27, 2012
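The Laws, with the Zeroth prepended, amount to an ordered list of veto constraints checked highest-priority first. A hypothetical Python encoding - reducing each Law to a boolean flag on an action is, of course, an enormous simplification of what Asimov's robots actually have to compute:

```python
# Each Law is a predicate over a proposed action; the first violated Law vetoes it.
LAWS = [
    ("Zeroth", lambda a: not a.get("harms_humanity", False)),
    ("First",  lambda a: not a.get("harms_human", False)),
    ("Second", lambda a: not a.get("disobeys_order", False)),
    ("Third",  lambda a: not a.get("endangers_self", False)),
]

def evaluate(action):
    """Return the name of the first (highest-priority) violated Law, else None."""
    for name, permitted in LAWS:
        if not permitted(action):
            return name
    return None
```

The ordering is the whole point: an action that harms a human but serves humanity is judged by the Zeroth Law first, which is exactly the "new level of consideration" the comment above describes.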


Leela: "They say Zapp Brannigan single-handedly saved the Octillion system from a horde of rampaging killbots!"
Fry: "Wow."
Bender: "A grim day for Robotkind. Eh, but we can always build more killbots."
(Later)
Fry: "I heard one time you single-handedly defeated a horde of rampaging somethings in the something something system"
Brannigan: "Killbots? A trifle. It was simply a matter of outsmarting them."
Fry: "Wow, I never would've thought of that."
Brannigan: "You see, killbots have a preset kill limit. Knowing their weakness, I sent wave after wave of my own men at them until they reached their limit and shut down."
posted by 445supermag at 6:51 PM on November 27, 2012


Frankly, I'm ashamed that MetaFilter--which I'd previously seen as the epitome of the cooperative, positive, can-do side of the internet--is taking such a defeatist, pessimistic attitude toward the subject. I think that, in the best tradition of the blue, we should be employing a more proactive approach, which is to figure out how we can best and most quickly upload our brains to positronic matrices in order to join the inevitable winning side.
posted by Halloween Jack at 7:01 PM on November 27, 2012 [3 favorites]


I don't hate you.
posted by benzenedream at 9:17 PM on November 27, 2012


Moral Machines
posted by the man of twists and turns at 2:32 AM on November 28, 2012


Related: Medea Benjamin's (codepink founder) recent book on drones: link
posted by Noisy Pink Bubbles at 6:32 AM on November 28, 2012


jackbrown: "Can I preempt the inevitable pointer to P. W. Singer's totally apposite and yet completely asinine Wired for War [...] "

Dangit. I had that on my list of things to read one day. Can you suggest something better for me to read?
posted by bleary at 8:12 AM on November 28, 2012


Seems the plot of the new Call of Duty is tangentially related, at least the way the ever-trusty Zero Punctuation puts it.
posted by RolandOfEld at 9:13 AM on November 28, 2012


Oh man, that link I just posted actually went on a bit of a rant on [white] [[christian]] American privilege and how we view our military toys/video game perspectives that was pretty great in a 'snark as an artform' kinda way.
posted by RolandOfEld at 9:17 AM on November 28, 2012




Are Escaped Zoo Animals Autonomous?
posted by homunculus at 12:47 PM on November 28, 2012


Ironmouth writes "Any job an F-15 can do a drone can do better, cheaper and safer, even for those around the target. Its long loiter time and slow speed allow the operator to be more certain and use a smaller munition, creating fewer casualties."

Granting your hypothesis that a drone will cause lower collateral damage than an F-15 (don't know if this is true; especially of a hypothetical autonomous drone) a drone is going to be cheaper to procure, cheaper to maintain, cheaper to fly and require less support than an F-15. Which means an offensive force can fly more of them, more often with higher utilization. The increase in number of engagements for the same budget this will allow could mean higher collateral damage in aggregate.
posted by Mitheral at 6:21 PM on November 28, 2012




US Cyberweapons Exempt From Human Judgment Requirement
If bullets, rockets, or missiles are to be fired, tear gas is to be launched, or systems are to be jammed, a human needs to make the final decision on when they are used and at whom they are aimed.
But the policy explicitly exempts "autonomous or semi-autonomous cyberspace systems for cyberspace operations." And the development efforts for those sorts of systems are now being pursued much more openly.
posted by the man of twists and turns at 11:43 AM on November 30, 2012




Interesting that the charges are
56 counts of criminal possession of a forged instrument, grand larceny possession of stolen property and weapons possession
none of which have anything to do with the graffiti per se. I'm guessing the forged instrument charges are because he used the NYPD logo and had some in his possession when arrested. If that is the case I wonder if he could have avoided at least those charges by modifying the logo.
posted by Mitheral at 6:35 PM on December 2, 2012






In other news: Navy Dolphins to Be Replaced by Robots in 5 Years, Human Jobs Probably Safe A Little Longer

Well, that explains the line of picketing dolphins I saw in Boston Harbor earlier this week.
posted by wolfdreams01 at 12:48 PM on December 3, 2012




This is assuming that the robots can selectively target and kill the enemy soldiers. One fear is that automated weapons will target and kill everything (either by design or by malfunction) whereas human soldiers must make an affirmative decision before using any kind of weapon.

And here's a reminder of how convoluted those decisions can be:

US military facing fresh questions over targeting of children in Afghanistan: Outrage grows after senior officer claimed troops in Afghanistan were on the lookout for 'children with potential hostile intent'
posted by homunculus at 1:34 PM on December 7, 2012





