You have twenty seconds to comply.
September 21, 2021 4:44 AM   Subscribe

The Scientist and the A.I.-Assisted, Remote-Control Killing Machine [ungated] - "Israeli agents had wanted to kill Iran's top nuclear scientist for years. Then they came up with a way to do it with no operatives present."[1,2]
[T]here really was a killer robot... the debut test of a high-tech, computerized sharpshooter kitted out with artificial intelligence and multiple-camera eyes, operated via satellite and capable of firing 600 rounds a minute.

The souped-up, remote-controlled machine gun now joins the combat drone in the arsenal of high-tech weapons for remote targeted killing. But unlike a drone, the robotic machine gun draws no attention in the sky, where a drone could be shot down, and can be situated anywhere, qualities likely to reshape the worlds of security and espionage.
also btw...
posted by kliuless (92 comments total) 17 users marked this as a favorite
 
repulsive
posted by shaademaan at 5:00 AM on September 21, 2021 [5 favorites]


Good thing weaponry never falls into the wrong hands! This should work out fine.
posted by mittens at 5:28 AM on September 21, 2021 [15 favorites]


This vignette always seemed impossibly futuristic, but not any more:
They sent a slamhound on Turner's trail in New Delhi, slotted it to his pheromones and the color of his hair. It caught up with him on a street called Chandni Chauk and came scrambling for his rented BMW through a forest of bare brown legs and pedicab tires. Its core was a kilogram of recrystallized hexogene and flaked TNT.

He didn't see it coming. The last he saw of India was the pink stucco facade of a place called the Khush-Oil Hotel.

- William Gibson, Count Zero
posted by wenestvedt at 5:39 AM on September 21, 2021 [29 favorites]


mittens: I'd say it's already in the wrong hands? The story is focused an awful lot on the "wow cool robot" aspect and not a lot on the "the Israeli government murdered someone" aspect
posted by JDHarper at 5:43 AM on September 21, 2021 [38 favorites]


If they (for any definition of "they") have a pheromone sample from someone, they probably can get a DNA sample as well, and if they have that, they don't need something as crude as an explosive drone. A genetically engineered virus, highly virulent, with no symptoms except for causing a sudden fatal autoimmune reaction in a tiny number of people with particular genetics, would be much more subtle, and probably possible soon if not now.
posted by acb at 5:49 AM on September 21, 2021 [1 favorite]


Yes, certainly any scientist who thinks that a country should be able to have its own nuclear program is bad and should be subject to execution without trial, unless of course the nuclear program belongs to, eg, the United States, China, France, Israel operating under plausible deniability, etc.

Obviously AI execution is bad, but what is in play is not nuclear equality but nuclear dominance by powerful states - if big/rich states have nuclear programs, it's not surprising that smaller or poorer states want the same thing. It's totally morally illegitimate to execute a scientist for doing something your own scientists do or to pretend that the US and China are somehow moral states that never coerce others through military superiority and therefore "deserve" to have nuclear dominance.

One would prefer that there were no nuclear bombs, what with the threat of ending all life on earth, etc, but it's incredibly stupid to accept that a nuclear scientist in Iran is sketchy but a nuclear scientist in the US is a serious researcher.
posted by Frowner at 5:58 AM on September 21, 2021 [26 favorites]


There's more than a bit of breathless overhype here. This seems to be AI in much the same way that camera stabilization is "AI". There was a human in the loop doing all the macro decision making. This was not a scenario where they gave the gun-bot a picture of the guy and it decided to pull the trigger.

This is dangerous and destabilizing for the reason stated in the article: the Mossad, or any state clandestine force, no longer has to worry about getting away afterwards. Disposable robot assassins make targeted violence more palatable to those in grandiose, white houses. This is a tool for more state-sponsored killing.

I doubt it has much value for lower-tech terrorism. Remote operated IEDs have been a thing for decades now and are a thousand, or perhaps even a million times cheaper.
posted by bonehead at 6:06 AM on September 21, 2021 [23 favorites]


I see pretty much no difference philosophically between this and a Predator-fired Hellfire missile. Nation states being able to assassinate people at will has been a bad thing for decades. Hell, the only difference between this and polonium poisoning is that Russia likes to pretend it wasn't them.
posted by Ickster at 6:15 AM on September 21, 2021 [4 favorites]


Various cyberpunk role-playing games have a thing called a sentry gun. It's a gun on a rod with a camera and a computer. The camera scans for various things (facial recognition, an RFID badge, anything with heat that's larger than a small dog). If you fit the correct profile, it lets you through. Otherwise, it starts firing.

They're security without the human factor, set to maintain a perimeter without patrols getting bored or people being distracted. I had always thought them further away, something that no one would build in my lifetime. Plop a computer in that pickup with actual working facial recognition software (something that I know is not quite there, but if you're willing to kill more people with mistakes, it's doable) and you've got a sentry gun. The most sophisticated part is actually the software compensating for the recoil and delays. I wonder how long it will be before this shows up in the US in a completely autonomous version. Or elsewhere, given that most of it was recovered intact.

I do have to admit that I am morbidly impressed that they didn't go for the cheaper, lower-tech version: six camera drones, each with several pounds of high explosives, six receiving stations, and six operators. This at least avoided more than one murder. I'm actually surprised that we haven't seen a poor man's reaper drone show up on the news yet. Maybe the people behind non-state sanctioned politically motivated military organizations just aren't reading the right science fiction books. Or maybe whichever Israeli intelligence and assassination organization this was had people who were big fans of Breaking Bad.
posted by Hactar at 6:26 AM on September 21, 2021 [4 favorites]


The story is focused an awful lot on the "wow cool robot" aspect and not a lot on the "the Israeli government murdered someone" aspect

Sadly, I think "we" (the "we" that dictates tone of mainstream media coverage, at any rate) have become inured to this - the only novel thing here is "wow cool robot" -- the story covers any number of other assassinations that have been carried out by Israel vs. Iran.

The "robot" aspect is overblown, but it's still worth taking note of and is a troubling step forward in the use of technology for warfare.

There's nothing surprising here, though. If anything, what's surprising is that this doesn't happen more often / hasn't been employed prior to now. I'm surprised it hasn't been employed domestically (in the U.S., I mean) and it seems like that's only a matter of time.
posted by jzb at 6:46 AM on September 21, 2021


Hactar: Plop a computer in that pickup with actual working facial recognition software (something that I know is not quite there...

The Google AIY Vision kit was a Raspberry Pi Zero with a small daughterboard specifically for doing on-device (i.e., not in the cloud) AI stuff with live video.

I say "was" because they sold these a couple of years ago, and I got one on clearance at Target for like $39.

It ships with a script for finding the faces in its field of view, and deciding if they are happy or sad. That was in 2017 or 2019, and the models have only been getting better!

Another of the built-in projects is to use the kit near a birdfeeder, to capture photos of the birds and try to identify the species -- and also for insects, and plants: https://aiyprojects.withgoogle.com/model/nature-explorer/
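
If you want a feel for what that bundled face-finding script does, here's a rough sketch of the same idea using plain OpenCV and a webcam (this is not the AIY kit's actual code or models, just the classic off-the-shelf approach):

```python
# Rough sketch: on-device face detection with OpenCV's bundled Haar cascade.
# Not the AIY Vision Kit's script (that one runs a small neural net on the
# kit's add-on board), but the loop has the same shape: grab a frame, find
# faces, act on the result.
# pip install opencv-python
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # draw a box around each detected face
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```

That's detection rather than recognition (it tells you there is a face, not whose face it is), but it runs in real time on very modest hardware, which is the unsettling part.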
posted by wenestvedt at 6:47 AM on September 21, 2021 [2 favorites]


The article reads like a press release for "people the US doesn't like should be really afraid now." We'll remote murder you in your own car without shedding a tear and share a golf-clap for the techies who made it possible. Perfectly in character for the NYT.
posted by seanmpuckett at 6:55 AM on September 21, 2021 [6 favorites]


We used nuclear and the threat of nuclear to suppress conventional warfare. Now we're using sentry guns and the threat of sentry guns to suppress nuclear.
I hope we've reached the top of the tech tree, and the end of war is clearly in sight.
posted by otherchaz at 7:01 AM on September 21, 2021 [2 favorites]


[The precision of the robot weapon] at least avoided more than one murder.

The target customer for this thing is a State executive, a President, a Prime Minister, perhaps a National Security Chief. They will love the idea of "clean" targeted violence. A drone strike is better than a commando raid or a covert assassin in that there is little risk to their own forces, but they still result in a lot of collateral damage. There are still "Biden kills kids" headlines for the next few days.

The lure of this tech means that non-targeted casualties could be much lower. So that makes approving these hits a lot easier politically for politicians who have to worry about publicity. It's going to be a whole lot more tempting for the world's democracies to solve their problems with violence rather than diplomacy in the future. I'm sure Putin thinks this is the best idea ever. I'm sure Cheney would have made great use of this.

I see this as an entry to more political/state-sponsored violence, particularly for the "white hat" countries. It's "clean" and "surgical" and that makes it easier to say yes to.
posted by bonehead at 7:03 AM on September 21, 2021 [6 favorites]


I see pretty much no difference philosophically between this and a Predator-fired Hellfire missile

The missile would have also killed his wife, who was sitting right next to him at the time. A horrible experience, but his children still have a mother.
posted by sammyo at 7:13 AM on September 21, 2021 [2 favorites]


Think of what Henry Kissinger would have done with this.
posted by bonehead at 7:20 AM on September 21, 2021 [1 favorite]


The target customer for this thing is a State executive, a President, a Prime Minister, perhaps a National Security Chief. They will love the idea of "clean" targeted violence. A drone strike is better than a commando raid or a covert assassin in that there is little risk to their own forces, but they still result in a lot of collateral damage.

I'm actually surprised there haven't been more assassinations by drone. As pointed out upthread, drones and facial recognition aren't new or secret technology anymore. You wouldn't need anything fancy or not commercially available off the shelf, just some cheap drones carrying explosives all set to target a specific person or place. Since politicians often do things like give public speeches where they stand in a previously-announced public outdoor location for long periods, it seems like it wouldn't be all that hard. The drones wouldn't even need to be very accurate, especially if you had a swarm of them converging from different directions.
posted by star gentle uterus at 7:36 AM on September 21, 2021 [1 favorite]


Prediction: the next Fast and Furious film will feature Dominic's crew vs an AI sentry gun/car crew. (They've come pretty close to this already with the "autonomous vehicle stampede" in Furious 8.)
posted by SPrintF at 7:52 AM on September 21, 2021


Think of what Henry Kissinger would have done with this.

To me, the most interesting thing about this by far is that the Israelis briefed Trump and his people on it, and felt the need to pull off this operation while he was still in office.
posted by Halloween Jack at 7:54 AM on September 21, 2021 [1 favorite]


The idea that this is in the hands of a government that considers buying a brand of ice cream to be a form of "terrorism" (their words) frightens the hell out of me.
posted by Glegrinof the Pig-Man at 8:10 AM on September 21, 2021 [5 favorites]


I was having a lot of trouble figuring out why the timeline here is as long as it is: they were about to assassinate him in 2009 but called it off at the last second, and then didn't try again for 12 more years? Then I found the missing piece:

Israel had paused the sabotage and assassination campaign in 2012, when the United States began negotiations with Iran leading to the 2015 nuclear agreement. Now that Mr. Trump had abrogated that agreement, the Israelis wanted to resume the campaign to try to thwart Iran’s nuclear progress and force it to accept strict constraints on its nuclear program.

Sure, fine, let's call off the nuclear accord to appease the ego of an overgrown toddler and his eschatologically-minded electorate, but keep handing money to the Israelis so they can do our dirty work for us. Then we'll all collectively act shocked--shocked!--when the Mossad roll out an entirely new kind of horrific weapon that will immediately find its way into the hands of anyone who wants one.
posted by Mayor West at 8:18 AM on September 21, 2021 [2 favorites]


You wouldn't need anything fancy or not commercially available off the shelf, just some cheap drones carrying explosives all set to target a specific person or place.

I seem to remember something like that happening in Venezuela.
posted by They sucked his brains out! at 8:28 AM on September 21, 2021 [1 favorite]


This will be a problem going forward, since it’s not a difficult thing to do. None of the components of this assassination are particularly expensive or difficult to obtain. The logistics might get a bit involved because you have to set it up quickly to avoid suspicion, but that can be overcome with planning and practice. I expect we’ll see more of this in the future from Israel and other countries because there are, effectively, no repercussions. The countries that matter to Israel won’t lift a finger, and at any rate, if they did, how long before Israel changes targets?
posted by JustSayNoDawg at 8:29 AM on September 21, 2021 [1 favorite]


I'm actually surprised there haven't been more assassinations by drone. As pointed out upthread, drones and facial recognition aren't new or secret technology anymore. You wouldn't need anything fancy or not commercially available off the shelf, just some cheap drones carrying explosives all set to target a specific person or place.
I think range is still a limiting factor for making it easy to get the operator away safely but I have had that same thought. My assumption has been that the major governments aren’t jumping into this because it invites the same tactics to be used by their enemies (which is why this case is fairly worrisome), but I’d have expected more attempts from at least groups like ISIS or even domestic terrorists. Face recognition might be tricky but it’s not the only option, especially if your goal is to cause panic just as much as targeting someone in particular.
posted by adamsc at 8:31 AM on September 21, 2021


I see pretty much no difference philosophically between this and a Predator-fired Hellfire missile

See also: the Flying Ginsu, a Hellfire missile fitted with blades instead of explosives.

I'm actually surprised that we haven't seen a poor man's reaper drone show up on the news yet

Azerbaijan deployed a fleet of cheap Turkish and Israeli-made drones in the recent conflict with Armenia. The Bayraktar TB2 carries laser-guided munitions. They also deployed "suicide drones" that loiter and then drop down and blow something or someone up on the ground.
posted by BungaDunga at 8:33 AM on September 21, 2021 [3 favorites]


The main purpose of range is to protect a human operator, though. A lot of time and expense goes into training a sniper and so they are far more valuable than the equipment. The other issues are easily solvable problems, given access to modern tech, most of which is dirt-cheap and easily available.
posted by JustSayNoDawg at 8:36 AM on September 21, 2021 [4 favorites]


They also deployed "suicide drones" that loiter and then drop down and blow something or someone up on the ground.

In Gibson's Zero History (2010) a drone is equipped with a Taser in order to attack someone at a distance.
posted by Rash at 9:06 AM on September 21, 2021 [3 favorites]


when a comment points out that it's a good thing this tech hasn't fallen into the wrong hands, and a subsequent comment clarifies that the tech has indeed fallen into the wrong hands, I start to question the reality of everything and whether we need to convene a MeFi Conference on Protocols for Deploying /S Tags
posted by elkevelvet at 9:24 AM on September 21, 2021 [3 favorites]


Here's the thing, I'm not actually opposed to war via assassination of leaders, generals, politicians, media personalities, etc.

I don't like war at all, so war with fewer bodies and destroyed infrastructure seems better than the current way of doing war. I'm not actually seeing why it would be less moral to assassinate a few key people than to bomb a nation and kill tens or hundreds of thousands of innocent people. Both are bad, but I'd rather we live in a world where fewer people get blown to bits.

It also makes the threat up close and personal for the old men who are currently sending young people to die. In the current setup a head of government can order a war and it's considered that the only morally correct response is to kill thousands of people who didn't start the war. It's never made sense except from the standpoint of insulating the elites from the consequences of their decisions.

I suspect one reason we haven't seen more assassinations in war is due to a sort of MAD doctrine. Once Country A starts offing the leadership of Country B then Country B feels free to retaliate in kind. I'm sure the primitive caveman brain telling us to fight "fair" is part of it too. But I suspect the fear of 1st world government elites of being assassinated themselves is the biggest part of why they didn't do it earlier.

I'm not sure if they've just bought their own hype and feel confident that no two-bit little nobody country could **POSSIBLY** retaliate with assassination, or if they're just so into the whizbang factor, or what. But that appears to have changed.

And I strongly suspect that means high ranking government officials in the 1st world will soon be targeted by the victims of their wars.

Drone assassins end the exfiltration problem, but in a world where people have watched their entire extended family be killed by US munitions landing randomly on their country you'll find no shortage of people willing to take the risk of their exfiltration failing.

And, again, I'm not actually sure I think switching from war as a game played by old men with young people doing all the dying to a game where the old men actually take the risk is a bad thing.

If you were a citizen in 1st World Country A and in retaliation for your nation's aggression 3rd World Country B was going to retaliate with a) a bomb, or bunch of bombs, killing lots of random people, or b) attempts to assassinate your politicians, generals, and media personalities who backed the war which would you see as the least bad option?

I'd pick B. I'd rather no one dies and there's no war. But if murderous conflict happens I'd rather (for example) OBL had chosen targeted assassination of the more war mongery anti-Islamic politicians and media personalities rather than knocking down the Twin Towers.

Neither option is good, but one is less bad.

And I'm enough of a class warrior that I want to see the elites have some skin in the game and face the risks they so blithely demand others take for them.
posted by sotonohito at 9:32 AM on September 21, 2021 [14 favorites]


On the drone thing, I think it's just neo-phobia. Does it actually matter in the slightest to the corpse if they were killed by a human pulling the trigger, or by an AI that identified them by face/DNA/scent/gait/whatever? Dead is dead.

The idea that American snipers are fine, but American sniper drones are bad just doesn't compute to me. Dead is dead; who cares if it was an evolved intelligence running on a substrate of a few kg of fatty tissue, or a created intelligence running on a substrate of silicon? The end result is identical: people being shot to death.

It seems kind of silly to freak out about drones and be sanguine about human soldiers.
posted by sotonohito at 9:36 AM on September 21, 2021 [1 favorite]


“By killing their leaders, Hassan ibn Sabbah spared the people. His was truly an example of grandmotherly kindness.”
posted by acb at 9:38 AM on September 21, 2021 [3 favorites]


The idea that American snipers are fine, but American sniper drones are bad just doesn't compute to me. Dead is dead; who cares if it was an evolved intelligence running on a substrate of a few kg of fatty tissue, or a created intelligence running on a substrate of silicon? The end result is identical: people being shot to death.

The end result isn't really identical, because you can deploy a robo-gun where you wouldn't be willing to deploy a sniper. Robots expand the list of things you can do with acceptable risk to your own people. They deployed a fancy AI gun because a human sniper would have been at too much risk.

Now, there's an argument that this is better, since the alternative might have been just to, eg, blow up Fakhrizadeh's car, killing everyone inside, not just him.

Or it's worse because it makes war and killing politically much easier if it doesn't mean putting our own forces in harm's way.

There's an argument that war can be moral because both sides expose themselves to death. When you're killing people at no risk to yourself, that violates this assumption and makes some people very uncomfortable.
posted by BungaDunga at 10:07 AM on September 21, 2021 [1 favorite]


A carefully crafted virus targeting an individual would be extremely difficult to test to the level of safety required to ensure it didn’t go rogue. It might be theoretically possible; but it seems like an overly complicated way to kill someone.

There are a lot of theories about Havana syndrome, and billions spent investigating, and nothing definitive has been found. If it’s a weapon or deliberate act, it is surprising that no one has talked, let something slip, or come forward to get a big payday from America. If it’s some kind of glitch in listening hardware, why haven’t we found it and been able to replicate the effects in a lab or detect it with listening equipment?

It is likely that what we think of as Havana syndrome is actually a cluster of symptoms with different causes, all of which have perfectly ordinary explanations like side effects of anti-malarial medication or some other combination of medications, untreated Lyme disease, high blood pressure, benign positional vertigo, etc.
posted by interogative mood at 10:38 AM on September 21, 2021


Of course, Trump was a key factor.
Israel had paused the sabotage and assassination campaign in 2012, when the United States began negotiations with Iran leading to the 2015 nuclear agreement. Now that Mr. Trump had abrogated that agreement, the Israelis wanted to resume the campaign.

If Israel was going to kill a top Iranian official, an act that had the potential to start a war, it needed the assent and protection of the United States. That meant acting before Mr. Biden could take office.
posted by CheeseDigestsAll at 10:49 AM on September 21, 2021 [1 favorite]


The annoying thing is, political assassinations and wars are not an either/or thing. More of a buy one, get the other for free thing, in fact.
posted by Ashenmote at 11:44 AM on September 21, 2021 [3 favorites]


I wonder if they can get Eric Bana to star in the movie about the Iranian Covert Ops team sent to assassinate the Israelis responsible for murdering Mr. Fakhrizadeh?
posted by straight at 12:07 PM on September 21, 2021 [1 favorite]


"It seems kind of silly to freak out about drones and be sanguine about human soldiers."

This comment misses the main point about this technology: At some point in its development, any AI that is advanced enough has the potential to 'go rogue' and escape its original programming or boundaries, in which case humans will have created killing machines that are no longer under the control of their original masters, and that can potentially self-replicate.

Do soldiers 'go rogue?' Maybe a tiny, tiny percentage, but there are many social programs, supports, and hierarchies (of varying efficacy) designed to prevent them from berserking their way through their respective communities. They also don't self-replicate on a timeline terrifying enough to destabilize society.
posted by jordantwodelta at 12:21 PM on September 21, 2021


What kind of crap robot assassin needs to be able to fire 600 rounds per minute? If you had a good robot rifle, a single bullet would do it. This is just shoddy craftsmanship.

And also murder. But I guess the NYT is cool with that as long as it's murder of people we disapprove of.
posted by qxntpqbbbqxl at 12:33 PM on September 21, 2021


To be fair, it’s not limited to the NYT, though they seem to be the thought leaders outside of government. I doubt most people in the US care that an ally of the US committed murder, because the target was Iranian. Israel seems to get an automatic pass on bad behavior, but against Iran and a few others, they get cheered on.
posted by JustSayNoDawg at 12:43 PM on September 21, 2021


Not sure about the actual organization, but I am in favor of the declared intent:

https://www.stopkillerrobots.org/
posted by newdaddy at 12:50 PM on September 21, 2021


What a horrifying read.
posted by Hutch at 1:07 PM on September 21, 2021 [1 favorite]


The rogue AI that kills us will be some rogue agricultural commodities trading AI. We’ll end up starving to death because the AI wanted to maximize profit on grain futures contracts.
posted by interogative mood at 1:17 PM on September 21, 2021 [6 favorites]


And also murder. But I guess the NYT is cool with that as long as it's murder of people we disapprove of.

If the Allies had managed to kill Wernher von Braun on his way to the office in 1940, would that, on balance, be a bad thing?
posted by acb at 1:22 PM on September 21, 2021


If the Allies had managed to kill Wernher von Braun on his way to the office in 1940, would that, on balance, be a bad thing?

I guess that depends on how you feel about the U.S. space program...
posted by jordantwodelta at 1:27 PM on September 21, 2021 [2 favorites]


Yes. Four years of data lost would likely be filled by another, who may have accelerated the program.
posted by clavdivs at 1:27 PM on September 21, 2021


"Slaughterbots" was 2017.
posted by doctornemo at 1:48 PM on September 21, 2021 [2 favorites]


I hate being this guy.

The "AI" in question here, at best, helped a trained human sharpshooter aim reliably with a 1600ms unpredictable delay. Maybe it was truly trained with tons of subtle correction data, maybe it was (as noted above) little more than a photographic image stabilizer.

This is another kind of drone with a human pilot. The slope is slippery, but the machine is not using facial recognition and is not in control of firing, so the "AI is empowered to kill on its own" concern undermines the still-troubling reality of this situation.

It will happen. There is no way around it, no set of laws or agreements that will prevent an "autonomous" killing that fits squarely in the sweet spot we're all ready to freak out about, but this ain't it.
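
To put the 1600ms number in perspective, here's a toy back-of-the-envelope sketch (my own made-up numbers for bullet speed and target motion, nothing to do with how the real system was engineered) of the lead you'd have to build in:

```python
# Toy illustration of why a ~1.6 s control-link delay is the hard part.
# Assumes a target moving at constant velocity; the operator is seeing the
# world 1.6 s in the past, so the aim point has to be extrapolated forward.
# All numbers are assumptions for illustration only.

LINK_DELAY_S = 1.6        # the roughly 1600 ms delay discussed above
BULLET_SPEED_MPS = 850.0  # assumed muzzle velocity, order of magnitude only

def predicted_aim_point(pos_m, vel_mps, range_m):
    """Extrapolate a constant-velocity target through the link delay
    plus the round's time of flight."""
    time_of_flight = range_m / BULLET_SPEED_MPS
    lead_time = LINK_DELAY_S + time_of_flight
    return tuple(p + v * lead_time for p, v in zip(pos_m, vel_mps))

# A car creeping forward at just 1 m/s, 150 m away, has moved about 1.8 m
# by the time a round could arrive: several times the width of a torso.
print(predicted_aim_point((0.0, 0.0), (1.0, 0.0), 150.0))
```

Almost all of that is image stabilization and dead reckoning, not anything you'd meaningfully call a decision.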

(Edited to add a missing word)
posted by abulafa at 2:25 PM on September 21, 2021 [3 favorites]


>we need to convene a MeFi Conference on Protocols for Deploying /S Tags
I'll be there, but my particular bike-shed colour is ^ caret for ^real good^ sarcasm.

I'm aware that I post from a glasshouse when it comes to my nation state doing extra-territorial killing, training up secret police and subverting internal insurgency -- so as a friend to a friend, I'd offer the puzzle "which eye has Iran taken that this eye-for-eye has balance?" (Extra credit will be awarded for commentary which works through the puzzle with reference to "an eye for an eye makes the world go blind.")
posted by k3ninho at 2:35 PM on September 21, 2021 [2 favorites]


There are a lot of theories about Havana syndrome, and billions spent investigating, and nothing definitive has been found. If it’s a weapon or deliberate act, it is surprising that no one has talked, let something slip, or come forward to get a big payday from America. If it’s some kind of glitch in listening hardware, why haven’t we found it and been able to replicate the effects in a lab or detect it with listening equipment?

It is likely that what we think of as Havana syndrome is actually a cluster of symptoms with different causes, all of which have perfectly ordinary explanations like side effects of anti-malarial medication or some other combination of medications, untreated Lyme disease, high blood pressure, benign positional vertigo, etc.


Yeah, no. Please don't gaslight people with legitimate health issues and tell them it's "all in your head". I know multiple people who were affected by Havana syndrome (now called UHIs - unexplained health incidents), including one who was forced into a medical retirement. There have been literally hundreds of cases in dozens of countries, including western Europe - pretty sure anti-malarial medications are not the issue here.

I have my own theories on the cause of UHIs which I'm not going to share here, but I think the USG knows way more than they're letting on (and way more than they are telling their own employees).
posted by photo guy at 2:46 PM on September 21, 2021 [1 favorite]


So shortly after noon on Friday, Nov. 27, he slipped behind the wheel of his black Nissan Teana sedan, his wife in the passenger seat beside him, and hit the road.
What was Trump doing on Nov. 27?

He was screaming and ranting in the White House and casting about for anything, ANYTHING, which would allow him to set aside the election and hold onto power — and an armed conflict with Iran was already on the table. A retaliatory strike from Iran on Israel would have given him everything he needed and then some.

In 1980 Iran gave us Reagan and a generation of Republican misrule by secretly agreeing with Republicans to hold onto the hostages until after the election.

In 2020 Israel tried to give us Trump and fascism.
posted by jamjam at 2:52 PM on September 21, 2021 [4 favorites]


political assassinations and wars are not an either/or thing. More of a buy one, get the other for free thing, in fact.

As seen when the US and Iran came this close to war after Trump's assassination of Qasem Soleimani, with aftereffects including an entire civilian airliner downed by a jumpy Iranian air defense battery.

What kind of crap robot assassin needs to be able to fire 600 rounds per minute? If you had a good robot rifle, a single bullet would do it. This is just shoddy craftsmanship.

The gun they used was a customized version of an off-the-shelf machine gun; it fired 15 bullets in quick succession. It seems more like a proof of concept if anything; eventually there will be a streamlined product. There are already aim-assist rifles out there.
posted by BungaDunga at 4:25 PM on September 21, 2021


I have a lot of bad feelings about this news, and they're probably different than the typical ones. I found out about it on the first day of Sukkot (Chag Sameach everyone), my favorite Jewish holiday and perhaps the most joyous one... News about a country that is arguably the one I say the most prayers about. I have no good answers for this one, but I worry what the reaction would be if the tables were turned, with assassinations in other countries.
posted by Flight Hardware, do not touch at 4:26 PM on September 21, 2021


See also: the Flying Ginsu, a Hellfire missile fitted with blades instead of explosives.

Knife Missiles. Really. I wonder what Iain M Banks would have thought.
posted by dazed_one at 4:51 PM on September 21, 2021 [2 favorites]


Or Frank Herbert -- sounds like a Hunter-Seeker, to me.
posted by Rash at 5:12 PM on September 21, 2021


Or Fred Saberhagen with his Berserkers (plus the name Saberhagen is pretty metal, as well).
posted by JustSayNoDawg at 6:34 PM on September 21, 2021 [1 favorite]


What kind of crap robot assassin needs to be able to fire 600 rounds per minute?

Interestingly, one of the longest-standing records for "longest range sniper kill" was actually made with an M2 Browning machine gun (wiki link), also with a 600 rounds per minute rate of fire, used in single-shot mode, out to a distance of 2.2 km.

But yes, they fired a quick 15-round burst, probably to make sure the target was killed (a sniper would be able to determine immediately whether another shot was required).
posted by xdvesper at 7:21 PM on September 21, 2021


In September 2020, a $7.4 billion contract for MQ-9 Reaper drones was announced between the U.S. Air Force and General Atomics. The contract calls for the delivery of up to 36 aircraft per year.

Operation Mole Cricket 19

Abraham Karem

General Atomics MQ-1 Predator

General Atomics (GA) was founded on July 18, 1955, in San Diego, California, by Frederic de Hoffmann with assistance from notable physicists Edward Teller and Freeman Dyson. Originally the company was part of the General Atomic division of General Dynamics "for harnessing the power of nuclear technologies for the benefit of mankind"

Just trying to catch up on all this remote-control killing tech. My point being that Operation Mole Cricket was the first time drones were used to more or less win a battle. And now they've moved ahead with the gun-platform. Just wait until Boston Dynamics climbs onboard.

"What are these things?"
"Land mines with legs."
-- William Gibson, Agency, 2020
posted by valkane at 7:54 PM on September 21, 2021 [2 favorites]


Meet REX (Mk.11): Israeli firm unveils autonomous armed robot to patrol battle zones, borders (Times of Israel, Sept. 13 2021)

The four-wheel-drive robot presented Monday in Lod, the “REX MKII,” was developed by the state-owned Israel Aerospace Industries.

It is operated by an electronic tablet and can be equipped with two machine guns, cameras and sensors, said Rani Avni, deputy head of the company’s autonomous systems division. The robot can gather intelligence for ground troops, carry injured soldiers and supplies in and out of battle, and strike nearby targets.

It is the most advanced of more than half a dozen unmanned vehicles developed by Aerospace Industries’ subsidiary, ELTA Systems, over the past 15 years.

The Israeli military is currently using a smaller but similar vehicle called the Jaguar to patrol the border with the Gaza Strip and help enforce a blockade Israel imposed in 2007...


Sentry guns have been a real thing for a while; they're not just video game conceits. But now they're filtering all the way down.


War in the Age of Intelligent Machines, DeLanda (1991):

War in the Age of Intelligent Machines (1991) is a book by Manuel DeLanda, in which he traces the history of warfare and the history of technology. It is influenced in part by Michel Foucault's Discipline and Punish (1978) and also reinterprets the concepts of war machines and the machinic phylum, introduced in Gilles Deleuze and Félix Guattari's A Thousand Plateaus (1980).

A complete PDF isn't hard to find if you poke around for it.
posted by snuffleupagus at 8:39 PM on September 21, 2021


eg, the United States, China, France, Israel operating under plausible deniability, etc.
[...] if big/rich states have nuclear programs, it's not surprising that smaller or poorer states want the same thing.


I've seen this misconception before - I recall someone thinking that Israel was bigger than Russia, which is literally the world's biggest country. Here are some figures (based on Wikipedia) about the actual size/wealth of Israel vs. Iran:

Iran is about 79 times larger than Israel.
Iran has about 9 times the population of Israel.
Iran's regular army has about 3.5 times as many active members as Israel's. Counting reserve personnel, it is 1.5 times as large. This does not include Iran's irregular forces or, e.g., allied forces like Hezbollah.

Things are only a little less clear-cut when it comes to wealth:
Iran has approximately 3 times Israel's GDP (purchasing power parity) or 1.5 times (nominal). It has a larger population, though, so Israel's GDP per capita is about 3.5 times Iran's (PPP) or 6 times (nominal).

Iran has huge natural resources. It has the world's second largest proven natural gas reserves, and the fourth largest petroleum reserves. It also has two uranium mines. Israel doesn't have any natural resources to speak of, although offshore natural gas fields discovered in the 2010s mean that it is now a net exporter of natural gas. Neither country is a major agricultural exporter, but based on Wikipedia Iran has about 550,000 square kilometres of naturally arable soil, vs. about 4,000 for Israel.

I don't think it's fair to dismiss the fact that Israel has a significantly higher GDP per capita, but a larger population is in itself a source of national strength. Also, bear in mind that sanctions have been imposed on Iran since around 2006; one would expect it to be much richer as the effects of the sanctions disappear. In any case, Iran is both much wealthier than Israel (in terms of physical assets) and has a greater overall income than Israel. It certainly isn't poorer than Israel.

So why do people have this fixed idea that Israel is so large and wealthy? Particularly when we're talking about Iran's moral justification to seek nuclear weapons in order to literally commit an act of genocide? I just don't understand it.
posted by Joe in Australia at 5:32 AM on September 22, 2021 [1 favorite]


Things are only a little less clear-cut when it comes to wealth

(warning for youtube bitcoiner businessman cringe)
Currency COLLAPSE and Hyperinflation in Iran (what it looks like)
Part II
posted by snuffleupagus at 6:55 AM on September 22, 2021


Also, the world (read as the planet or the international economic system) will suffer if Iran has to climb out of the condition it's in by extracting all that oil and gas so it can be burned.

Meanwhile Israel has a leading-edge finance and tech-heavy economy, much higher standards of education, living, etc.

We're really talking about relative deprivation here. Compare Israel (or Australia, or the US) and Iran on the various developed country indexes.

And, the thread is not about nukes. That's just the ultimate example of disparities in lethality and sophistication of forces.
posted by snuffleupagus at 7:10 AM on September 22, 2021 [1 favorite]


Mod note: Couple deleted; let's not steer off into a general thread about comparing Iran and Israel; thread is about this high tech assassination thing.
posted by LobsterMitten (staff) at 8:00 AM on September 22, 2021 [1 favorite]


It's not about nukes but it is about the way that states legitimate the whole "this guy is very bad, we are entitled to kill him, especially if we can kill him and no one else using sophisticated technology" thing.

"We are using sophisticated technology and therefore we didn't murder his wife or any bystanders" is used as a sort of back-formation to make "he is a scientist working for an enemy regime and therefore it's fine to kill him because bad guys" seem like a more acceptable argument.

I think this point is made explicitly or implicitly upthread, but AI is a philosophical weapon as much as a shooting one.
posted by Frowner at 8:01 AM on September 22, 2021 [2 favorites]


Fakhrizadeh led the nuclear weapons program for a violently antisemitic group of theocrats that want to commit genocide. In a better world he'd have been put on trial for war crimes, but as it was he was a clear and present danger to the safety of the world. Of course it was OK to kill him: he was engaged in a conspiracy to kill literally millions of people. And given that killing him was justified, criticising the use of sophisticated weaponry to avoid civilian casualties is perverse.

There's a recent book by Dara Horn called People Love Dead Jews. And they do! Dead Jews are inspiring and tragic and sombre and people make movies and books about them. When Jews stand up for themselves, though, and when (heaven forbid) they actually take responsibility for their own survival, the censorious gentile world starts criticising them for preempting matters, not seeking consensus, doing things the wrong way; basically for not providing the moral clarity of becoming some more dead Jews. Sorry, but Jews are not your Pietà. This was a clear act of self defense and vastly more moral than practically any military action by any state, ever.
posted by Joe in Australia at 8:35 AM on September 22, 2021 [3 favorites]


On the philosophy of weapons and conflict note, I suspect we're having our thinking clouded by the same primate brain BS that gives us the idea of a "fair fight".

If I got into an argument with Mike Tyson then our primate brain tells us that a "fair fight" would be for me and him to put on boxing gloves and fight face to face.

I'd be 100% guaranteed to lose in such a "fair fight". So how is it fair?

It's "fair" because our primate brain says the bigger caveman is the best caveman, so we're inclined to call it fair when a bigger, stronger, faster person trains skills that take advantage of their size, strength, and speed and is then pitted against a smaller, weaker, slower person who has trained skills other than fighting.

If I was stealthy, trained myself to take advantage of that, and konked Tyson on the head with a blackjack, that would somehow **NOT** be fair.

We see this in the old Conan and similar sword and sorcery type stories.

Conan has a set of genes that allow him to train and become strong, fast, and skilled at hand to hand combat. Evil Wizard X has a set of genes that allow him to train and become smart, sneaky, and skilled in sorcery.

What does Conan say? "Stop hiding behind your spells and fight me like a man!"

And our ape brain says, "Yeah, fight fair, Evil Wizard X! Discard every advantage you've spent your entire life training for and go fight Conan on terms that favor him and his lifetime of training!"

When 'Murca bombs the shit out of Iraq, that's fair. They should fight the American military on the American military's terms, and if they can't, then that means they deserve to be conquered.

When the Iraqi resistance fights with IEDs, sabotage, and other asymmetric warfare techniques, that is decried as cowardly, as terrorism, as cheating.

Our primate brain tells us that the techniques that favor the strong are good, and the techniques that favor the weak are bad.

So 'Murca using assassin drones? Good. We're getting a little abstract here, but they're (currently) a tool available only to very wealthy nations with lots of expensive infrastructure. On an emotional level we'll see them as similar to fighter jets or cruise missiles or whatever. Cool weapons of the strong and therefore good and morally proper.

An Afghan who saw their family blown up and decides to retaliate by sniping some high ranking American politician or media figure? That we'd tend to see as bad, wicked, terrorism, because it's a tool available to the weak.

I'm not claiming that model is a perfect explanation for all human behavior when it comes to fighting, but if you make the prediction that tools of the strong will be seen as good and tools of the weak will be seen a bad you'll usually be right.

Not just today either, but through history.
posted by sotonohito at 8:41 AM on September 22, 2021 [2 favorites]


the censorious gentile world

tren zich

I have issues with lethal robots, so I must want more dead Jews? You have family in Israel? So do I. Really, how dare you.
posted by snuffleupagus at 8:51 AM on September 22, 2021 [6 favorites]


Do you not realise a nuclear missile is a killer robot?
posted by Joe in Australia at 8:59 AM on September 22, 2021


The Third Revolution in Warfare - "First there was gunpowder. Then nuclear weapons. Next: artificially intelligent weapons."
Autonomous weaponry is the third revolution in warfare, following gunpowder and nuclear arms. The evolution from land mines to guided missiles was just a prelude to true AI-enabled autonomy—the full engagement of killing: searching for, deciding to engage, and obliterating another human life, completely without human involvement.

An example of an autonomous weapon in use today is the Israeli Harpy drone, which is programmed to fly to a particular area, hunt for specific targets, and then destroy them using a high-explosive warhead nicknamed “Fire and Forget.” But a far more provocative example is illustrated in the dystopian short film Slaughterbots, which tells the story of bird-sized drones that can actively seek out a particular person and shoot a small amount of dynamite point-blank through that person’s skull. These drones fly themselves and are too small and nimble to be easily caught, stopped, or destroyed.

These “slaughterbots” are not merely the stuff of fiction. One such drone nearly killed the president of Venezuela in 2018, and could be built today by an experienced hobbyist for less than $1,000. All of the parts are available for purchase online, and all open-source technologies are available for download. This is an unintended consequence of AI and robotics becoming more accessible and inexpensive. Imagine, a $1,000 political assassin! And this is not a far-fetched danger for the future but a clear and present danger.

We have witnessed how quickly AI has advanced, and these advancements will accelerate the near-term future of autonomous weapons. Not only will these killer robots become more intelligent, more precise, faster, and cheaper; they will also learn new capabilities, such as how to form swarms with teamwork and redundancy, making their missions virtually unstoppable. A swarm of 10,000 drones that could wipe out half a city could theoretically cost as little as $10 million...

The strongest such liability is moral—nearly all ethical and religious systems view the taking of a human life as a contentious act requiring strong justification and scrutiny. United Nations Secretary-General António Guterres has stated, “The prospect of machines with the discretion and power to take human life is morally repugnant.”

Autonomous weapons lower the cost to the killer. Giving one’s life for a cause—as suicide bombers do—is still a high hurdle for anyone. But with autonomous assassins, no lives would have to be given up for killing. Another major issue is having a clear line of accountability—knowing who is responsible in case of an error. This is well established for soldiers on the battlefield. But when the killing is assigned to an autonomous-weapon system, the accountability is unclear (similar to accountability ambiguity when an autonomous vehicle runs over a pedestrian).

Such ambiguity may ultimately absolve aggressors for injustices or violations of international humanitarian law. And this lowers the threshold of war and makes it accessible to anyone. A further related danger is that autonomous weapons can target individuals, using facial or gait recognition, and the tracing of phone or IoT signals. This enables not only the assassination of one person but a genocide of any group of people. One of the stories in my new “scientific fiction” book based on realistic possible-future scenarios, AI 2041,[1] which I co-wrote with the sci-fi writer Chen Qiufan, describes a Unabomber-like scenario in which a terrorist carries out the targeted killing of business elites and high-profile individuals...

Where will this arms race take us? Stuart Russell, a computer-science professor at UC Berkeley, says, “The capabilities of autonomous weapons will be limited more by the laws of physics—for example, by constraints on range, speed, and payload—than by any deficiencies in the AI systems that control them. One can expect platforms deployed in the millions, the agility and lethality of which will leave humans utterly defenseless.” This multilateral arms race, if allowed to run its course, will eventually become a race toward oblivion.

Nuclear weapons are an existential threat, but they’ve been kept in check and have even helped reduce conventional warfare on account of the deterrence theory. Because a nuclear war leads to mutually assured destruction, any country initiating a nuclear first strike likely faces reciprocity and thus self-destruction.

But autonomous weapons are different. The deterrence theory does not apply, because a surprise first attack may be untraceable. As discussed earlier, autonomous-weapon attacks can quickly trigger a response, and escalations can be very fast, potentially leading to nuclear war. The first attack may not even be triggered by a country but by terrorists or other non-state actors. This exacerbates the level of danger of autonomous weapons...

The main roadblock today is that the United States, the United Kingdom, and Russia all oppose banning autonomous weapons, stating that it is too early to do so... Autonomous weapons are already a clear and present danger, and will become more intelligent, nimble, lethal, and accessible at an alarming speed. The deployment of autonomous weapons will be accelerated by an inevitable arms race that will lack the natural deterrence of nuclear weapons. Autonomous weapons are the AI application that most clearly and deeply conflicts with our morals and threatens humanity.
The RQ-180 Drone Will Emerge From The Shadows As The Centerpiece Of An Air Combat Revolution - "The Air Force's secretive, very stealthy, and high-flying drone won't be just a better spy aircraft, it will be a deeply networked game-changer." Bioterror: the dangers of garage scientists manipulating DNA [ungated] - "The availability of gene-editing tools such as Crispr has led to an explosion of unchecked DIY experiments in self-built labs."
Despite a lack of formal microbiological training, Dabrowa has successfully used faecal transplants and machine learning to genetically modify his own gut bacteria to lose weight without having to change his daily regime. The positive results he’s seen on himself have encouraged him to try to commercialise the process with the help of an angel investor. He hopes one day to collect as many as 3,000 faecal samples from donors and share the findings publicly.

Much of his knowledge — including the complex bits related to gene-editing — was gleaned straight from the internet or through sheer strength of will by directly lobbying those who have the answers he seeks. “Whenever I was bored, I went on YouTube and watched physics and biology lectures from MIT [Massachusetts Institute of Technology],” he explains. “I tried the experiments at home, then realised I needed help and reached out to professors at MIT and Harvard. They were more than happy to do so.”

[...]

These garage scientists might seem like a quirky new subculture but their rogue mindset is starting to generate consternation among those who specialise in managing biological threats in governments and international bodies.

In 2018 the states that are signatories to the 1972 Biological Weapons Convention (BWC) identified gene editing, gene synthesis, gene drives and metabolic pathway engineering as research that qualifies as “dual use”, meaning it is as easy to deploy for harmful purposes as it is for good.

Many of the parties are now worried that increased accessibility to such technologies could heighten accidental or deliberate misuse, including the development of biological weapons by rogue actors for mass or targeted attacks.

It’s a regulatory oversight that worries Dabrowa more than most. He’s spent years trying to warn officials and journalists about the growing capabilities of amateurs like himself. “I would go and meet ministers with a vial of cowpox and explain the threat,” he says, referencing the relatively benign pathogen that has been used since the days of Edward Jenner to help inoculate people against smallpox.

Among these threats are DNA-sequencing techniques accessible to him as a hobbyist that could easily be used to home-brew lethal pathogens like smallpox out of naturally occurring cowpox or other vaccine-based derivatives.

“If bioterrorists wanted to do it undetected, they could buy a second-hand DNA synthesiser for $2,000. The whole process would cost $10,000 and could be done in a kitchen,” he says.

[...]

‘Gain of function’

Getting the balance right between experimentation that encourages innovation but does not simultaneously cultivate risks has never been easy in the microbiological field. However, RP Eddy, whose consulting group Ergo has been providing pandemic-related intelligence to the Biden administration and other government agencies, says that while genetic tools may have tipped the risk balance, it’s important not to get overly consumed by the biohacking field.

Eddy points instead to some of the riskier research with dual-use potential that has been happening in formal academic institutions for years. Much of this is not reliant on modern genetic advances and occurs in far less controlled environments than many assume. These, he says, sometimes require little more than an automatic self-sealing door, gloves and an autoclave machine or air pump...

Among the riskiest processes is a research method known as gain of function. This involves purposefully tinkering with viruses to make them more infectious so that vaccines and therapeutics can be preemptively researched and developed.

The process first drew international scrutiny when Ron Fouchier, a virologist at the Erasmus Medical University in the Netherlands, successfully used the method in November 2011 to make the H5N1 flu more infectious and transmissible to humans.

To create the highly lethal pathogen, Fouchier had taken flu samples and used them to infect ferrets many times over, cherry picking specimens from the sickest ferrets to infect the next ones in line. It was the simplicity and cheapness of the process that concerned many...

For Dabrowa the measures do not go far enough. He would rather the international bioweapons community took a leaf out of the nuclear deterrent field and identified supply chain chokepoints that can be monitored more robustly.

“We need a part of the process that is expensive, difficult and necessary,” he says, advocating for control of a key input material for DNA processing called nucleoside phosphoramidite. “It’s the equivalent of asking why someone in Afghanistan just put an order online for weapons-grade uranium.”
posted by kliuless at 5:55 AM on September 24, 2021 [2 favorites]


A quintessentially Russian take.

But autonomous weapons are different. The deterrence theory does not apply, because a surprise first attack may be untraceable.


The deterrence theory does not apply because as seen above, and in many other videos on YT for anyone who cares to plumb the depths, this tech has already filtered all the way down in crude forms and so will more sophisticated versions. On wings or on wheels.
posted by snuffleupagus at 5:57 AM on September 24, 2021 [2 favorites]


I don't think it's fair to dismiss the fact that Israel has a significantly higher GDP per capita, but a larger population is in itself a source of national strength.

I find it pretty wild to make appeals re Israel's population size when the state is actively disenfranchising a number of should-be residents equal to ~half again its population of nationals.
posted by dusty potato at 11:18 AM on September 24, 2021 [2 favorites]


Fakhrizadeh led the nuclear weapons program for a violently antisemitic group of theocrats that want to commit genocide.

Israel’s dominant Likud party is also a bunch of genocidal zealots who have been engaged in a slow roll of ethnic cleansing in the name of self defense since the late 1960s. Israel also has nuclear weapons.

If Hamas sent a robot assassin into Israel to kill one of Israel's leading nuclear scientists and then ran full-page articles in its newspapers crowing about the successful operation, we don't even need to imagine what would happen: Israel would quickly flatten several blocks of apartments in Gaza. It has become such a common response to any action from Gaza that members of the IDF call it "mowing the lawn", as if the uppity residents of Gaza were just blades of grass to be periodically cut back.

The use of violence instead of diplomacy has not worked. Israel is not any safer today than it was at its founding. Instead of seeking a new path or trying something different, the answer seems to be more violence, this time with robots, so they can dehumanize and distance themselves further from the horrors done by the state in their name.

At least the Israeli kids doing their mandatory military service won't be traumatized by having to shoot or drop bombs on some Palestinians who were press-ganged into Hamas militias as teenagers.
posted by interogative mood at 7:41 PM on September 24, 2021 [1 favorite]


It's the kneecap sniper story that comes back to me. The limits there, to the extent any were found, were in the humans in the loop. In the moment, and in what they had to say later. And that is what these kinds of projects want to engineer out: Compunction. Remorse. Conscience.

Humanity. The robots will not write poems about the Somme. Not any that we will understand, anyway.
posted by snuffleupagus at 9:26 PM on September 24, 2021


Israel’s dominant Likud party is also a bunch of genocidal zealots […]

A normal person wouldn't use this sort of hyperbole against what is, after all, just another political party in a relatively liberal nation; one which could presumably have actually carried out its "genocide" if that desire were actual and not an antisemitic fantasy. In contrast, Iran's nuclear chief was literally developing genocidal devices to carry out genocide on behalf of a genocidal regime that boasts about its plans to commit genocide.

Also, an even vaguely well-informed antisemite would have noticed that Likud, having failed to win any of the last five national elections, is no longer in power, and would have updated their rhetoric accordingly.
posted by Joe in Australia at 2:37 AM on September 25, 2021


—Israel’s dominant Likud party is also a bunch of genocidal zealots...

A normal person wouldn’t use this sort of hyperbole against what is, after all, just another political party in a relatively liberal nation


Lol you must have skipped out on every single MeFi thread about US Republicans during the past six years. The difference being: no one that I can recall ever got offended on the Republicans’ behalf when people rightly called them out on being genocidal zealots.
posted by Atom Eyes at 3:18 AM on September 26, 2021 [3 favorites]


If nuclear weapons are genocidal devices, then why does Israel have them? If Likud is just another political party, and not the dominant one, sure, it lost the last 5 elections, yet in the aftermath of 4 of those elections it still held the Prime Minister's office. For the last 20 years the PM has either been the current Likud leader or an ex-member of the party. The current PM is a former Chief of Staff for Netanyahu, an individual who has said that he is to the right of the last Prime Minister and whose political coalition (Yamina) demands as its second article: "Unity of the land: We are the only party that opposes the establishment of a Palestinian state and any withdrawal from the territories of the Land of Israel. We will work to develop settlements throughout the country."
posted by interogative mood at 2:54 PM on September 26, 2021 [2 favorites]


I'd be happy to reply, but my comments are silently deleted.
posted by Joe in Australia at 7:14 PM on September 26, 2021


Mod note: Your comments were not silently deleted; they were deleted with a very clear mod note. Also, if you don't want other people to conflate the state of Israel, the current government of Israel, and Jews more generally, it would help a whole lot if you yourself didn't do that to accuse other MeFites of being antisemitic when they criticize current or past Israeli governments.

For example: "an even vaguely well-informed antisemite would have noticed that Likud, having failed to win any of the last five national elections [...]"

Even a vaguely well-informed observer of Israeli politics would have noticed that Likud was in the ruling coalition for a record-breaking 12 years, followed by a series of four deadlocked votes and then a coalition that everyone expects to collapse in short order as its only reason for being is "turf out Netanyahu" and it cannot agree on any tiny piece of national policy otherwise. Likud has been part of the ruling coalition for 32 of the 48 years since its founding; taking Likud seriously as a political party in Israeli politics is not antisemitic.

I am 100% certain that people will say antisemitic things about Likud, in which case, you need to flag them and we'll happily delete. Antisemitism is not acceptable at MetaFilter. You have flagged nothing in this thread; therefore, we are forced to assume you have no objection to any particular comments, but would prefer to use them as jumping-off points for your own rhetorical stances.

We have told you repeatedly in the past that if something is antisemitic, we need you to flag it or inform us at the contact form, so we can delete it. You seem to prefer to let comments you claim are antisemitic stand, so you can use them as a rhetorical staging ground for your responses. You cannot have it both ways. If something is antisemitic, you need to flag it or use the contact form, and not respond to it. If we delete the comment you find antisemitic (but did not bother to flag), and you have already responded, you cannot then complain that your response was deleted when we deleted the original comment you objected to.
posted by Eyebrows McGee (staff) at 8:57 PM on September 26, 2021 [7 favorites]


On topic: we're coming back to the idea of drones displacing infantry for almost all tasks and that's going to result in a total change in land war.

Drones, once they mature, are going to be as big a change as the Dreadnought was, or the introduction of effective carriers and torpedo bombers.

Modern drones aren't there yet; they're basically glorified RC planes right now, even the fancy one that Israel just deployed. But Moore's Law marches on, and we're rapidly approaching the era when drones can start going at least semi-autonomous.

Just as the only thing that could actually counter a Dreadnought-type battleship back in the era of battleships was another Dreadnought-type ship, the only thing that can actually counter a mature and serious drone deployment is another drone deployment. You can't fight drones with infantry, air power, or naval power.

The moral issues are important, but I worry that we haven't looked hard enough at the social side of things.

The invention of modern guerrilla war was critical in ending the era of imperialism and restoring at least some semblance of force parity between the wealthy nations and the poorer nations. Not that exploitation didn't continue, but it couldn't be the sort that we saw in the late 1800's and early 1900's.

The biggest military factor of the modern era isn't really nukes, it's guerrilla war. Short of genocide it is impossible for an invading nation to defeat a guerrilla movement with popular support. The guerrillas can't win, not in the military sense, but they can make it so expensive and demoralizing that eventually the occupiers just give up and leave.

A mature drone technology may well change that.

It's also going to change the calculus of power in the developed and wealthy nations. Prior to Maurice of Nassau's invention of drill and modern infantry tactics, a ruler needed to command the loyalty of only a relatively small group of elite aristocrat-warriors. After infantry became more than just a few peasant levies to bulk out a force of elites, the rulers had no choice but to secure the loyalty of a much larger group. And the only real way to do that was to extend greater political liberty, spend on infrastructure for that larger group, and allow them greater economic participation.

Factory work had a similar effect: it forced the rulers of advanced nations to pay attention to, spend money on, and give more liberty to the average person.

As we shift from an infantry-focused model of war to a drone-focused one, the rulers will only need to secure the loyalty of a smaller portion of the citizenry: not an aristocratic elite warrior class, but rather a handful of engineers, programmers, and technicians.

Infantry may not go away completely, but it will become (again) a lesser branch of limited importance.

And mature drone technology coupled with the sort of panopticon that modern computing and tracking can produce may render guerrilla movements impossible to maintain.

Assuming the technology works out, and there's no reason to think it won't, drone warfare is inevitable. As is increasing industrial automation.

Historically, the ruling class has never granted liberty or comfort to anyone it wasn't forced to.

If all they need is a handful of programmers, techs, and engineers (and there are plenty of those who are hard right and very much fans of being a new aristocracy lording it over everyone else), then everyone else is unnecessary.

If they don't need you for armies or factories how do you force them to give you material comfort and political/social liberty?

I don't have an answer, but I think the question is as critical as the moral issue of (semi) autonomous killing machines.
posted by sotonohito at 9:18 AM on September 27, 2021 [4 favorites]


I'm skeptical that the complex global supply chains needed to get the batteries, microprocessors, motors and other components for kill bots will hold up without the level of political stability needed to keep the factories running. Also, I don't think the panopticon promised by modern computing has ever been successfully delivered, despite some very large efforts by the USA and China. Sure, China has its vaunted social credit system, Great Firewall, and surveillance state; but the real way they keep a lid on things is a massive bureaucracy loyal to the Communist Party, an even bigger military, and lots of corruption and throwing money around. The US tried in Iraq and Afghanistan and it failed the moment the infantry pulled back. Israel has been trying to use this in Gaza for over a decade and it hasn't stopped periodic waves of violence. It is impossible to win a war without boots on the ground.
posted by interogative mood at 5:17 PM on September 27, 2021


Though it mostly rehashes what is available in the article, Lawfare's podcast talks about this. Specifically, what 'AI' did and did not do (spoiler: the AI was used to help compensate for the satellite-communication lag between the gun and the shooter; a toy sketch of the idea follows below). It also clarifies that the objective was to kill _only_ the one person.

The mini drones are cool: for possible end-outcomes I recommend Lem's wildly upbeat novel "The Invincible"
posted by From Bklyn at 6:17 AM on September 29, 2021
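On the satellite-lag point: as a toy illustration only, and not the actual system from the article (whose details are not public), "compensating for the lag" in the simplest case means extrapolating the target's last observed position forward by the link delay plus the round's flight time. Everything in the sketch below, including the function name and the numbers, is a hypothetical assumption.

```python
# Toy sketch of latency compensation for a remotely operated weapon.
# NOT the system from the article (those details are not public); just the
# simplest version of the idea: the operator sees the scene roughly
# link_latency_s seconds in the past, so the aim point has to be pushed
# forward in time. All names and numbers here are hypothetical.

def lead_aim_point(last_pos, velocity, link_latency_s, bullet_flight_s):
    """Linearly extrapolate where a moving target will be when the round
    arrives, given stale (delayed) observations of position and velocity."""
    dt = link_latency_s + bullet_flight_s
    return tuple(p + v * dt for p, v in zip(last_pos, velocity))

# Target last seen at (100 m, 5 m), moving 8 m/s along x; assume 1.6 s of
# satellite round-trip delay and 0.4 s of bullet flight time (made up).
print(lead_aim_point((100.0, 5.0), (8.0, 0.0), 1.6, 0.4))  # -> (116.0, 5.0)
```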


interogative mood: Remember, I'm talking about technology not yet fully developed, not the stuff available off the shelf now. Stuff that will be available in 10 to 20 years.

The US never tried implementing a full panopticon in any nation because the tech isn't there.

No one has drone swarms yet because the tech isn't there.

I don't argue that infantry will be totally, 100%, completely, eliminated. At least not for a while.

But it's like cavalry: it's going to become increasingly less relevant as time passes and tech progresses. Heck, we've got a few horses in the military even today, and real uses for them, not just parade-ground stuff. But they're really weird little edge cases, not the major factor they used to be.

And I'm presuming the political willingness to be pretty damn brutal while keeping the bloody hands at arm's length. Which I think we have, given how readily our politicians deploy the current generation of drones, the kind that murder bystanders by the dozen and miss their official targets really often.

Drones that can be fairly assured of only killing the specific person you actually want to kill and leaving bystanders merely traumatized rather than dead? I can't see any US President turning that down.

Less than $100 today buys a cheap little drone with a 20-ish minute battery life and enough reserve lift to carry an air gun loaded with cyanide-tipped darts or a lethal load of C4 for a suicide run. I own one (the drone, not the weapons). It's big and clunky compared to what I'm thinking we'll have in 20 years for a tenth the cost.

At $10/drone you can field 200,000 for the price of a single cruise missile. Or heck, at even $100/drone you can field 20,000 for the price of a single cruise missile. (Rough math sketched below.)

I can see 'Murca driving up with mobile registration stations and letting everyone know that in, say, three days anyone **NOT** wearing a US Safety Tracker will be targeted and killed by drones. Panopticon and drone kill list in one conveniently remote but brutal package.

CNN would show our brave troops helping the people of [insert random Muslim nation here] protect themselves from the wicked evil insurgency.
posted by sotonohito at 7:46 AM on September 29, 2021 [1 favorite]
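For what it's worth, the cost comparison above only works out if a cruise missile runs about $2 million per unit; that figure is an assumption (roughly the range commonly cited for a Tomahawk), not something from the thread. A quick sketch:

```python
# Back-of-the-envelope check of the drone-vs-cruise-missile arithmetic above.
# CRUISE_MISSILE_COST is an assumed unit price, not a figure from the thread.
CRUISE_MISSILE_COST = 2_000_000  # USD, roughly the order cited for a Tomahawk

for drone_cost in (10, 100):
    drones_per_missile = CRUISE_MISSILE_COST // drone_cost
    print(f"${drone_cost}/drone -> {drones_per_missile:,} drones per missile")

# Prints:
# $10/drone -> 200,000 drones per missile
# $100/drone -> 20,000 drones per missile
```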


It is impossible to win a war without boots on the ground.

Which is why airborne drones can't win a war, but can prevent an opponent from winning it as well; and also why there's a push to normalize ground drones.

The best we can hope for may be an amendment to the conventions on warfare limiting what can be deployed autonomously (i.e. rather than remotely operated).
posted by snuffleupagus at 9:21 AM on September 29, 2021


I’m skeptical that the complex global supply chains needed to get the batteries, microprocessors, motors and other components for kill bots will be possible without a level of political stability needed to keep the factories running.

Yeah, when I was looking up information about the Turkish Bayraktar TB2, I noticed that it's absolutely not an all-made-in-Turkey production; it relies on a global supply chain as much as anything else. But then, that's probably somewhat true for stuff like cruise missiles.
posted by BungaDunga at 12:53 PM on September 29, 2021


Also worth noting, as the title of this thread cites, what happens at war comes home.

Remember how CBP used one of their Predator drones to buzz the BLM protests in Minneapolis? Which brings up the questions of why CBP had a bloody Predator drone to begin with, why CBP was operating it in a US city, and why anyone thought CBP should have anything to do with a BLM protest...

Tools of war come home and become tools of oppression.

Take a nice little sniper drone like this one. Those will come home and wind up in police hands. The first uses will be very good for TV: they'll use it to snipe a hostage-taker or something similarly dramatic, where they can easily claim that while the loss of life was unfortunate, it was necessary to save the victims of crimes. Same as they did with SWAT back when it got started, before they started just using it for grins and giggles on jaywalking-level crimes.

Give people a toy, they'll look for an excuse to use it.

How about a swarm of even a few thousand drones with tasers or pepper balls? Now they can "disperse the crowd", which apparently is a function of such vital importance that it justifies any and all violence, without having human police involved at all.

I'd give you long odds that violence meted out by impersonal, AI-driven police drones will be accepted by the white moderates much more willingly than they accept violence from actual human police. With drones it'd be all high-tech, institutionalized, and nothing personal: easy to ignore, easy to think of as inherently just and impossible to abuse.

Kettle the crowd and don't even bother holding them, except for the intimidation and humiliation factor; just tag 'em with US Freedom & Safety Tags. Totally justified, since you're trying to track known rioters and keep them from gathering and rioting again, right?

Find a high-crime area, where of course "high crime" translates as "majority Black", and, for the good of the citizens of course, mandate that everyone in the special Crime Prevention Zone must wear a tracking tag: a special Police and Safety Tag with a built-in panic button, so if you're the victim of a crime you can hit the button and call in some of the roaming drones. See, it's for their own good! And it'll totally be temporary. Totally.

And of course anyone not wearing a tag in the Special Crime Prevention Zone is clearly a criminal, who will be surrounded by drones and told to surrender and stop resisting, or the drones will tase them.

The Trump crowd would cheer and swear it was the most patriotic American freedom loving thing to have ever happened, saving all those Bla- I mean "inner city" people from their own savagery and barbarism! How bold and brave of our noble police!
posted by sotonohito at 1:58 PM on September 29, 2021 [4 favorites]


There are already teargas drones
posted by BungaDunga at 2:26 PM on September 29, 2021 [2 favorites]


The tech is never going to be there. Classifiers (image recognition, sentiment scoring, etc.) all have error rates, and when you scale up to a few million people the false positives and false negatives compound until it's just garbage. (A rough numeric sketch follows below.) Improbable events happen routinely because the sample set is so large.

Even systems like payment cards (credit/debit cards), where the banking industry can observe everything going on in terms of who allegedly bought what, where, and with what level of ID verification at payment: even in this much more constrained problem domain, fraud is still a huge problem.

It is like expecting that one day we'll have a machine that can tell us the exact speed and position of an atomic particle, or that we can calculate pi to the last digit.
posted by interogative mood at 7:52 PM on September 29, 2021
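As a minimal numeric sketch of the scaling problem described in the comment above, with purely hypothetical rates (a classifier that is 99% accurate, one real target per 10,000 people scanned), here is how false positives swamp true ones at population scale:

```python
# Base-rate sketch: even a very accurate classifier produces mostly false
# alarms when real targets are rare. All numbers are hypothetical.
population = 1_000_000       # people scanned
true_targets = 100           # actual persons of interest in that population
false_positive_rate = 0.01   # 1% of innocent people incorrectly flagged
true_positive_rate = 0.99    # 99% of actual targets correctly flagged

false_positives = (population - true_targets) * false_positive_rate
true_positives = true_targets * true_positive_rate
precision = true_positives / (true_positives + false_positives)

print(f"total flags raised: {false_positives + true_positives:,.0f}")
print(f"share of flags that are real targets: {precision:.2%}")
# With these numbers: roughly 10,098 flags, of which only about 1% point
# at an actual target.
```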


At $10/drone you can field 200,000 for the price of a single cruise missile. Or heck, at even $100/drone you can field 20,000 for the price of a single cruise missile.

Is that cruise missile running off RF?
posted by clavdivs at 8:03 PM on September 29, 2021


Find a high-crime area, where of course "high crime" translates as "majority Black", and, for the good of the citizens of course, mandate that everyone in the special Crime Prevention Zone must wear a tracking tag: a special Police and Safety Tag with a built-in panic button, so if you're the victim of a crime you can hit the button and call in some of the roaming drones. See, it's for their own good! And it'll totally be temporary. Totally.

They already do stuff not totally unlike this in certain neighborhoods and housing projects in short bursts - and believe me, living right near where the 3rd precinct was burned down last year, I can tell you that things can get weird really fast. The willingness exists, and the technology to do a scaled-down version will exist soon - maybe not for everyone everywhere, but for some of us sometimes.

Police can and do cut off certain blocks or neighborhoods and demand that everyone entering them provide ID. They can and do refuse entry to non-residents. Subsidized housing is often run this way - no visitors, everyone has to show ID, etc. This happens in inner city neighborhoods, of course, not in affluent or majority-white places, but it does happen and it would not surprise me in the least if everyone in certain neighborhoods or buildings were forced to carry a tracker in the name of "stopping crime".

Like, last summer there were helicopters and planes overhead all the time and there are still regular surveillance flights. They cut off mail delivery to my neighborhood for a week, which was bad from a "getting prescriptions delivered in the pandemic" angle. Do not underestimate how weird and bad things can go.

~~

Minneapolis, in fact, is working on a project where they ask you to download a tracking app "for public transit development". Now, I have no doubt that this is still basically innocent, but it's also fucked up - if people won't get a free, life-saving vaccine because they're afraid that it is intended to sterilize or kill the poor, they're certainly not going to download a "tracking app" from the city. Which means that of course the data they get about where bus routes are needed won't be any good, because it will reflect the needs of the white professional classes who are already heavily catered-to. Further, they could just survey people in dense neighborhoods - we ride the bus and we are intimately familiar with which routes are packed, unsafe, infrequent (clue - the ones that go to/through poor neighborhoods).

But the point is, they are already thinking about tracker apps. I've lived in this city for most of my adult life and I've seen it get worse and more racist and more heavily policed, and I would not be surprised if a bunch of genuinely innocent tracker apps build up into a network of tracker apps that are mandatory for everyone or mandatory for certain zip codes. They target activists around here - sometimes seriously, like "let's get that individual because of their high profile" and sometimes in a general way, like "let's fuck with someone, what about that guy who called out police racism". That's probably what got Winston Smith murdered this summer - the cops were angry over the prosecution of Chauvin, wanted to hunt someone down and picked him because he'd been critical of the police.
posted by Frowner at 6:29 AM on September 30, 2021 [5 favorites]


The tech is never going to be there. Classifiers (image recognition, sentiment scoring, etc.) all have error rates, and when you scale up to a few million people the false positives and false negatives compound until it's just garbage.


Are you under the impression that current policing and prosecutorial systems and tactics must be somehow better in terms of their error rates to persist?

Because I may have some bad news
posted by snuffleupagus at 6:57 AM on September 30, 2021


Yes, and when you consider the extremely high rates of unsolved crimes, this supports the point I'm making. More data for police departments has had very little impact on crime rates, or on the rate of crimes solved. And I'm not even going to go into the tangent of how police would deal with anyone proposing to replace them with computers and robots.
posted by interogative mood at 7:42 PM on October 1, 2021


I don’t see today’s cops having a big problem with the ladder being pulled up behind them to keep their retirements funded. Nobody needs to get fired for the composition to change over time. They’ll romanticize being The Last Real Cops and lecture us about how this is what we got for cop hate, etc
posted by snuffleupagus at 7:47 PM on October 1, 2021


I've been staying out because I don't want to make this all about me, but...

Nothing I proposed people could do with drones actually requires accurate facial (or other) ID. It'd be a nice bonus, sure, but not essential to do oppression via drone. I was talking about using force to compel people to wear tracking tags and that drone targeting would be as simple as targeting anything human without a tracking tag.

And heck, you could also have your drones demand fingerprints, or, since we're talking 20-ish years from now, a DNA ID based on shed skin or even a little diabetes-test-style pinprick. Refusal to submit to thorough identifying measures is taken as evidence of guilt, and you're tased and arrested, or killed, depending on the rules of engagement set for the drones.

"PRESENT YOUR FINGERPRINT FOR IDENTIFICATION YOU HAVE 20 SECONDS TO COMPLY"

Once everyone is tagged for tracking, spotting associations gets a lot easier. Not 100% guaranteed (dead drops are a thing, after all, as are "chance" meetings in big public spaces), but it'd make organizing really damn hard. And the untagged are enemies of the state, to be killed out of hand or, if you're a nice oppressor, captured and interrogated, then tagged before release (or execution).

If you think like an occupying power rather than like what a police force is supposed to be, most of the objections you raise are solvable. Again, I'm not claiming that drone swarms represent a total solution for all your oppressive authoritarian needs, just that, as a tool in the box, they make being an oppressive authoritarian vastly simpler and more doable.

And I do think they'd mean the end of guerrilla war as an effective means of opposing an occupying army. If you're willing to be as brutal as the US was in Vietnam, or even Iraq and Afghanistan, then drones and tags can break any guerrilla army pretty quickly.

After all, we're talking about an occupying force that, when using modern drones, appears to have killed ten or so innocent people for every actual guerrilla it killed. An occupying force so indifferent to actually targeting real insurgents that it simply declared that any man it killed was an insurgent.
posted by sotonohito at 12:12 PM on October 2, 2021 [1 favorite]




This thread has been archived and is closed to new comments