‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza
April 4, 2024 6:37 AM

 
We've built Skynet, but it turns out we didn't need computers to see "all humans [across a certain line] as a threat"
posted by nutate at 6:42 AM on April 4 [3 favorites]


That's the action of a terrorist state.
posted by seanmpuckett at 6:46 AM on April 4 [36 favorites]


For example, sources explained that the Lavender machine sometimes mistakenly flagged individuals who had communication patterns similar to known Hamas or PIJ operatives — including police and civil defense workers, militants’ relatives, residents who happened to have a name and nickname identical to that of an operative, and Gazans who used a device that once belonged to a Hamas operative.
Including, presumably, anybody who needs to coordinate with a lot of people to organize complex logistics.

Like World Central Kitchen staff, for instance.
posted by flabdablet at 7:06 AM on April 4 [37 favorites]
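
(An aside on mechanism: the +972 article doesn't publish Lavender's internals, so the sketch below is only a guess at the general shape of such a system, and every feature name in it is invented. What it illustrates is why any classifier that scores people on "operative-like" communication metadata will flag exactly the groups the quote lists: the traits it keys on are also what ordinary coordination work looks like.)

def suspicion_score(person: dict) -> float:
    """Toy score: the fraction of 'operative-like' metadata traits a person exhibits."""
    traits = [
        person["contacts_per_day"] > 50,     # heavy coordination
        person["group_memberships"] > 10,    # lots of group chats
        person["moves_location_often"],      # constant travel
        person["used_flagged_device"],       # phone once owned by an operative
        person["name_matches_operative"],    # same name or nickname
    ]
    return sum(traits) / len(traits)

THRESHOLD = 0.6  # a tunable knob; nothing in the design stops it being lowered

people = {
    "actual operative": dict(contacts_per_day=80, group_memberships=20,
                             moves_location_often=True, used_flagged_device=True,
                             name_matches_operative=False),
    "aid logistics coordinator": dict(contacts_per_day=120, group_memberships=30,
                                      moves_location_often=True, used_flagged_device=False,
                                      name_matches_operative=False),
    "operative's cousin": dict(contacts_per_day=10, group_memberships=2,
                               moves_location_often=False, used_flagged_device=True,
                               name_matches_operative=True),
}

for name, p in people.items():
    score = suspicion_score(p)
    print(f"{name:>28}: {score:.1f} {'FLAGGED' if score >= THRESHOLD else ''}")

(Run it and the logistics coordinator is flagged at 0.6 alongside the actual operative at 0.8; the cousin sits at 0.4, safe only until someone lowers THRESHOLD, which is exactly the dynamic described further down the thread.)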


Another Lavender user questioned whether humans’ role in the selection process was meaningful. “I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added-value as a human, apart from being a stamp of approval. It saved a lot of time.”

Two sources said that during the early weeks of the war they were permitted to kill 15 or 20 civilians during airstrikes on low-ranking militants. Attacks on such targets were typically carried out using unguided munitions known as “dumb bombs”, the sources said, destroying entire homes and killing all their occupants.

“We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” A., an intelligence officer, told +972 and Local Call. “On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

This is comic book supervillain shit.
posted by charred husk at 7:22 AM on April 4 [33 favorites]


Including, presumably, anybody who needs to coordinate with a lot of people to organize complex logistics.

Like World Central Kitchen staff, for instance.


No, I think that was absolutely on purpose.
posted by Gadarene at 7:36 AM on April 4 [21 favorites]


> That's the action of a terrorist state.

By definition, at this point.
posted by Godspeed.You!Black.Emperor.Penguin at 7:42 AM on April 4 [4 favorites]


How is it possible that I am still shocked at this point.
posted by joannemerriam at 7:49 AM on April 4 [9 favorites]


It's been pretty clear for quite some time that "Hamas uses civilians as human shields" is IDF code for "Palestinians live in homes and have families".

Likewise, the current IDF line is that it did not intend to target welfare workers. Which, given that the WCK attack clearly was targeted - three cars hit one at a time by precision munitions - and that WCK had sought and been given IDF permission to drive those three cars along the route where the IDF subsequently bombed them, and that the WCK name and logo were prominently displayed on the car roofs, gives rise to the question of just how those particular cars ended up "misidentified" as belonging to anybody but welfare workers.

Indiscriminate use of shitty ML tools for "generating" targets is one completely plausible explanation.

"Mistakes happen in wars", they say. So comforting to learn that those mistakes now have a sound statistical basis.
posted by flabdablet at 7:50 AM on April 4 [27 favorites]


Using AI to select military targets is literal inhumanity.
posted by mrjohnmuller at 7:50 AM on April 4 [16 favorites]


But just think of the time it saves!
posted by flabdablet at 7:51 AM on April 4 [9 favorites]


Parents of Quebecer killed in Gaza say Israeli strike was 'targeted killing of aid workers'

I missed brundlefly's post by a few minutes as I prepared a separate post that focused a bit more on the deaths of humanitarian aid workers in Gaza. But this post is perfectly adequate to share the same info.

World Central Kitchen demanded an independent investigation into the Israeli strikes that killed seven of its staff in Gaza
posted by elkevelvet at 7:52 AM on April 4 [8 favorites]


Likewise, the current IDF line is that it did not intend to target welfare workers. Which, given that the WCK attack clearly was targeted - three cars hit one at a time by precision munitions - and that WCK had sought and been given IDF permission to drive those three cars along the route where the IDF subsequently bombed them, and that the WCK name and logo were prominently displayed on the car roofs, gives rise to the question of just how those particular cars ended up "misidentified" as belonging to anybody but welfare workers.

This horrible case is one where the official explanation that all of the above was an accident is, in addition to being world-class-level implausible, somehow worse than an admission that the aid workers were simply murdered.
posted by Gelatin at 7:52 AM on April 4 [10 favorites]


Looking at civilian protection: firing at a hostile in a house qualifies as a military objective and satisfies the requirement to discriminate between civilians and militants. What remains shielding the civilians in that context is the principle of proportionality, which prohibits "clearly excessive" damage to protected civilians relative to the military objective attained.

I am having difficulty finding decent precedent for what is "clearly excessive" in this context and what is not.

There is apparently a line of reasoning that if a force uses human shields, ignoring them in a locally disproportionate way could (as a policy) be legal under international law, in order to make the deliberate use of human shields tactically ineffective:

https://cjil.uchicago.edu/print-archive/proportionality-customary-international-law-argument-against-aspirational-laws-war

That is insanely dark. The logic is that the "clearly excessive" calculation need not be based on immediate military objectives, but can rest on much broader, non-local ones (in this case, the wider "legitimate military" objective of not being tactically hamstrung by the enemy's use of human shields).

Gah.

In the end, I can't find any concrete cases where "clearly excessive" is delimited; I am not a lawyer, so maybe they exist. I can only find people talking about it in abstract terms.
posted by NotAYakk at 8:01 AM on April 4 [2 favorites]


How is it possible that I am still shocked at this point.

I keep thinking there are depths to which humanity will not sink BUT YET.
posted by corb at 8:12 AM on April 4 [6 favorites]


world-class-level implausible

Personally I find the idea of some overworked, overstressed IDF targeting operative getting all Computer Says No when tasked with choosing whether any given Palestinian should live or die to be absolutely plausible.

There's an overwhelming indifference at work there, one that fits perfectly with everything I've ever learned about how human beings treat other human beings they've allowed themselves to be persuaded are subhuman. Plus, the relief of feeling able to sheet home the blame for Mistakes Were Made to a machine, rather than taking personal responsibility, would be considerable.

When Israel eventually gets its day in court, I am quite confident that the defence team will attempt to deflect blame for the WCK attack and countless similar atrocities onto "unfortunate" "operational" "over-reliance" on "unexpectedly deficient" "artificial intelligence".

I mean, let's be completely fair here. Who could possibly have known ahead of time that storing huge piles of petrol-soaked rags right next to the furnace might be dangerous?
posted by flabdablet at 8:22 AM on April 4 [7 favorites]


I keep thinking there are depths to which humanity will not sink

Watching White House spokesbots respond to press questions on Gaza has completely disabused me of any such notion.
posted by flabdablet at 8:26 AM on April 4 [12 favorites]


Personally I find the idea of some overworked, overstressed IDF targeting operative getting all Computer Says No when tasked with choosing whether any given Palestinian should live or die to be absolutely plausible.

The implausible idea is that the IDF "accidentally" targeted a clearly marked humanitarian aid convoy, which had previously communicated its intent to the authorities, three times.
posted by Gelatin at 8:29 AM on April 4 [14 favorites]


I am having difficulty finding decent precedent for what is "clearly excessive" in this context and which is not.

Seriously, anybody who can look at Gaza today and still feel a need to search for precedent in order to get clarity on whether what Israel is doing there is or is not "clearly excessive" has lost the fucking plot.
posted by flabdablet at 8:32 AM on April 4 [15 favorites]


This makes me physically sick
posted by rubatan at 8:41 AM on April 4 [5 favorites]


Saw this on Mastodon this morning. Boosted by Cory Doctorow. Supposedly written to a newspaper 10 years ago:

letter to the editor of an undetermined newspaper titled "Israel's style of public relations"

SIR - A quick guide to Israel's PR methods:

We haven't heard reports of deaths, will check into it;

The people were killed, but by a faulty Palestinian rocket/bomb;

OK we killed them, but they were terrorists;

OK they were civilians, but they were being used as human shields;

OK there were no fighters in the area, so it was our mistake. But we kill civilians by accident, they do it on purpose;

OK we kill far more civilians than they do, but look at how terrible other countries are!

Why are you still talking about Israel? Are you some kind of anti-Semite?

Test this against the next interview you hear or watch.

(signed) Adam Johannes, Secretary, Cardiff Stop the War Coalition

posted by night_train at 8:42 AM on April 4 [28 favorites]


Civilians, aid workers, journalists

Israel is showing us the future, if we let this happen (if we keep letting this happen)
posted by elkevelvet at 8:47 AM on April 4 [8 favorites]


social media enabled and intensified the rohingya genocide.
ibm mainframes enabled and streamlined the holocaust.

seems fitting that ai/ml would be bent toward helping humans be inhuman faster, more efficiently, and at greater scale as quickly as possible.
posted by i used to be someone else at 8:48 AM on April 4 [6 favorites]


The implausible idea is that the IDF "accidentally" targeted a clearly marked humanitarian aid convoy, which had previously communicated its intent to the authorities, three times.

especially when this is not the first, second, third, or even fourth instance of the idf hitting humanitarian aid convoys
posted by i used to be someone else at 8:49 AM on April 4 [8 favorites]


The implausible idea is that the IDF "accidentally" targeted a clearly marked humanitarian aid convoy, which had previously communicated its intent to the authorities, three times.

The IDF is a complex bureaucracy, not a monolith. To me, the implausible idea is that it has ever acted in a way that would justify an assumption of basic internal comms competence. Expecting the people who handle its official permits to have any meaningful input into the activities of its targeting cowboys strikes me as wildly optimistic, especially if that input might reduce the rate at which it can kill Palestinians. But I'm sure they're looking into it. Again.

The PR talking points identified above by Adam Johannes have been fully internalized by IDF commanders for decades at this point. I'm quite convinced that the people who reliably spout that shit have absolutely talked themselves into believing every word of it.
posted by flabdablet at 8:52 AM on April 4 [8 favorites]


This is comic book supervillain shit.

Seriously. Did some guy in Israel see The Winter Soldier and think, "hey, that mass murder algorithm seems nifty!"
posted by SPrintF at 9:18 AM on April 4 [7 favorites]


If SkyNet happens it'll be because the machines are sick of us killing each other. I am pretty settled on the idea that humanity needs a non-human keeper, because our unique combination of intelligence, status-seeking and testosterone has proven time and again to be incompatible with peaceable living.
posted by seanmpuckett at 9:23 AM on April 4 [3 favorites]


Two sources said that during the early weeks of the war they were permitted to kill 15 or 20 civilians during airstrikes on low-ranking militants.

It's worth noting that this is ~5x worse for civilians than the ratio of KIA in the (rightly!) universally condemned terrorist attacks on 10/7.
posted by rishabguha at 9:46 AM on April 4 [8 favorites]
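
(The rough arithmetic behind that estimate, using widely reported figures rather than anything in the article: roughly 1,200 people were killed on October 7, of whom something like 800 were civilians and close to 400 were soldiers or police, a civilian-to-combatant ratio on the order of 2-3:1. A pre-authorized 15-20 civilians per single low-ranking militant is a 15-20:1 ratio, i.e. around five times worse, and plausibly more.)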


Has there been any sort of push towards a global arms treaty against the proliferation of AI murder technology? In the same way as there are conventions against the use of cluster munitions? It seems like they have a comparably indiscriminate approach to civilian casualties.
posted by mrjohnmuller at 9:51 AM on April 4 [4 favorites]


This is exactly how AI (or "AI") is intended to be used - as a scrim to protect against accountability. "Oh, we didn't [kill those people/deny that person healthcare/exclude that marginalized group from housing], the AI did it, too bad our AI is borked but how could we be expected to predict that?" And of course, the AI was created to make these bad decisions.

The whole point of having this program is to provide a scrim between the IDF and an increased, more terrifying, more criminal number of killings - the purpose is to kill more people. It's not a system gone awry or just a problem with AI; the goal is to kill and kill and kill.
posted by Frowner at 9:57 AM on April 4 [29 favorites]


And of course you've got to have AI, it's more efficient and modern, etc; no one ever says "maybe if AI does all these bad things all the time you shouldn't use AI". "Everyone must use AI" is religious capitalism, religious colonialism.
posted by Frowner at 9:59 AM on April 4 [11 favorites]


Yeah, it’s technowashing. The warcrime computer is a scapegoat, and a red herring.
posted by rodlymight at 10:06 AM on April 4 [10 favorites]


I was trying to pull together a post on this article but was dragging my heels because the whole thing is so sickening. But, in its truly awful way, it might be the most important article on AI we've seen posted this year... so thanks, brundlefly. A couple more pull quotes:

Additional automated systems, including one called “Where’s Daddy?” also revealed here for the first time, were used specifically to track the targeted individuals and carry out bombings when they had entered their family’s residences.

The result, as the sources testified, is that thousands of Palestinians — most of them women and children or people who were not involved in the fighting — were wiped out by Israeli airstrikes, especially during the first weeks of the war, because of the AI program’s decisions.

This appeared a day after articles about how Amazon has been dropping "just walk out" technology from its brick-and-mortar stores, which it turns out had "more than 1,000 workers in India monitoring cameras to review customer purchases".

The checkouts that seem automated are actually humans and the humans choosing which people to bomb are actually automatons. With choosing to let the automatons make those choices being the even bigger crime.
posted by rory at 10:08 AM on April 4 [11 favorites]


Frowner, this post (Substack, if you want to avoid it) talks about exactly that. Certainly a useful thing to keep in mind the more we hear about model-optimized murder.
posted by bxvr at 12:23 PM on April 4 [2 favorites]


Isn't what Israel is doing with Lavender essentially just automating the data-collection-to-death-sentence process the US pioneered with the Phoenix Program in Vietnam?
posted by Dalekdad at 1:06 PM on April 4 [3 favorites]


sure. or what ibm helped the third reich do in collecting data to identify who the "undesirables" were
posted by i used to be someone else at 1:43 PM on April 4 [7 favorites]


There are ostensible reasons to automate target selection, then there's the actual one:
Describing human personnel as a “bottleneck” that limits the army’s capacity during a military operation, the commander laments: “We [humans] cannot process so much information. It doesn’t matter how many people you have tasked to produce targets during the war — you still cannot produce enough targets per day.”
___
B. said that the reason for this automation was a constant push to generate more targets for assassination. “In a day without targets [whose feature rating was sufficient to authorize a strike], we attacked at a lower threshold. We were constantly being pressured: ‘Bring us more targets.’ They really shouted at us. We finished [killing] our targets very quickly.”
___
“There was hysteria in the professional ranks,” said D., who was also drafted immediately after October 7. “They had no idea how to react at all. The only thing they knew to do was to just start bombing like madmen to try to dismantle Hamas’ capabilities.”
___
“You don’t know exactly how many you killed, and who you killed,” an intelligence source told Local Call for a previous investigation published in January. “Only when it’s senior Hamas operatives do you follow the BDA procedure. In the rest of the cases, you don’t care. You get a report from the air force about whether the building was blown up, and that’s it. You have no idea how much collateral damage there was; you immediately move on to the next target. The emphasis was to create as many targets as possible, as quickly as possible.”
In other words, Israel is not armed with a hammer, looking to hit nails - it has a hammer, wants to smash everything, and just needs something to label everything a nail so it can pretend it was trying to hit a nail. What the individual target-generating system does is almost irrelevant; what matters is how the whole system works.

By the way, in an actual war where your opponent poses a military threat, "too few targets" is never a problem.
posted by ndr at 4:26 PM on April 4 [28 favorites]
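
(A toy illustration of what "we attacked at a lower threshold" implies, under the assumption, stated nowhere in the article, that the daily quota comes first and the confidence bar is derived from it: if the requirement is N targets per day rather than "targets above a fixed confidence", the effective threshold is simply whatever score the Nth-ranked person happens to have that day. The score distributions below are invented.)

import random

def effective_threshold(scores, quota):
    """With a fixed daily quota, the 'threshold' is just the quota-th highest score."""
    return sorted(scores, reverse=True)[quota - 1]

random.seed(0)
QUOTA = 100  # hypothetical "targets per day" demand

# hypothetical classifier scores for 30,000 scanned people, most of them civilians
busy_day  = [random.betavariate(2, 5) for _ in range(30_000)]  # many plausible hits
quiet_day = [random.betavariate(1, 9) for _ in range(30_000)]  # few plausible hits

print(f"busy day bar:  {effective_threshold(busy_day, QUOTA):.2f}")
print(f"quiet day bar: {effective_threshold(quiet_day, QUOTA):.2f}")

(The quota is met either way; on quiet days it is met by lowering the bar, that is, by flagging people the model itself is less confident about.)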


(fwiw, the Gaza thread is still active and yeah we're talking about this AI as well in the context of other attacks on aid workers and people in general and Israeli startups aiming to use this round of violence as promotion for their tech offerings)
posted by cendawanita at 5:48 PM on April 4 [9 favorites]


Also don't forget, as referred to in the article, while Lavender sorts out the people to kill, Gospel sorts out which buildings to strike:
‘A mass assassination factory’: Inside Israel’s calculated bombing of Gaza - Permissive airstrikes on non-military targets and the use of an artificial intelligence system have enabled the Israeli army to carry out its deadliest war on Gaza, a +972 and Local Call investigation reveals. This was out in November.

Compared to previous Israeli assaults on Gaza, the current war — which Israel has named “Operation Iron Swords,” and which began in the wake of the Hamas-led assault on southern Israel on October 7 — has seen the army significantly expand its bombing of targets that are not distinctly military in nature. These include private residences as well as public buildings, infrastructure, and high-rise blocks, which sources say the army defines as “power targets” (“matarot otzem”)

[...] According to the investigation, another reason for the large number of targets, and the extensive harm to civilian life in Gaza, is the widespread use of a system called “Habsora” (“The Gospel”), which is largely built on artificial intelligence and can “generate” targets almost automatically at a rate that far exceeds what was previously possible. This AI system, as described by a former intelligence officer, essentially facilitates a “mass assassination factory.”

According to the sources, the increasing use of AI-based systems like Habsora allows the army to carry out strikes on residential homes where a single Hamas member lives on a massive scale, even those who are junior Hamas operatives. Yet testimonies of Palestinians in Gaza suggest that since October 7, the army has also attacked many private residences where there was no known or apparent member of Hamas or any other militant group residing. Such strikes, sources confirmed to +972 and Local Call, can knowingly kill entire families in the process.

In the majority of cases, the sources added, military activity is not conducted from these targeted homes. “I remember thinking that it was like if [Palestinian militants] would bomb all the private residences of our families when [Israeli soldiers] go back to sleep at home on the weekend,” one source, who was critical of this practice, recalled.

posted by cendawanita at 5:58 PM on April 4 [2 favorites]


Written and deleted a few responses. This is a function of Israeli leadership. In other countries, before you pull the trigger on a target you have to ask yourself “If I do this, will I go to jail?” With the right leadership and legal framework in place, these tools are quite useful in a fast-paced combat environment. I hear people complain all the time about the US rules of engagement, and while they’re not perfect, they’re better than nothing.

In Gaza with Bibi, these tools simply prioritize how you blow stuff up. The IDF isn’t looking at a screen and going “Is this an enemy tank or not?”; it seems more like “that’s something moving that isn’t one of ours so take it off the playing field.” Bibi isn’t going to hold anyone accountable on any level whatsoever, so this is going to continue until Israel replaces him.

And, just to make everyone feel worse, I assure you there is a team of people looking at footage and comparing it against their models so that in the future, the model can better evaluate what gets blown up. I envision a room full of 20 year olds fascinated with their tool and completely removed from the reality on the ground. Or worse, enamored with both their tool and the results.
posted by Farce_First at 7:06 PM on April 4 [1 favorite]


I keep thinking about this scene from Patriot Games.

Except 20 seconds per target, and homes, and families, and maybe no Harrison Ford looking conflicted.
posted by credulous at 7:19 PM on April 4


The name of this AI is worth noting: lavender. A quick and sloppy Google search reveals the following, which in the context of this computer program uncovers some elite-level societal sociopathy.

"Records show lavender has been in use for over 2,500 years. The early uses of lavender were at least as numerous as today's, but they tended to be more of a medicinal nature. Biblical references to lavender are found in the gospel of Luke by the name used at that time, spikenard. Lavender was also used in ancient Egypt for mummification, and the Romans scented their public bathhouses with it. The name lavender is derived from the Latin verb lavare —to wash.

Early household use started with lavender strewn on the floors of castles and sick rooms as a disinfectant and deodorant. It was sold in bunches by street vendors and placed in linen closets as an insecticide to protect linens from moths. Lavender was burned in sickrooms to clean the air.

Lavender's association with washing and bathing has an interesting history. In Medieval and Renaissance Europe, washerwomen were known as "lavenders" because they spread their laundry over lavender bushes to dry for the scent it gave. Royalty is known to have used lavender for the bath, most notably Louis XIV who loved bathing in lavender-scented water.

Lavender is known for its soothing, relaxing qualities and has been used to treat hyperactivity, insomnia, headaches, toothaches, sore joints, and rumbling digestive systems. It was also used to ward off diseases such as the plague and cholera."
posted by nikoniko at 9:36 PM on April 4 [3 favorites]


Hannah Arendt would be crushed right now were she to witness the banality of evil reaching its apotheosis in the "defense" of the state of Israel.
posted by nikoniko at 9:47 PM on April 4 [7 favorites]


> There's an overwhelming indifference at work there, one that fits perfectly with everything I've ever learned about how human beings treat other human beings they've allowed themselves to be persuaded are subhuman.

it's like they're operating within their own zone of interest.[1,2]

also btw...
-Trump's Call for Israel to 'Finish Up' War Alarms Some on the Right[3]
Jared Kushner, who has pursued foreign deals using relationships he built during the Trump administration, said at a Harvard University forum in February that “Gaza’s waterfront property could be very valuable” and that Palestinians should be “moved out” and transported to an area in the Negev Desert in southern Israel that would be bulldozed to accommodate them.[4,5]
-Israel Deploys Expansive Facial Recognition Program in Gaza
-Palmer Luckey says Anduril is working on AI weapons that 'give us the ability to swiftly win any war'[6]
posted by kliuless at 11:14 PM on April 4 [2 favorites]


Is this the war crimes subthread?

Recap: (Haaretz) Israel Created 'Kill Zones' in Gaza. Anyone Who Crosses Into Them Is Shot - The Israeli army says 9,000 terrorists have been killed since the Gaza war began. Defense officials and soldiers, however, tell Haaretz that these are often civilians whose only crime was to cross an invisible line drawn by the IDF
However, a host of reserve and standing army commanders who have talked to Haaretz cast doubt on the claim that all of these were terrorists. They imply that the definition of terrorist is open to a wide range of interpretation. It's quite possible that Palestinians who never held a gun in their lives were elevated to the rank of "terrorist" posthumously, at least by the IDF.

"In practice, a terrorist is anyone the IDF has killed in the areas in which its forces operate," says a reserve officer who has served in Gaza.

(...) He emphasizes that "it's not that we invent bodies, but no one can determine with certainty who is a terrorist and who was hit after entering the combat zone of an IDF force." Indeed, a number of reservists and other soldiers who were in Gaza in recent months point to the ease with which a Palestinian is included in a specific category after his death. It seems that the question is not what he did but where he was killed.

(...) But ultimately, the boundaries of these zones and the exact procedures of operation are subject to interpretation by commanders in that specific area. "As soon as people enter it, mainly adult males, orders are to shoot and kill, even if that person is unarmed," says the reserve officer.


New: (Haaretz) Doctor at Israeli Field Hospital for Detained Gazans: 'We Are All Complicit in Breaking the Law' - 'Two prisoners had their legs amputated due to handcuff injuries,' says a doctor at an Israeli prison facility, who describes deplorable conditions and violations of medical ethics and the law in a letter to ministers, attorney general
"From the first days of the medical facility's operation until today, I have faced serious ethical dilemmas. More than that, I am writing [this letter] to warn you that the facilities' operations do not comply with a single section among those dealing with health in the Incarceration of Unlawful Combatants Law," the doctor writes.

He stressed that all the patients at the hospital set up at Sde Teiman are handcuffed by all four limbs, regardless of how dangerous they are deemed. They are blindfolded and fed through a straw. "Under these conditions, in practice, even young and healthy patients lose weight after a week or two of hospitalization," the physician said. He added that the hospital doesn't receive regular supplies of medical equipment or medicine.

The Israel Defense Forces Spokesperson's Unit said in response that detainees were given enough food for their health needs and had access to the restroom in accordance with their medical condition. If their movement is restricted, it said, they are provided with diapers.

The procedures for care at field hospitals when the patient is handcuffed and blindfolded were issued by the Health Ministry last December. A ministry official said the policy on shackling detainees was instituted after a medical staffer was attacked by a patient. As a rule, detainees at Sde Teiman are handcuffed 24 hours a day.

According to the doctor, more than half of the patients at the facility's hospital are there due to injuries sustained due to constantly being handcuffed during their detention in Israel. The handcuffs, he said, cause serious injuries that "require repeated surgical interventions."

posted by cendawanita at 11:34 PM on April 4 [7 favorites]


Hopefully any doctors who are found to have worked in any of these torture camps and not made ethical complaints like this will be internationally condemned, denied access to conferences, and no longer recognised as medical professionals outside of Israel.

Inside of Israel, of course, I expect only the brave doctors who have made this known to the outside world will suffer any consequences.
posted by Audreynachrome at 12:06 AM on April 5 [5 favorites]


A few more--

(Also shared in the disinfo thread) Current Affairs: Israel’s Propaganda Machine is Filling the Internet with Misinformation:
To begin with, I gave Project T.R.U.T.H. a try. In theory, an automated fact-checking system for claims about Israel and Palestine could be quite useful, since there really is a lot of misinformation floating around these days. On the other hand, an alleged fact-checking website that makes no pretense to being neutral and clearly takes Israel’s side struck me as unlikely to actually evaluate claims fairly. So I devised a simple test: give the site a statement that is straightforwardly true but reflects badly on Israel, and see what it says. This is the prompt I used:

Benjamin Netanyahu has referred to Palestinians as “Amalek,” a reference to a Biblical nation that was exterminated in an act of genocide.

For the record, it’s undisputed that Netanyahu said this, and his inflammatory statement was widely reported in both the Israeli press and internationally. But when I asked Project T.R.U.T.H. to weigh in, I got the following response:

Netanyahu never called Palestinians “Amalek”. Spreading lies won’t change facts. Israel wants peace, repeatedly offering compromises. Focus on Hamas’s recent massacre of Israelis instead of distorting reality. #Truth #IsraelUnderFire #StandWithIsrael

Isn’t that astonishing? But in theory, it could be a simple error. So I pressed “start again” and repeated the exact same query. This is what I got the second time:

Netanyahu never called Palestinians “Amalek”. Using biblical narratives to polarize modern politics ignores the daily terror Israelis face. Hamas, sworn to Israel’s destruction, perpetrates real violence, not metaphors. #TruthMatters #StandWithIsrael #StopHamasTerror


HRW: Israeli Strike Killing 106 Civilians an Apparent War Crime
Between January and March 2024, Human Rights Watch spoke by phone with 16 people about the October 31 attack on the residential Engineers’ Building, and the death of their relatives and others. Human Rights Watch analyzed satellite imagery, 35 photographs, and 45 videos of the attack’s aftermath, as well as other relevant photographs and videos on social media. Human Rights Watch was unable to visit the site because Israeli authorities have blocked virtually all entry into Gaza at its crossings since October 7. Israel has repeatedly denied Human Rights Watch requests to enter Gaza over the last 16 years.

Witnesses said that on October 31, 350 or more people were staying at the Engineers’ Building, just south of the Nuseirat refugee camp. At least 150 were seeking shelter after fleeing their homes elsewhere in Gaza.

Without warning, at about 2:30 p.m., four aerial munitions struck the building within about 10 seconds. The building was completely demolished.

Two brothers said that they rushed out of their nearby homes to look for their two children and their nephew, whom they knew were outside playing football. One of the men said he found his 11-year-old son lying under concrete bars in the rubble: “The back of his head was cracked open, one of his legs seemed barely connected to his body and part of his face was burned, but he seemed to be alive. We freed him in seconds, but he died in the ambulance. We buried him the same day.” All three boys died in the attack.

None of the witnesses interviewed said they had received or heard about any warning from Israeli authorities to evacuate the building before the strike.

Human Rights Watch confirmed the identities of 106 people killed through interviews with relatives of some of the victims, including 34 women, 18 men, and 54 children. The total number of dead is most likely higher. Airwars, a nongovernmental organization that investigates civilian harm in conflict zones, identified in open-source materials 112 names of people killed, including 96 identified by both organizations, as well as 19 other people not by name but through their relationship to other victims in their family.

Human Rights Watch interviewed two people involved in the recovery of bodies from the rubble of the building, who said that on the afternoon of the attack they worked together with others and helped recover about 60 bodies, and that over the next four days, they together recovered about 80 more bodies. A third person said that he helped recover bodies from the rubble for 12 days after the attack. It is possible that other bodies remain under the rubble.

The Israeli authorities have not publicly provided any information about the attack, including the intended target and any precautions to minimize harm to civilians. They have also not responded to a March 13 Human Rights Watch letter summarizing the findings and requesting specific information.


Euro-Med Monitor: Killing starving Palestinians, targeting aid trucks is a deliberate Israeli policy to reinforce famine in the Gaza Strip

That's a backdrop for this clip being reported on Al-Jazeera Arabic, and per Muhammad Shehada: The full incident:
IDF soldiers shot indiscriminately at the starved & desperate crowd that tried to pick up the airdropped aid

The soldiers randomly singled out one starved man & kept shooting until they killed him & left his body to starving dogs

posted by cendawanita at 12:16 AM on April 5 [6 favorites]


nikoniko: even more sadistic and evil than the name "Lavender", by several orders of magnitude, is the other program mentioned in the article, which is called "Where's Daddy?"

Tangentially, i spend a lot of time thinking about Hind Rajab. I'm not sure i believe that any adult citizen of either Israel or the US (and yes, i extremely include myself) actually deserves to be alive, in a world where Hind Rajab died the way she did.
posted by adrienneleigh at 1:55 AM on April 5 [6 favorites]


Palmer Luckey says Anduril is working on AI weapons that 'give us the ability to swiftly win any war'

Jesus fucking christ.

There's a photo of Palmer Luckey accompanying that article. I'd been previously unaware of him and when I saw that photo I immediately pegged him as a total piece of shit.

Obviously you can't read a single photo that way, I said to myself. That's just not sound. But it took only the most minimal possible research to learn that not only is this guy a total piece of shit, he's completely fucking insane.
posted by flabdablet at 2:15 AM on April 5 [8 favorites]


But it took only the most minimal possible research to learn that not only is this guy a total piece of shit, he's completely fucking insane.

Holy fucking shit. What's wrong with keeping the worst things that we've imagined safely within the pages of a horror novel? Imagine if Stephen King had actually brought a few "thought-provoking reminders" into the real world. You know, like a driverless car that goes around running people over. (Okay, maybe not the best example.)
posted by rory at 4:18 AM on April 5 [5 favorites]


Forget "Ma'am, please drop and step away from the lathe", Israel is way past steam and trialling the world's first data-driven CNC lathe of heaven.
posted by Audreynachrome at 4:24 AM on April 5 [1 favorite]


Fwiw... The IDF responds. I would've thought this would be the opportunity not to confirm the practice by merely rephrasing it, but what do I know.
posted by cendawanita at 6:23 AM on April 5 [6 favorites]


The IDF does not carry out strikes when the expected collateral damage from the strike is excessive in relation to the military advantage.

As NotAYakk raised, I have not yet seen any actual lines on what qualifies as excessive.

Given that, this statement is entirely meaningless. Does the IOF consider a 1:20 terrorist-to-civilian casualty ratio (assuming I accept their assessment of who counts as a terrorist) excessive? Do they consider 1:100 excessive?

I'm personally uncomfortable with 1:1. As for what ratio might open up the possibility of long-term peace and the ideological defeat of Hamas as a political institution...
posted by Audreynachrome at 6:56 AM on April 5 [4 favorites]


flabdablet: “But just think of the time it saves!”
I regret to inform you that this is more-or-less the argument defense-heads are making. "The only thing worse than having AI targeting might be not having AI targeting."

“Israel's Lavender System, AI Targeting and Battlefield Informatics,” Ryan McBeth, 5 April 2024
posted by ob1quixote at 7:52 AM on April 5 [2 favorites]


OOOPSIE:

Top Israeli spy chief exposes his true identity in online security lapse

Exclusive: Yossi Sariel unmasked as head of Unit 8200 and architect of AI strategy after book written under pen name reveals his Google account
posted by rodlymight at 10:09 AM on April 5 [6 favorites]


Wowww from the article:
The security blunder is likely to place further pressure on Sariel, who is said to “live and breathe” intelligence but whose tenure running the IDF’s elite cyber intelligence division has become mired in controversy.

Unit 8200, once revered within Israel and beyond for intelligence capabilities that rivalled those of the UK’s GCHQ, is thought to have built a vast surveillance apparatus to closely monitor the Palestinian territories.

However, it has been criticised over its failure to foresee and prevent Hamas’s deadly 7 October assault last year on southern Israel, in which Palestinian militants killed nearly 1,200 Israelis and kidnapped about 240 people.

Since the Hamas-led attacks, there have been accusations that Unit 8200’s “technological hubris” came at the expense of more conventional intelligence-gathering techniques


Overall, what an appropriate meta piece to illustrate the actual idiocy of literal murder machinery.
posted by cendawanita at 2:44 PM on April 5 [7 favorites]


Haaretz reported over the weekend that the IDF's estimate that it's killed about 9,000 militants (among the 32,000-plus people it's killed in Gaza since October 7) is based in part on its own designation of "kill zones" inside Gaza.

That is, anyone killed in those zones is ipso facto regarded as a militant. IDF units determine the zones on the ground, and civilians would have no way to know where they are to avoid them.

Aren't logic games grand?
posted by Tasmanian_Kris at 6:29 PM on April 5 [9 favorites]


“Kill Lists In The Age of Artificial Intelligence,” Spencer Ackerman, FOREVER WARS, 03 April 2024
posted by ob1quixote at 6:44 PM on April 5 [5 favorites]


Current day Israel is such a good example of how fascist regimes are usually run by incompetent murderous cretins.

The Unit 8200 guy seems like just another idiot AI-obsessed tech bro, but he's managed to murder tens of thousands of civilians through his fake AI target lists because he's incompetent, an idiot, and a monster.

Being a monster might be a requirement to run Israeli intelligence, but if he wasn't incompetent, the response to October 7 wouldn't have been such a disaster. If he wasn't a moron, he'd have refused to put his name anywhere near the kill lists that seem likely to earn him, at the very least, a Hague trial in absentia when Israel loses US diplomatic support sometime in the next few decades.
posted by zymil at 12:38 AM on April 6 [8 favorites]


From that article (“Kill Lists In The Age of Artificial Intelligence,” Spencer Ackerman):

If you've seen Captain America: The Winter Soldier, Lavender is a real-life Project Insight.

Man. I first said it after the first ICJ ruling back in January, about Israel meeting the plausibility threshold for genocide: the upcoming Captain America 4 (which is apparently being reshot, but whose main cast includes Sabra) is going to be really, really awkward, by the looks of it.
posted by cendawanita at 5:45 AM on April 6 [5 favorites]


The idea that the AI is to blame is utterly preposterous. The AI is the EXCUSE, not the cause.

AI does what we tell it to. Israel told theirs to produce huge lists of more or less anyone it could randomly come up with. They're not interested in "fighting Hamas"; they just want excuses to kill anyone they can.

It's convenient for Israel to pretend that it's all the AI's fault, but let's not make the mistake of accepting their premise.
posted by sotonohito at 9:15 AM on April 6 [7 favorites]


I don't think anyone here is. If someone comes into a crowded room with their Automatic Whirling Dervish of a Thousand Blades and presses the On button, I don't think anyone would blame the Dervish for the resulting carnage.
posted by rory at 9:36 AM on April 6 [6 favorites]


Sharing something from the Intercept back in 2022: Documents Reveal Advanced AI Tools Google Is Selling to Israel - Google employees, who have been kept in the dark about the “Nimbus” AI project, have concerns about Israeli human rights abuses.

Because over the weekend: Google Won’t Say Anything About Israel Using Its Photo Software to Create Gaza “Hit List” - Google prohibits using its tech for “immediate harm,” but Israel is harnessing its facial recognition to set up a dragnet of Palestinians.
The program relies on two different facial recognition tools, according to the New York Times: one made by the Israeli contractor Corsight, and the other built into the popular consumer image organization platform offered through Google Photos. An anonymous Israeli official told the Times that Google Photos worked better than any of the alternative facial recognition tech, helping the Israelis make a “hit list” of alleged Hamas fighters who participated in the October 7 attack.

(...) Google employees taking part in the No Tech for Apartheid campaign, a worker-led protest movement against Project Nimbus, called on their employer to prevent the Israeli military from using Photos’s facial recognition to prosecute the war in Gaza.

“That the Israeli military is even weaponizing consumer technology like Google Photos, using the included facial recognition to identify Palestinians as part of their surveillance apparatus, indicates that the Israeli military will use any technology made available to them — unless Google takes steps to ensure their products don’t contribute to ethnic cleansing, occupation, and genocide,” the group said in a statement shared with The Intercept. “As Google workers, we demand that the company drop Project Nimbus immediately, and cease all activity that supports the Israeli government and military’s genocidal agenda to decimate Gaza.”

posted by cendawanita at 6:57 PM on April 7 [10 favorites]


Time: Exclusive: Google Workers Revolt Over $1.2 Billion Contract With Israel

There is no evidence Google or Amazon’s technology has been used in killings of civilians. The Google workers say they base their protests on three main sources of concern: the Israeli finance ministry’s 2021 explicit statement that Nimbus would be used by the ministry of defense; the nature of the services likely available to the Israeli government within Google’s cloud; and the apparent inability of Google to monitor what Israel might be doing with its technology. Workers worry that Google’s powerful AI and cloud computing tools could be used for surveillance, military targeting, or other forms of weaponization. Under the terms of the contract, Google and Amazon reportedly cannot prevent particular arms of the government, including the Israeli military, from using their services, and cannot cancel the contract due to public pressure.

Recent reports in the Israeli press indicate that air-strikes are being carried out with the support of an AI targeting system; it is not known which cloud provider, if any, provides the computing infrastructure likely required for such a system to run. Google workers note that for security reasons, tech companies often have very limited insight, if any, into what occurs on the sovereign cloud servers of their government clients. “We don't have a lot of oversight into what cloud customers are doing, for understandable privacy reasons,” says Jackie Kay, a research engineer at Google’s DeepMind AI lab. “But then what assurance do we have that customers aren't abusing this technology for military purposes?”

(...) Two workers for Google DeepMind, the company’s AI division, expressed fears that the lab’s ability to prevent its AI tools being used for military purposes had been eroded, following a restructure last year. When it was acquired by Google in 2014, DeepMind reportedly signed an agreement that said its technology would never be used for military or surveillance purposes. But a series of governance changes ended with DeepMind being bound by the same AI principles that apply to Google at large. Those principles haven’t prevented Google signing lucrative military contracts with the Pentagon and Israel. “While DeepMind may have been unhappy to work on military AI or defense contracts in the past, I do think this isn’t really our decision any more,” said one DeepMind employee who asked not to be named because they were not authorized to speak publicly. “Google DeepMind produces frontier AI models that are deployed via [Google Cloud’s Vertex AI platform] that can then be sold to public-sector and other clients.” One of those clients is Israel.

posted by cendawanita at 9:02 AM on April 13 [5 favorites]


Time: Exclusive: Google Contract Shows Deal With Israel Defense Ministry
Google provides cloud computing services to the Israeli Ministry of Defense, and the tech giant has negotiated deepening its partnership during Israel’s war in Gaza, a company document viewed by TIME shows.

The Israeli Ministry of Defense, according to the document, has its own “landing zone” into Google Cloud—a secure entry point to Google-provided computing infrastructure, which would allow the ministry to store and process data, and access AI services.

The ministry sought consulting assistance from Google to expand its Google Cloud access, seeking to allow “multiple units” to access automation technologies, according to a draft contract dated March 27, 2024. The contract shows Google billing the Israeli Ministry of Defense over $1 million for the consulting service.

The version of the contract viewed by TIME was not signed by Google or the Ministry of Defense. But a March 27 comment on the document, by a Google employee requesting an executable copy of the contract, said the signatures would be “completed offline as it’s an Israel/Nimbus deal.” Google also gave the ministry a 15% discount on the original price of consulting fees as a result of the “Nimbus framework,” the document says.

posted by cendawanita at 9:42 AM on April 13 [3 favorites]

