
Israel disputes claims it uses Lavender, AI program for targeted killing that tolerates civilian casualties

Israel is aggressively disputing assertions that it is using an artificial intelligence system for a targeted killing program that tolerates civilian deaths as acceptable collateral damage in its war against Hamas. 

Explosive allegations that Israel has a secret AI-powered killing machine called “Lavender” spread on Wednesday in a pair of news reports citing anonymous intelligence sources involved in the Hamas-Israel war.  

The Israel Defense Forces said Wednesday evening that it does not use AI to designate people as targets for military strikes. 

“Contrary to claims, the IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist,” the IDF said in a statement published to its website. “Information systems are merely tools for analysts in the target identification process.”

The IDF said its analysts are ordered to conduct independent examinations verifying that targets for strikes comply with international law and with additional restrictions the military imposes. 

Reports from +972 Magazine and the Guardian portrayed Israel as allowing the AI system to supplant human analysts’ judgment in the rush to strike back at Hamas soon after the October 7 attack on Israel. 

+972 Magazine’s report, written by Yuval Abraham, said the Israeli army authorized the killing of more than 100 civilians in pursuit of a single target deemed to be a senior Hamas official. 

“During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender’s kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based,” Mr. Abraham wrote. “One source stated that human personnel often served only as a ‘rubber stamp’ for the machine’s decisions, adding that, normally, they would personally devote only about ‘20 seconds’ to each target before authorizing a bombing.”

Mr. Abraham said his reporting relied on six unnamed Israeli intelligence officers with firsthand involvement in the use of AI for assassinations. The Guardian said it had access to the six officers’ accounts, which were provided to Mr. Abraham before +972 Magazine published its report. 

“Two sources said that during the early weeks of the war they were permitted to kill 15 or 20 civilians during airstrikes on low-ranking militants,” the Guardian said. “Attacks on such targets were typically carried out using unguided munitions known as ‘dumb bombs,’ the sources said, destroying entire homes and killing all their occupants.”

The IDF disputed these assertions on the social media platform X on Wednesday evening. 

Lt. Col. Peter Lerner, an IDF spokesman, decried “poor media ethics” in a series of posts disputing the reporting as false.

“NO Hamas individual was targeted with an expected 100 civilian casualties,” Lt. Col. Lerner posted. “NO Hamas individual was automatically approved for attack with an expected 15-20 casualties.”

Full details on the use of AI in the Hamas-Israel war may not emerge anytime soon, but the IDF has acknowledged some of its AI capabilities on its website. 

A November 2023 post on the IDF’s website said the military operates a target factory using an AI system dubbed the “Gospel.”

“This is a system that allows the use of automatic tools to produce targets at a fast pace, and works by improving accurate and high-quality intelligence material according to the requirement,” reads an English-language translation of the IDF’s website. “With the help of artificial intelligence, and through the rapid and automatic extraction of updated intelligence – it produces a recommendation for the researcher, with the goal being that there will be a complete match between the machine’s recommendation and the identification performed by a person.”

Mr. Abraham wrote Wednesday that while the Gospel marked buildings for attack, Lavender marked people for inclusion on a kill list. 

Mr. Abraham is not a dispassionate observer and has reported that the bombing has taken a personal toll on him. In a November 2023 report for +972 Magazine, he wrote that Israel had bombed the home of a close friend of his the previous month.

Other militaries use AI tools for targeting purposes. For example, U.S. Central Command told Bloomberg that AI tools helped narrow down its list of potential targets for strikes inside Iraq and Syria earlier this year.
