On April 3, Israeli journalist Yuval Abraham published an extensive report in the online news outlet +972 Magazine on Israel's use of an automated target-identification system in Gaza, nicknamed "Lavender." The report detailed how Israel delegated the formulation of "kill lists" to this artificial intelligence program, including for lower-level "suspected" Hamas militants at home with their families, with human soldiers largely rubber-stamping the strikes.
Against the backdrop of the widespread destruction and high civilian toll of Israel's invasion of Gaza, the report immediately provoked concern among officials in the U.S. and elsewhere. White House spokesperson John Kirby stated that the U.S. was reviewing the report closely. United Nations Secretary-General António Guterres declared he was deeply troubled, stating, "No part of life and death decisions which impact entire families should be delegated to the cold calculation of algorithms." The report appeared on the heels of an incident in which seven Western aid workers in Gaza, including several former British military personnel, were targeted from the air, prompting hundreds of prominent British lawyers and judges to urge their government to suspend arms sales to Israel to avert "complicity in grave breaches of international law."
Activists and entrepreneurs concerned with the increasing use of weaponized AI in armed conflict have been particularly seized by Abraham's reporting. The Campaign to Stop Killer Robots, which since 2012 has campaigned for a red-line treaty ban on delegating lethal targeting decisions to fully autonomous weapons systems, issued a statement immediately upon the report's publication.