- MAJDAL SHAMS
- July 28th 2024
FOR OVER a decade military experts, lawyers and ethicists have grappled with the question of how to control lethal autonomous weapon systems, sometimes pejoratively called killer robots. One answer was to keep a "man in the loop": to ensure that a human always approved each decision to use lethal force. But in 2016 Heather Roff and Richard Moyes, then writing for Article 36, a non-profit focused on the issue, cautioned that a person "simply pressing a 'fire' button in response to indications from a computer, without cognitive clarity or awareness", does not meaningfully qualify as "human control".

That nightmarish vision of war, with humans ostensibly in control but shorn of real understanding of their actions, killing in rote fashion, seems to have come to pass in Gaza. This is the message of two reports published by a left-wing Israeli news outlet, the most recent on April 3rd. The Israel Defence Forces (IDF) have reportedly developed artificial-intelligence (AI) tools known as "The Gospel" and "Lavender" to "mark" suspected operatives of Hamas and Palestinian Islamic Jihad, two militant groups, as targets for bombing, according to Israeli officers familiar with the systems.