'Israel Was Defeated': The Collapse of its AI Algorithm-led Wonder-Weapon
Israeli journalist Yuval Abraham has written a detailed, multiply-sourced investigation detailing how Israeli forces have marked tens of thousands of Gazans as suspects for assassination using an AI targeting system.
💬 “Formally, the Lavender system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets. The sources told +972 and Local Call that during the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants — and their homes — for possible air strikes”.
💬 “During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender’s kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based. One source stated that human personnel often served only as a “rubber stamp” for the machine’s decisions, adding that, normally, they would personally devote only about “20 seconds” to each target before authorizing a bombing”.
■ ‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza (04/04/24)
