Thursday, September 12, 2024

Update on AI and IHL in Gaza

Liam Stack, who spent many years as a Middle East correspondent based in Cairo, reports from Tel Aviv for The New York Times that "Israel Defends Strike on School Compound as Condemnation Mounts."


The article on AI, Distinction, Proportionality, and IHL in Gaza (https://tinyurl.com/ye27jwm8), posted Tuesday, May 28, 2024, argued that the Israel Defense Forces (IDF) have access to very sophisticated AI-supported tools, similar to those of the United States, that not only permit, but require, the strictest application of International Humanitarian Law (IHL) in the areas of distinction and proportionality. Failure to do so exposes Israeli authorities to war crime accusations and further isolation from traditional allies among Western democracies.


Simon Frankel Pratt, a lecturer in political science at the School of Social and Political Sciences, University of Melbourne, published an expert's-point-of-view argument in Foreign Policy (FP) entitled "When AI Decides Who Lives and Dies," contending that the Israeli military's algorithmic targeting has created dangerous new precedents. Investigative journalism published in April by the Israeli media outlet Local Call (and its English-language partner, +972 Magazine) shows that the Israeli military has established a mass assassination program of unprecedented size, blending algorithmic targeting with a high tolerance for bystander deaths and injuries.


Local Call and +972 Magazine have shown that the IDF may be criminally negligent in its willingness to strike targets when the risk of bystanders dying is very high, but because the targets selected by Lavender are ostensibly combatants, the IDF’s airstrikes are not intended to exterminate a civilian population. They have followed the so-called operational logic of targeted killing even if their execution has resembled saturation bombing in its effects.


Although Israel often presents the IDF as being in exemplary conformity with liberal and Western norms, the way the IDF has used AI in Gaza, according to Local Call and +972, stands in stark contrast to those norms. In U.S. military doctrine, all strikes must strive to keep bystander deaths below a predetermined "non-combatant casualty cut-off value" (NCV).
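The NCV functions as a simple numerical ceiling on anticipated bystander deaths. A minimal sketch of how such a cut-off operates is below; the function name, its signature, and the sample values are purely illustrative assumptions for explanation, not actual U.S. or Israeli doctrine or software.

```python
# Hypothetical sketch of a pre-strike check against a non-combatant
# casualty cut-off value (NCV). All names and numbers are illustrative
# assumptions, not real doctrine or any military's actual system.

def strike_permitted(expected_bystander_deaths: int, ncv: int) -> bool:
    """A strike clears this check only if the anticipated bystander
    toll stays below the cut-off value set for the operation."""
    return expected_bystander_deaths < ncv

# A low NCV bars almost any strike expected to harm bystanders;
# a high NCV permits far deadlier strikes against the same target.
print(strike_permitted(15, ncv=1))    # low cut-off: strike barred
print(strike_permitted(90, ncv=100))  # high cut-off: strike cleared
```

The point of the sketch is that the legal and ethical debate described in this article is not about the existence of such a threshold but about how high it is set and how reliably the casualty estimate feeding into it is produced.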



NCVs for most U.S. operations have been very low, and historically, so have Israel's—at least when it comes to targeted killing. For example, when Hamas commander Salah Shehadeh was killed along with 14 others in an Israeli airstrike in 2002, then-IDF Chief of Staff Moshe Yaalon said that he would not have allowed the operation to happen if he'd known it would kill that many others. In interviews over the years, other Israeli officials involved in the operation similarly stated that the high number of bystander deaths was a major error. (Pratt, n.d.)


Local Call and +972 revealed that, by contrast, the assassination of Hamas battalion commander Wissam Farhat during the current Israel-Hamas war had an NCV of more than 100 people—and that the IDF anticipated that it would kill around that many.


An Israeli intelligence source interviewed by +972 Magazine claimed that time constraints made it impossible to “incriminate” every target, which raised the IDF’s tolerance for the margin of statistical error from using AI-powered target recommendation systems—as well as its tolerance for the associated “collateral damage.”


This matters to experts in international law and military ethics because of the doctrine of double effect, which permits foreseeable but unintended harms if the intended act does not depend on those harms occurring, such as in the case of an airstrike against a legitimate target that would happen whether or not there were bystanders. But in the case of the Israel-Hamas war, most lawyers and ethicists—and apparently some number of IDF officers—see these strikes as failing to meet any reasonable standard of proportionality while stretching the notion of discrimination beyond reasonable interpretations. In other words, they may still be war crimes. (Pratt, n.d.)


In sum, the Israel Defense Forces have access to very sophisticated AI-supported tools, similar to those of the United States, that not only permit, but require, the strictest application of International Humanitarian Law in the areas of distinction and proportionality. Failure to do so exposes Israeli authorities to war crime accusations and further isolation from traditional allies among Western democracies.



References

Pratt, S. F. (n.d.). When AI decides who lives and dies. Foreign Policy.

Stack, L. (n.d.). Israel defends strike on school compound as condemnation mounts. The New York Times. Retrieved September 12, 2024, from https://www.nytimes.com/live/2024/09/12/world/israel-hamas-gaza-war


