How do "death AIs" work?
The war in the Gaza Strip, waged by Israel against the Palestinians, has been described as genocide by several experts, including an independent UN commission of inquiry. What distinguishes the way attacks are carried out is the use of numerous AI-assisted surveillance systems that automate the classification of potential targets. The programs Lavender, The Gospel, and Where’s Daddy operate jointly to identify what Israel considers a threat and to propose strikes, accepting in advance the death of a significant number of civilians. These technologies are already embedded in institutional activity as a "Genocidal Surveillance Architecture" (GSA), a term coined by researchers Mais Qandeel of the University of Galway (Ireland) and Özgün Erdener Topak of York University (Canada), who published the article "Genocidal Surveillance Set in Palestine: A Socio-Legal Analysis" in the Journal of Genocide Research, where they examine these systems in depth.
The term "genocidal surveillance architecture" stems from critical surveillance studies. What sets it apart is that it refers to a state-coordinated project that uses all of the State's infrastructural power to unify systems aimed at destroying a racial or ethnic group. According to the authors, it marks a new phase of organized brutality: systematic, ideologically articulated, technologically effective, and administered by the State. It is a dangerous form of genocidal governance, rooted in Israel's colonial project, that sets a precedent for future conflicts.
The Lavender system is one of the engines of the GSA. Its function is to analyze mass surveillance data (social networks, phone records, intelligence intercepts, etc.) to assign every individual in the Gaza Strip a score indicating the probability that they are a militant and therefore a potential target of the war. The program is believed to have flagged approximately 37,000 Palestinians and continues to operate, pointing to a war without end. The Where’s Daddy system, in turn, tracks marked individuals to their homes so they can be attacked while with their families. The deaths that may result from these attacks, euphemistically called collateral damage, can reach twenty people for lower-value targets and up to one hundred for targets considered very important, such as high-ranking military commanders.
The Gospel, for its part, focuses on generating infrastructure targets. It identifies buildings and structures associated with suspects and operates with an emphasis on maximizing damage rather than precision. As a result, according to data gathered in the investigation, entire residential buildings, hospitals, and schools are destroyed. "Before, in Gaza we created fifty targets a year. And here the machine produced one hundred targets in a single day," in the words of Aviv Kochavi, former Chief of the General Staff of the Israel Defense Forces.
The automated use of these tools distances humans from decision-making and from accountability for the consequences of the war. Oversight of decisions is minimal or merely bureaucratic: reports from Israeli soldiers indicate that authorizing an AI-generated bombing can take less than twenty seconds, and one of the few criteria for human verification is whether the targets are men. The result is the convergence of two forces: decades of accumulated territorial surveillance combined with a state-of-the-art artificial intelligence layer.
In this context, large technology companies play a fundamental role, providing virtually unlimited infrastructure with AI capabilities such as facial recognition, object tracking, and sentiment analysis. Palantir Technologies has long supplied predictive policing software to Israel. Google and Amazon have been collaborating with the Israeli government on Project Nimbus since 2021. Microsoft is likewise integrated into the cloud services, monitoring, and data processing that make up this surveillance infrastructure.^1,2,3^
The study notes that the GSA represents a new stage in the modern process of cumulative bureaucratization of coercion. AI systems have made mass killing routine, turning it into a cold, statistical, data-driven process, as one Israeli soldier described it: "The machine does it coldly. And it makes everything easier." In this sense, the new surveillance system configures genocidal action, using target generation as a technique of power.
There is a historical context of Israeli settler colonialism in Palestine that both drives this project and uses the war to expand and refine it in the name of military efficiency. All of this sets a dangerous precedent, the authors emphasize: the GSA would represent a model of genocide for the 21st century. If state and corporate perpetrators are not held accountable, there is a high risk that this model will be exported and replicated in other contexts, deploying the most advanced surveillance technologies to commit atrocities like those observed in the Gaza Strip.^4,5,6^
The type of analysis in the article falls within the line of research that OplanoB classifies as "surveillance capitalism in the Global South," which focuses on partnerships between large technology companies and nation-states to explain how this phenomenon succeeds outside the capitalist centers and, in this particular case, uses war to generate profit.
Citations
1. AP exposes Big Tech AI systems' direct role in warfare amid Israel's war in Gaza - Business and Human Rights Centre
   https://www.business-humanrights.org/en/latest-news/ap-exposes-big-tech-ai-systems-direct-role-in-warfare-amid-israels-war-in-gaza/
2. AI for War: Big Tech Empowering Israel’s Crimes and Occupation - Al-Shabaka
   https://al-shabaka.org/briefs/ai-for-war-big-tech-empowering-israels-crimes-and-occupation/
3. AI in Israel's war on Gaza - Access Now
   https://www.accessnow.org/publication/artificial-genocidal-intelligence-israel-gaza/
4. Gaza: Israel’s AI Human Laboratory - The Cairo Review of Global Affairs
   https://www.thecairoreview.com/essays/gaza-israels-ai-human-laboratory/
5. ‘The lesson from Gaza is clear: when AI-powered machines control who lives, human rights die’ - CIVICUS LENS
   https://lens.civicus.org/interview/the-lesson-from-gaza-is-clear-when-ai-powered-machines-control-who-lives-human-rights-die/
6. The Genocide Will Be Automated—Israel, AI and the Future of War - MERIP
   https://www.merip.org/2024/10/the-genocide-will-be-automated-israel-ai-and-the-future-of-war/