
Israel accused of using AI to target thousands in Gaza, as killer algorithms outpace international law

  • Written by Natasha Karner, PhD Candidate, International Studies, RMIT University

The Israeli army used a new artificial intelligence (AI) system to generate lists of tens of thousands of human targets for potential airstrikes in Gaza, according to a report[1] published last week. The report comes from the nonprofit outlet +972 Magazine, which is run by Israeli and Palestinian journalists.

The report cites interviews with six unnamed sources in Israeli intelligence. The sources claim the system, known as Lavender, was used with other AI systems to target and assassinate suspected militants – many in their own homes – causing large numbers of civilian casualties.

According to another report in the Guardian, based on the same sources as the +972 report, one intelligence officer said[2] the system “made it easier” to carry out large numbers of strikes, because “the machine did it coldly”.

As militaries around the world race to use AI, these reports show us what it may look like: machine-speed warfare with limited accuracy and little human oversight, with a high cost for civilians.

Military AI in Gaza is not new

The Israeli Defence Force denies many of the claims in these reports. In a statement to the Guardian[3], it said it “does not use an artificial intelligence system that identifies terrorist operatives”. It said Lavender is not an AI system but “simply a database whose purpose is to cross-reference intelligence sources”.

But in 2021, the Jerusalem Post reported an intelligence official saying Israel had just won its first “AI war[4]” – an earlier conflict with Hamas – using a number of machine learning systems to sift through data and produce targets. In the same year a book called The Human–Machine Team[5], which outlined a vision of AI-powered warfare, was published under a pseudonym by an author recently revealed[6] to be the head of a key Israeli clandestine intelligence unit.

Last year, another +972 report[7] said Israel also uses an AI system called Habsora to identify potential militant buildings and facilities to bomb. According to the report, Habsora generates targets “almost automatically”, and one former intelligence officer described it as “a mass assassination factory”.

Read more: Israel's AI can produce 100 bombing targets a day in Gaza. Is this the future of war?[8]

The recent +972 report[9] also claims a third system, called Where’s Daddy?, monitors targets identified by Lavender and alerts the military when they return home, often to their family.

Death by algorithm

Several countries are turning to algorithms in search of a military edge. The US military’s Project Maven supplies AI targeting[10] that has been used in the Middle East and Ukraine. China too is rushing to develop AI systems[11] to analyse data, select targets, and aid in decision-making.

Proponents of military AI argue[12] it will enable faster decision-making, greater accuracy and reduced casualties in warfare.

Yet last year, Middle East Eye reported[13] an Israeli intelligence officer said having a human review every AI-generated target in Gaza was “not feasible at all”. Another source told +972[14] they would personally “invest 20 seconds for each target”, acting as little more than a “rubber stamp” of approval.

The Israeli Defence Force response to the most recent report says[15] “analysts must conduct independent examinations, in which they verify that the identified targets meet the relevant definitions in accordance with international law”.

A satellite photo showing a built-up area in which many buildings are damaged and destroyed.
Israel’s bombing campaign has taken a heavy toll on Gaza. Maxar Technologies / AAP[16]

As for accuracy, the latest +972 report claims[17] Lavender automates the process of identifying and cross-checking whether a potential target is a senior Hamas military figure. According to the report, the targeting criteria were loosened to include lower-ranking personnel and weaker standards of evidence, and Lavender made errors in “approximately 10% of cases”.

The report also claims one Israeli intelligence officer said that due to the Where’s Daddy? system, targets would be bombed in their homes “without hesitation, as a first option”, leading to civilian casualties. The Israeli army says[18] it “outright rejects the claim regarding any policy to kill tens of thousands of people in their homes”.

Rules for military AI?

As military use of AI becomes more common, ethical, moral and legal concerns have largely been an afterthought. There are so far no clear, universally accepted or legally binding rules about military AI.

The United Nations has been discussing “lethal autonomous weapons systems” for more than ten years. These are devices that can make targeting and firing decisions without human input, sometimes known as “killer robots”. Last year saw some progress.

Read more: US military plans to unleash thousands of autonomous war robots over next two years[19]

The UN General Assembly voted in favour of a new draft resolution to ensure[20] algorithms “must not be in full control of decisions involving killing”. Last October, the US also released[21] a declaration on the responsible military use of AI and autonomy, which has since been endorsed by 50 other states. The first summit[22] on the responsible use of military AI was held last year, too, co-hosted by the Netherlands and the Republic of Korea.

Overall, international rules over the use of military AI are struggling to keep pace with the fervour of states and arms companies for high-tech, AI-enabled warfare.

Facing the ‘unknown’

Some Israeli startups that make AI-enabled products are reportedly making a selling point[23] of their use in Gaza. Yet reporting on the use of AI systems in Gaza suggests AI falls far short of the dream of precision warfare, instead creating serious humanitarian harms.

The industrial scale at which AI systems like Lavender can generate targets also effectively “displaces humans by default[24]” in decision-making.

The willingness to accept AI suggestions with barely any human scrutiny also widens the scope of potential targets, inflicting greater harm.

Setting a precedent

The reports on Lavender and Habsora show us what current military AI is already capable of doing. The risks of military AI may grow even further as the technology develops.

Chinese military analyst Chen Hanghui has envisioned a future “battlefield singularity[25]”, for example, in which machines make decisions and take actions at a pace too fast for a human to follow. In this scenario, we are left as little more than spectators or casualties.

A study published[26] earlier this year sounded another warning note. US researchers carried out an experiment in which large language models such as GPT-4 played the role of nations in a wargaming exercise. The models almost inevitably became trapped in arms races and escalated conflict in unpredictable ways, including using nuclear weapons.

The way the world reacts to current uses of military AI – like we are seeing in Gaza – is likely to set a precedent for the future development and use of the technology.

Read more: The defence review fails to address the third revolution in warfare: artificial intelligence[27]

References

  1. ^ report (www.972mag.com)
  2. ^ said (www.theguardian.com)
  3. ^ statement to the Guardian (www.idf.il)
  4. ^ AI war (www.jpost.com)
  5. ^ The Human–Machine Team (www.amazon.com.au)
  6. ^ recently revealed (www.theguardian.com)
  7. ^ another +972 report (www.972mag.com)
  8. ^ Israel's AI can produce 100 bombing targets a day in Gaza. Is this the future of war? (theconversation.com)
  9. ^ recent +972 report (www.972mag.com)
  10. ^ AI targeting (www.bloomberg.com)
  11. ^ develop AI systems (cepa.org)
  12. ^ argue (www.defense.gov)
  13. ^ reported (www.middleeasteye.net)
  14. ^ told +972 (www.972mag.com)
  15. ^ says (www.idf.il)
  16. ^ Maxar Technologies / AAP (photos.aap.com.au)
  17. ^ +972 report claims (www.972mag.com)
  18. ^ says (www.idf.il)
  19. ^ US military plans to unleash thousands of autonomous war robots over next two years (theconversation.com)
  20. ^ to ensure (press.un.org)
  21. ^ released (www.state.gov)
  22. ^ first summit (reaim2023.org)
  23. ^ making a selling point (asia.nikkei.com)
  24. ^ displaces humans by default (opiniojuris.org)
  25. ^ battlefield singularity (s3.amazonaws.com)
  26. ^ study published (arxiv.org)
  27. ^ The defence review fails to address the third revolution in warfare: artificial intelligence (theconversation.com)

Read more https://theconversation.com/israel-accused-of-using-ai-to-target-thousands-in-gaza-as-killer-algorithms-outpace-international-law-227453
