By Gladys Koh (26A01B)
For all of its devastation, war has a way of forcing invention.
A century ago, when the first tanks rolled over the mire of the First World War, soldiers recoiled with horror. They called them “metal monsters”—unfamiliar machines grinding through the trenches like something half alive. While some soldiers fled amidst a spray of successive gunfire, others stood frozen, trapped in the shadow of impending doom.

But the scene before our eyes has shifted today.
The mud and trenches of the Western Front have given way to a quieter, more abstract theatre of war. Strain your ears, and beyond the crack of explosions and the roar of artillery, there is something else—the faint mechanical buzz of unmanned drones overhead, the low hum of a system in motion.
A new player has entered the field.
The rise of AI warfare
In recent weeks, disclosures have brought to light the US military’s use of Anthropic’s AI in autonomous weapons systems. This comes months after the development of Israel’s Lavender AI, designed to accelerate the “kill chain”—the sequential process through which targets are identified, selected and eliminated.
The promise is familiar: speed, accuracy, and reduced human error. The acceleration of data processing and decision-making marks a sharp departure from earlier, labour-intensive forms of warfare reliant on human input.
Historical theories suggest that such a transformation would fundamentally alter the very nature of conflict. Prussian military analyst Carl von Clausewitz described warfare as a “realm of uncertainty” shrouded in friction, producing what he called a “fog of war”—a condition of unpredictability that impedes decision-making. Today, AI ostensibly promises to lift this fog.
Yet such clarity may prove illusory. In compressing decision-making into seconds, AI risks not eliminating uncertainty but merely displacing it. As judgment is embedded within opaque systems, the reasoning behind acts of violence becomes harder to discern and easier to obscure.
In the ongoing Israel-Hamas conflict, sources have claimed that the Israeli army relied almost entirely on Lavender, which allegedly identified as many as 37,000 Palestinians as suspected militants for possible air strikes. Despite an accuracy rate of just 90% in generating these “kill lists”, “sweeping approval” was granted for soldiers to act on them.
It is increasingly evident that violence has become mediated by machines. Human presence in typical kill chains has dwindled and been rendered almost imperceptible. But when kill lists are simply entrusted to AI, a pressing question arises: who assumes accountability for the bloodshed?

AI is not neutral. Neither is it infallible.
The data it is trained on is shaped by human biases and priorities, which are in turn reproduced and amplified. No longer merely assessing risk, the machine actively decides who is deemed suspicious enough to be acted upon. The result is a form of moral outsourcing, in which responsibility is blurred under the guise of objectivity.
War at a distance
Limited oversight further compounds this danger. With minimal human verification of these “kill lists,” officers may gradually defer to AI outputs, treating them as authoritative and impartial by default. This reliance inadvertently enables soldiers to bypass moral friction—and in doing so, widens the distance between violence and a sense of culpability. Even in the execution of duties deemed necessary or paramount, delegating lethal decisions to AI reduces the sanctity of life to procedural compliance.
And yet this distance does not, and cannot, change the nature of violence. Violence does not become less violent because it is automated. Replacing human presence with algorithms does not lessen the moral weight of a life. A comforting sense of detachment does not grant absolution. Ultimately, what we surrender is not just judgment, but the ability to recognise humanity at all.
“[Lavender] gives almost every single person in Gaza a rating from 1 to 100, expressing how likely it is that they are a militant.”
+972 Magazine
This erosion of judgment is already visible in the systematic targeting of individuals. In the name of efficiency, accounts suggest that civilians bearing no link to Hamas’ military wing were nonetheless reduced to faceless entries on the kill list. Analytical distinctions, once prized as the epitome of human judgement, have been supplanted by hard-coded rules that assign every civilian a risk score. What is heralded as a quicker, more accurate solution instead forces identity into abstract categories. In the end, such a paradigm reduces all civilians to nothing more than calculable variables.
When a conflict is rooted in religious intolerance and irreconcilable territorial claims, such abstraction has severe ramifications. On a human level, grassroots avenues for reconciliation slip further out of grasp. Lavender collapses a myriad of identities into that of “terrorist”, ossifying the archaic binary of the Self and Other. Denied the agency to reclaim their own narratives, those reduced to abstract data points are further dehumanised. Prejudices harden, animosities entrench. To this end, reliance on such a system only fuels a self-sustaining, vicious cycle in which conflict perpetuates itself.

War as it was, war as it will be
The era of autonomous warfare is already on the horizon. Over the fields of Ukraine and Iran, unmanned aerial vehicles intercept oncoming missiles. In the water, British minesweeping drones scour the Strait of Hormuz. States will not resist what promises an advantage. But as we continue to delegate violence to machines, human presence on the battlefield will increasingly fade into obscurity.
The Greek historian Herodotus once observed, “In peace, sons bury their fathers; in war, fathers bury their sons”. AI does not end war’s brutality, nor does it alter its unnaturalness. It only risks entrenching a world where violence is easier to wage yet far more difficult to humanise. What occurred a hundred years ago continues to repeat itself, albeit with shinier machines and a changed landscape.
The war to end all wars marred the world with the blood of millions, marking our first descent into the mechanisation of death. Yet, war then was still bound to the human body, remaining legible in its brutality. The grime of the trenches still forced humanity to confront its cost. Today, that confrontation grows ever more distant.
Continuing to blindly rely on technology risks surrendering our moral compass and further devaluing human life. A nation may win the war with superior machines, yet it might ultimately lose the battle of humanity.
And through it all, the poppies remember.


