
9 August 2025

Military AI Challenges Human Accountability


Artificial intelligence is no longer confined to code-stained labs or military contractors’ slideshows: it has become a regular presence on modern battlefields. In 2024, as Israeli analysts relied on tools like Gospel and Lavender to generate targeting lists, the Pentagon set out to deploy swarms of autonomous drones through its Replicator Initiative. Targeting algorithms (AI systems analyzing data to identify and prioritize military targets) now compress the decision cycle from days to minutes, sometimes seconds, fundamentally challenging the way law, ethics, and accountability operate in armed conflict. 

It was in response to these realities that, on December 24, 2024, the UN General Assembly adopted Resolution 79/239, affirming that international humanitarian law (IHL) applies “throughout all stages of the life-cycle of artificial intelligence in the military domain” and calling for appropriate safeguards to keep human judgment and control at the heart of military decision-making. But resolutions and declarations, while necessary, do not themselves restrain machines. The responsibility for lawful conduct must remain anchored in human actors: commanders, engineers, and political authorities. 

Algorithms, after all, have no legal personality; they cannot form intent, stand before a court, or bear the weight of tragedy or blame. This is why the real task for military commanders, policymakers, and legal advisers is about translating the timeless obligations of the laws of war into practices and workflows that keep the chain of accountability intact, even as machines accelerate the tempo of armed conflict beyond anything imagined by those who first wrote those rules.

The question, then, is whether states are willing and able to build safeguards so that, even as decisions speed up and control becomes diffuse, a human being remains at the end of every algorithmic chain of action.

Ask three officers to describe what counts as AI in uniform, and you will likely hear three different answers. One will mention software that sorts satellite imagery, another will point to a drone that selects its own flight path, and a third may describe a logistics program that determines which convoy moves first. All of them are correct, because military AI is a broad spectrum of software-enabled capabilities that touch nearly every corner of modern operations.
