5 May 2026

AI nuclear decision making has a data problem

Ulysse Richard, Yorgo El Moubayed

Political and military leaders around the world are increasingly turning to artificial intelligence (AI) to improve the pace and quality of their decisions, including in scenarios that could lead to nuclear war. Their technical staff should be quick to raise a structural constraint: AI-enabled decision-support systems are only as good as the computer models they rely on and, crucially, the data available to train those models. But gathering data for scenarios that must be avoided at all costs is no small task. Even if leveraging AI for nuclear decision making is a worthwhile pursuit, who can teach the machine about nuclear war?

Not history. Only two nuclear weapons have ever been used in war. Serious crises involving nuclear weapons number perhaps several dozen over eight decades. There is near-zero real-world data from which a machine could learn to manage a nuclear standoff. And yet AI is already being woven into the systems that filter intelligence, characterize attacks, and shape the information reaching the people responsible for making launch decisions.
