29 October 2019

War Is Not Over

By Tanisha M. Fazal and Paul Poast 

The political turmoil of recent years has largely disabused us of the notion that the world has reached some sort of utopian “end of history.” And yet it can still seem that ours is an unprecedented era of peace and progress. On the whole, humans today are living safer and more prosperous lives than their ancestors did. They suffer less cruelty and arbitrary violence. Above all, they seem far less likely to go to war. The incidence of war has been decreasing steadily, a growing consensus holds, with war between great powers becoming all but unthinkable and all types of war becoming more and more rare.

This optimistic narrative has influential backers in academia and politics. At the start of this decade, the Harvard psychologist Steven Pinker devoted a voluminous book, The Better Angels of Our Nature, to the decrease of war and violence in modern times. Statistic after statistic pointed to the same conclusion: looked at from a high enough vantage point, violence is in decline after centuries of carnage, reshaping every aspect of our lives “from the waging of wars to the spanking of children.”


Pinker is not alone. “Our international order,” U.S. President Barack Obama told the United Nations in 2016, “has been so successful that we take it as a given that great powers no longer fight world wars, that the end of the Cold War lifted the shadow of nuclear Armageddon, that the battlefields of Europe have been replaced by peaceful union.” At the time of this writing, even the Syrian civil war is winding down. There have been talks to end the nearly two decades of war in Afghanistan. A landmark prisoner swap between Russia and Ukraine has revived hopes of a peace agreement between the two. The better angels of our nature seem to be winning.

If this sounds too good to be true, it probably is. Such optimism is built on shaky foundations. The idea that humanity is past the era of war is based on flawed measures of war and peace; if anything, the right indicators point to the worrying opposite conclusion. And the anarchic nature of international politics means that the possibility of another major conflagration is ever present.
BODY COUNTS

The notion that war is in terminal decline is based, at its core, on two insights. First, far fewer people die in battle nowadays than in the past, both in absolute terms and as a percentage of the world population. Experts at the Peace Research Institute Oslo pointed this out in 2005, but it was Pinker who introduced the point to a wider audience in his 2011 book. Reviewing centuries of statistics on war fatalities, he argued that not only is war between states on the decline; so are civil wars, genocides, and terrorism. He attributes this fall to the rise of democracy, trade, and a general belief that war has become illegitimate.

Then there is the fact that there has not been a world war since 1945. “The world is now in the endgame of a five-century-long trajectory toward permanent peace and prosperity,” the political scientist Michael Mousseau wrote in an article in International Security earlier this year. The political scientist Joshua Goldstein and the legal scholars Oona Hathaway and Scott Shapiro have also argued as much, tying the decline of interstate war and conquest to the expansion of market economies, the advent of peacekeeping, and international agreements outlawing wars of aggression.

Taken together, these two points—fewer and fewer battle deaths and no more continent-spanning wars—form a picture of a world increasingly at peace. Unfortunately, both rest on faulty statistics and distort our understanding of what counts as war.

War has not become any less prevalent; it has only become less lethal.

To begin with, relying on body counts to determine if armed conflict is decreasing is highly problematic. Dramatic improvements in military medicine have lowered the risk of dying in battle by leaps and bounds, even in high-intensity fighting. For centuries, the ratio of those wounded to those killed in battle held steady at three to one; the wounded-to-killed ratio for the U.S. military today is closer to ten to one. Many other militaries have seen similar increases, meaning that today’s soldiers are far more likely to wind up injured than dead. That historical trend undermines the validity of most existing counts of war and, by extension, belies the argument that war has become a rare occurrence. Although reliable statistics on the war wounded for all countries at war are hard to come by, our best projections cut by half the decline in war casualties that Pinker has posited. What’s more, to focus only on the dead means ignoring war’s massive costs both for the wounded themselves and for the societies that have to care for them.

Consider one of the most widely used databases of armed conflict: that of the Correlates of War project. Since its founding in the 1960s, COW has required that to be considered a war, a conflict must generate a minimum of 1,000 battle-related fatalities among all the organized armed actors involved. Over the two centuries of war that COW covers, however, medical advances have drastically changed who lives and who dies in battle. Paintings of wounded military personnel being carried away on stretchers have given way to photographs of medevac helicopters that can transfer the wounded to a medical facility in under one hour—the “golden hour,” when the chances of survival are the highest. Once the wounded are on the operating table, antibiotics, antiseptics, blood typing, and the ability to transfuse patients all make surgeries far more likely to be successful today. Personal protective equipment has evolved, too. In the early nineteenth century, soldiers wore dress uniforms that were often cumbersome without affording any protection against gunshots or artillery. World War I saw the first proper helmets; flak jackets became common in the Vietnam War. Today, soldiers wear helmets that act as shields and radio sets in one. Over the course of the wars in Afghanistan and Iraq alone, medical improvements have decreased the number of deaths from improvised explosive devices and small-arms fire. As a result of these changes, many contemporary wars listed in COW’s database appear less intense. Some might not make it past COW’s fatality threshold and would therefore be excluded.
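The interaction between medical progress and COW's fatality threshold can be illustrated with rough arithmetic. The sketch below is hypothetical (the casualty figure is invented for illustration); it only shows how the same intensity of fighting can fall on either side of the 1,000-death cutoff depending on the wounded-to-killed ratio:

```python
def battle_deaths(total_casualties, wounded_to_killed_ratio):
    """Deaths implied by a total casualty count and a wounded-to-killed ratio."""
    return total_casualties / (wounded_to_killed_ratio + 1)

COW_THRESHOLD = 1000  # COW's minimum battle-related fatalities for a "war"

# Hypothetical conflict: 4,400 soldiers killed or wounded in combat.
historical = battle_deaths(4400, 3)   # 3:1 ratio -> 1,100 deaths
modern = battle_deaths(4400, 10)      # 10:1 ratio -> 400 deaths

print(historical >= COW_THRESHOLD)  # True: counted as a war
print(modern >= COW_THRESHOLD)      # False: same fighting, excluded from the count
```

On these assumed numbers, identical violence registers as a war under the historical ratio but vanishes from the dataset under the modern one.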

Better sanitation has left its mark, too, especially improvements in cleanliness, food distribution, and water purification. During the American Civil War, physicians often failed to wash their hands and instruments between patients. Today’s doctors know about germs and proper hygiene. A six-week campaign during the Spanish-American War of 1898 led to just 293 casualties, fatal and nonfatal, from fighting but a staggering 3,681 from various illnesses. This was no outlier. In the Russo-Turkish War of 1877–78, nearly 80 percent of the deaths were caused by disease. Because counting and categorizing casualties in a war is notoriously difficult, these statistics should be taken with a grain of salt, but they illustrate a broader point: as sanitation has improved, so has the survivability of war. The health of soldiers also skews battle deaths, since ill soldiers are more likely to die in battle than healthier soldiers. And military units fighting at their full complement will have higher survival rates than those decimated by disease.

A U.S. Army field hospital in France, 1918 (Wikimedia Commons)

Moreover, some of the advances that have made modern war less deadly, although no less violent, are more reversible than they seem. Many depend on the ability to quickly fly the wounded to a hospital. For the U.S. military, doing so was possible in the asymmetric conflicts against insurgents in Afghanistan and Iraq, where the United States had almost total control of the skies. In a great-power war, however, airpower would be distributed much more equally, limiting both sides’ ability to evacuate their wounded via air. Even a conflict between the United States and North Korea would severely test U.S. medevac capabilities, shifting more casualties from the “nonfatal” to the “fatal” column. And a great-power war could involve chemical, biological, radiological, or nuclear weapons, which have been used so rarely that there are no good medical models for treating their victims.

Skeptics may point out that most wars since World War II have been civil wars, whose parties might not actually have had access to sophisticated medical facilities and procedures—meaning that the decline in casualties is real rather than an artifact of measurement. Although this is true for many rebel groups, civil wars also typically involve state militaries, which do invest in modern military medicine. And the proliferation of aid and development organizations since 1945 has made many of these advances available, at least to some extent, to civilian populations and insurgents. A foundational principle of humanitarian organizations such as the International Committee of the Red Cross is impartiality, meaning that they do not discriminate between civilians and combatants in giving aid. In addition, rebel groups often have external supporters who provide them with casualty-reducing equipment. (The United Kingdom, for example, shipped body armor to the insurgent Free Syrian Army at the start of the Syrian civil war.) As a result, even databases that include civil wars and use a much lower fatality threshold than COW, such as the widely referenced database of the Uppsala Conflict Data Program, may end up giving the erroneous impression that civil wars have become less prevalent when in fact they have become less lethal.

Collecting exact data on the injured in civil wars is admittedly difficult. As a recent report by the nongovernmental organization Action on Armed Violence argues, fewer resources for journalists and increased attacks on aid workers mean that those most likely to report on the wounded are less able to do so today than in the past, leading to a likely undercounting. Dubious statistics thus come out of conflicts such as the Syrian civil war, with media reports suggesting a wounded-to-killed ratio of nearly one to one since 2011. But common sense suggests that the real number of injuries is far higher.

Even if one ignores these trends and takes the existing databases at face value, the picture is still far from rosy. The Uppsala Conflict Data Program's tracker—which, again, may undercount conflict—shows that the number of active armed conflicts has been ticking up in recent years; in 2016, it reached its highest point since the end of World War II. And many of today's conflicts are lasting longer than past conflicts did. Recent spikes of violence in the Democratic Republic of the Congo, Mexico, and Yemen show few signs of abating.

To be sure, the decline of battle deaths, when considered on its own, is a major victory for human welfare. But that achievement is reversible. As the political scientist Bear Braumoeller pointed out in his book Only the Dead, the wars of recent decades may have remained relatively small in size, but there is little reason to expect that trend to continue indefinitely. One need only recall that in the years preceding World War I, Europe was presumed to be in a “long peace.” Neither brief flashes of hostility between European powers, such as the standoff between French and German forces in Morocco in 1911, nor the Balkan Wars of 1912 and 1913 could dispel this notion. Yet these small conflicts turned out to be harbingers of a much more devastating conflagration.

Today, the long shadow of nuclear weapons ostensibly keeps that scenario from repeating. Humanity has stockpiles of nuclear warheads that could wipe out billions of lives, and that terrifying fact, many argue, has kept great-power clashes from boiling over into all-out wars. But the idea that military technology has so altered the dynamics of conflict as to make war inconceivable is not new. In the 1899 book Is War Now Impossible?, the Polish financier and military theorist Jan Gotlib Bloch posited that “the improved deadliness of weapons” meant that “before long you will see they will never fight at all.” And in 1938—just a year before Hitler invaded Poland, and several years before nuclear technology was considered feasible—the American peace advocate Lola Maverick Lloyd warned that “the new miracles of science and technology enable us at last to bring our world some measure of unity; if our generation does not use them for construction, they will be misused to destroy it and all its slowly-won civilization of the past in a new and terrible warfare.”

It may be that nuclear weapons truly have more deterrent potential than past military innovations—and yet these weapons have introduced new ways that states could stumble into a cataclysmic conflict. The United States, for example, keeps its missiles on a “launch on warning” status, meaning that it would launch its missiles on receiving word that an enemy nuclear attack was in progress. That approach is certainly safer than a policy of preemption (whereby the mere belief that an adversary’s strike was imminent would be enough to trigger a U.S. strike). But by keeping nuclear weapons ready to use at a moment’s notice, the current policy still creates the possibility of an accidental launch, perhaps driven by human error or a technical malfunction.
SMALL GREAT WARS

All in all, recent history does not point to a decline of war at large. But what about war between great powers? The historian John Lewis Gaddis famously referred to the post-1945 era as “the long peace.” Deterred by nuclear weapons and locked into a global network of international institutions, great powers have avoided a repeat of the carnage of the two world wars. When the European Union was awarded the Nobel Peace Prize in 2012, it was in part for this remarkable achievement.

We tend to view World Wars I and II as emblematic of war. They are not.

There has, indeed, not been a World War III. But that does not necessarily mean the age of great-power peace is here. In truth, the last century’s world wars are a poor yardstick, as they bore little resemblance to most of the great-power wars that preceded them. The 1859 Franco-Austrian War lasted less than three months; the 1866 Austro-Prussian War was a little over one month long. Each produced fewer than 50,000 battle deaths. Even the 1870–71 Franco-Prussian War, which paved the way for a unified German empire, lasted just six months and resulted in about 200,000 battle deaths. The world wars were orders of magnitude different from those conflicts. World War I was over four years long and produced some nine million battle deaths. World War II lasted six years and led to over 16 million battle deaths.

In other words, World Wars I and II have severely skewed our sense of what war is. Scholars and policymakers tend to view these conflicts as emblematic of war. They are not. Most wars are relatively short, lasting less than six months. They tend to result in 50 or fewer battle deaths per day—a number that pales in comparison to the figures produced during World War I (over 5,000 dead per day) and World War II (over 7,000 per day). In fact, if one excludes these two outliers, the rates of battle deaths from the mid-nineteenth century until 1914 are consistent with those in the decades since 1945.
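The per-day figures cited above follow from rough arithmetic on the death tolls and durations given earlier. This is a back-of-the-envelope check, not the authors' own calculation; the durations are approximations (World War I ran a little over four years, World War II six):

```python
# Approximate per-day battle-death rates implied by the article's figures.
ww1_rate = 9_000_000 / (4 * 365)     # ~6,200 deaths per day
ww2_rate = 16_000_000 / (6 * 365)    # ~7,300 deaths per day
typical_rate = 50                    # most wars: 50 or fewer battle deaths per day

# The world wars were two orders of magnitude deadlier per day than a typical war.
print(round(ww1_rate / typical_rate))  # ~123x
print(round(ww2_rate / typical_rate))  # ~146x
```

Both results sit comfortably above the "over 5,000" and "over 7,000" per-day figures in the text, underscoring how far outside the normal range the two world wars fall.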

There have, in fact, been a number of great-power wars since 1945. But they are rarely recognized as such because they did not look like the two world wars. They include the Korean War, in which the United States faced off against forces from China and the Soviet Union, and the Vietnam War, which also pitted the United States against Chinese forces. In both cases, major powers fought each other directly.

The list of recent great-power conflicts grows much longer if one includes instances of proxy warfare. From U.S. support for the mujahideen fighting Soviet forces in Afghanistan during the Cold War to the foreign rivalries playing out in Syria and Ukraine, major powers regularly fight one another using the military labor of others. Outsourcing manpower like this is no recent invention and is in fact a relatively normal feature of great-power war. Consider Napoleon’s march to Russia in 1812. The invasion is famous for the attrition suffered by the Grande Armée as it pushed east. Far less known is that despite its immense size of over 400,000 men, the force was largely not French. Foreign fighters, be they mercenaries or recruits from conquered territories, made up the overall majority of the troops that set off to invade Russia. (Many of them soon tired of marching in the summer heat and abandoned the coalition, shrinking Napoleon’s forces by more than half before he was even one-quarter of the way through the campaign.) Still, his reliance on foreign troops allowed Napoleon to place the burden of the fighting on non-French soldiers, and he reportedly told the Austrian statesman Klemens von Metternich that “the French cannot complain of me; to spare them, I have sacrificed the Germans and the Poles.”

Napoleon in Moscow, 1812, painting by Albrecht Adam (Wikimedia Commons)

Put simply, most violent conflicts, even among great powers, do not look like World War I or II. This is not at all to diminish the importance of those two wars. Understanding how they happened can help avoid future wars or at least limit their scale. But to determine if great-power war is in decline requires a clear conceptual understanding of what such a war is: one that recognizes that World War I and II were unparalleled in scale and scope but not the last instances of great-power conflict—far from it. The behavior of states has not necessarily improved. In truth, the apparent decline in the deadliness of war masks a great deal of belligerent behavior.
DON'T CELEBRATE TOO EARLY

The idea that war is increasingly a thing of the past is not just mistaken; it also enables a harmful brand of triumphalism. War’s ostensible decline does not mean that peace is breaking out. Certainly, the citizens of El Salvador, Guatemala, Honduras, and Venezuela would object to the notion that their countries are peaceful, even though none is technically at war. As the sociologist Johan Galtung has argued, true peace, or “positive peace,” must also contain elements of active engagement and cooperation, and although globalization since the end of the Cold War has linked disparate communities together, there have also been setbacks. Following the collapse of the Berlin Wall, there were fewer than ten border walls in the world. Today, there are over 70, from the fortified U.S.-Mexican border to the fences separating Hungary and Serbia and those between Botswana and Zimbabwe.

It strains credulity that the better angels of our nature are winning when humanity is armed to the teeth.

Even when ongoing wars do come to an end, caution is warranted. Consider civil wars, many of which now end in peace treaties. Some, such as the 2016 Colombian peace deal, are elaborate and ambitious documents that run over 300 pages long and go far beyond standard disarmament processes to address land reform, drug policy, and women’s rights. And yet civil wars that end with peace agreements tend to sink back into armed conflict sooner than those that end without them. Often, what looks to the international community as an orderly end to a conflict is just a means for the warring parties to retrench and regroup before fighting breaks out anew.

Likewise, it strains credulity that the better angels of our nature are winning when humanity is armed to the teeth. Global military expenditures are higher today than during the late Cold War era, even when adjusted for inflation. Given that countries haven’t laid down their arms, it may well be that today’s states are neither more civilized nor inherently peaceful but simply exercising effective deterrence. That raises the same specter as the existence of nuclear weapons: deterrence may hold, but there is a real possibility that it will fail.
FEAR IS GOOD

The greatest danger, however, lies not in a misplaced sense of progress but in complacency—what U.S. Supreme Court Justice Ruth Bader Ginsburg, in a different context, called “throwing away your umbrella in a rainstorm because you are not getting wet.” At a time of U.S.-Russian proxy wars in Syria and Ukraine, rising tensions between the United States and Iran, and an increasingly assertive China, underestimating the risk of future war could lead to fatal mistakes. New technologies, such as unmanned drones and cyberweapons, heighten this danger, as there is no consensus around how states should respond to their use.

Above all, overconfidence about the decline of war may lead states to underestimate how dangerously and quickly any clashes can escalate, with potentially disastrous consequences. It would not be the first time: the European powers that started World War I all set out to wage limited preventive wars, only to be locked into a regional conflagration. In fact, as the historian A. J. P. Taylor observed, “every war between Great Powers . . . started as a preventive war, not a war of conquest.”

A false sense of security could lead today’s leaders to repeat those mistakes. That danger is all the more present in an era of populist leaders who disregard expert advice from diplomats, intelligence communities, and scholars in favor of sound bites. The gutting of the U.S. State Department under President Donald Trump and Trump’s dismissive attitude toward the U.S. intelligence community are but two examples of a larger global trend. The long-term consequences of such behavior are likely to be profound. Repeated enough, the claim that war is in decline could become a self-defeating prophecy, as political leaders engage in bombastic rhetoric, military spectacles, and counterproductive wall building in ways that increase the risk of war.
