
OVER-RELIANCE ON ARMED DRONES IS A SIGNIFICANT STRATEGIC MISTAKE

June 27, 2014

Three Common Misperceptions About Drones

Dr. Janine Davidson, a Senior Fellow at the Council on Foreign Relations, has an article on DefenseOne.com with the title above. Dr. Davidson was a member of a Stimson Center panel, headed by former USCENTCOM Commander Gen. (ret.) John Abizaid and former Pentagon official Rosa Brooks, that released a report yesterday (June 26, 2014) on the U.S. employment of armed drones. Dr. Davidson notes that the nearly year-long examination looked at “three key sets of issues revolving around the use of unmanned aerial vehicles (UAVs): 1) defense utility, national security, and economics; 2) ethics and law; and 3) export controls and regulatory challenges.” She adds that the report “identified UAV misconceptions; areas of concern; and a few concrete ways” to improve overall U.S. strategy with respect to autonomous systems. A copy of the report is attached.

Common Misperceptions/Myths Regarding The Use Of Drones

UAVs Do Not “Cause” Disproportionately High Casualties

“Contrary to popular belief,” the panel notes, “armed UAVs are precision platforms: their weapons go where they’re directed. Collateral damage, therefore, is due to the high-risk mission set to which UAVs are assigned, not a consequence of the platform itself. Manned aircraft have similar vulnerabilities.”

UAVs Are Not Inherently Cheaper Than Manned Aircraft

“The tail created by UAV personnel is considerable, but rarely factored into the cost of the platform. Significantly, the higher cost of manned aircraft also reflects greater capability. There are many things UAVs can do more cheaply, but [there are some] significant functions that they can’t perform at all. Fundamentally, it remains an ‘apples to oranges’ comparison,” the panel concluded.

Most UAVs Aren’t Weaponized

“The Department of Defense (DoD) currently operates about 8,000 UAVs. Fewer than one percent of that total [actually] carry operational weapons at any given time. The typical UAV mission remains intelligence, surveillance, and reconnaissance (ISR), not combat.”

Meanwhile, There Are Some Significant Areas Of Concern

Continuing Advances In UAV Technology

“As technology advances, U.S. policy-makers will increasingly face the vexing question of robotic autonomy in wartime theaters. They will need to tighten export controls without undermining innovation. Perhaps most significantly, they will be increasingly tempted to use UAVs as an instrument of force as they become easier and easier to employ, without [directly] risking American lives.”

Targeted Strikes And Strategic Risk

“Targeted killings remain a questionable pillar of overall U.S. counterterrorism strategy. Their strategic utility is often unclear, while frequent cross-border strikes erode national sovereignty [and] might even be counterproductive in the long term. This is to say nothing of the terrible blowback incurred by strikes that cause collateral damage.”

Basic Legal And Ethical Issues

“The lack of governmental transparency in UAV employment remains a deeply troubling phenomenon, extending even to basic information as to why individuals are targeted. The United States’ wide-ranging use of targeted killings also flies in the face of international law, and sets a precedent other nations might one day follow (and not to our benefit).”

Dr. Davidson notes that the panel concluded that UAVs should ultimately be “neither glorified, nor demonized.”

Among the Panel’s recommendations:


– Continue the transfer of general UAV responsibility from the CIA to the uniformed services. At best, parallel CIA and military UAV programs are duplicative and inefficient. At worst, they complicate oversight and increase the chances of error, owing to differing standards and requirements. Lethal UAV strikes should be managed through a single, integrated system.

– Improve transparency in targeted UAV strikes. While secrecy may be required before individual strikes, the strikes must be acknowledged and disclosed after the fact. A broad, secret, multi-year UAV strike program runs counter to American values and the democratic rule of law.

– Conduct a strategic review of the use of lethal UAVs in targeted strikes. This issue should be further developed through an interagency strategic review evaluating the costs and benefits of the issues identified here (and the many more in the actual report).

Dr. Davidson concluded: “There are many more misconceptions, concerns, and recommendations identified in the full report. This review comes on the heels of another excellent study put out by the Council on Foreign Relations’ Micah Zenko and Sarah Kreps. The issue of targeted UAV strikes is timely and important, and will only grow larger as time goes on.”

Some Observations

The U.S. employment of armed drones has troubled me for some time. There is no doubt a time and a place for targeted killing, and there is little doubt that such strikes have made Americans safer, both here and abroad. But that safety has come at a high price. As Mark Mazzetti wrote in the June 26, 2014 New York Times, the Obama administration’s embrace of targeted killings using armed drones “risks putting the United States on a ‘slippery slope’ into ‘perpetual war,’ and sets a dangerous precedent” for similar kinds of lethal operations that other countries may soon adopt. The Stimson panel also noted that “more than a decade into the era of [modern] armed drones, the U.S. Government has yet to carry out a thorough analysis of whether the costs of routine, secret killing operations outweigh the benefits.”

And autonomous-system technology is advancing, and proliferating, rapidly; within the next ten years, nation-states, militant groups, and others without drones will be the exception.

If you have read my previous articles on the use of drones and autonomous systems on the battlefield, you are likely aware of my deep concerns about the potential for over-reliance, as well as their sheer ease of use. The decision to use military force should always be hard. The increased use of drones and robots on the “battlefield” has, in my judgment, made the decision for military intervention easier, as in Libya during Qaddafi’s fall, and in the war on terror. Autonomous warfare is, to a large degree, antiseptic and distant; it provides a false sense of security and disguises the true “costs” (emotional, financial, human, etc.) that war imposes. It erodes the warrior ethos and makes us strategically lazy. A decision to engage militarily is easier to reach if the calculation is that the cost in “men” is virtually zero.

As Linda Palermo wrote in First Post, in “The Danger of Drones – And A War Without Risk”: “as with the concept of moral hazard in financial markets (i.e., if you know you are going to be bailed out, your choice of investments will be impaired), knowing that your tactics will have an actual effect on your soldiers will change your military decisions.”

Ms. Palermo asks: if we remove the risk of loss from decision-makers’ calculations when considering crisis-management options, do we make military intervention “easier,” and more attractive? Will decision-makers resort to war as a policy option far sooner than in the past? Rick Fisher, a Senior Fellow at the International Assessment and Strategy Center in London, suggests they might: “Perhaps in the future, China might be tempted to shoot down a U.S. Global Hawk unmanned surveillance aircraft flying over the disputed island chain in the South China Sea, because it might calculate that Washington would not escalate a response over a drone.”

Peter Singer, author of “Wired For War,” recently wrote that “when historians look back on this period, they may conclude that we are today at the start of the greatest revolution that warfare has seen since the introduction of the atomic bomb.” Our new unmanned systems don’t just affect the how of warfighting; they also affect the who of the fighting at the most fundamental level. Every previous revolution in war was about weapons that could shoot quicker or had a bigger boom. “Humankind,” Singer says, “is starting to lose its 5,000-year-old monopoly on the fighting of war.”

If Moore’s Law is applied to the current state of “warrior robots,” these machines will be one billion times more powerful twenty-five years from now, argues Mr. Singer.
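To put a rough number on that claim (a back-of-the-envelope sketch of my own, not Mr. Singer’s arithmetic): a billion-fold increase over twenty-five years implies about thirty doublings, or a doubling roughly every ten months, a cadence even faster than the classic eighteen-to-twenty-four-month Moore’s Law cycle:

\[
2^{n} = 10^{9} \;\Longrightarrow\; n = \frac{9}{\log_{10} 2} \approx 30 \text{ doublings}, \qquad \frac{25 \times 12\ \text{months}}{30} \approx 10\ \text{months per doubling}.
\]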

And the idea that we will always hold the technological advantage in this area is fatuous. Mr. Singer warns that it was the British and the French who invented the battlefield tank, but it was the Germans who understood how to employ it more lethally and strategically. He adds that it was “the Chinese and the Turks who first used gunpowder,” but it was the Europeans who revolutionized its battlefield (land and sea) use. A Jihadi website now offers instructions on how to detonate an IED in Afghanistan from one’s home computer. As Albert Einstein once said, imagination is more important than knowledge. Technical or capability surprise is, more often than not, not the most important game-changer on the battlefield. Rather, we are much more vulnerable to missing the clever and creative ways that technologies we understand very well can be used in ways we don’t understand very well, or failed to anticipate. Remember: the adversary gets a vote.

Robots and autonomous systems certainly have their place in modern warfare. But we must be wary of fighting from the “comforts” of a military base in Nevada. As Mr. Singer is fond of saying, “we are watching more, but experiencing less.”

Mr. Singer cites the late, great science fiction writer Arthur C. Clarke’s story “Superiority,” set in the distant future, as an example of how our embrace of robots and technology on the battlefield can end in defeat. In the story, a far more technologically advanced force is defeated by a technologically challenged opponent. As a military officer from the more advanced force sits in a prison cell contemplating his country’s impending defeat, he remarks, “we were defeated by one thing only — by the inferior science of our enemies.” The officer goes on to explain that “our side was seduced by the possibilities of new technology. We planned for how we wanted war to be, rather than how it turned out.” A fatal violation of one of Clausewitz’s key principles of war.

Mr. Singer presciently concludes: “the robotics trend is revolutionary, but it also doesn’t change the underlying fundamentals of war. The fog of war remains. While you may have Moore’s Law, you can’t get rid of Murphy’s Law.”

Over-Reliance On Armed Drones And Autonomous Systems Is A Significant Strategic Mistake

Budgetary constraints, coupled with a measurable strategic shift toward armed drones and autonomous systems, are coming at the expense of manned aircraft as well as ground forces: a significant strategic mistake. Too many of our defense leaders and others mistakenly believe that advances in drone technology and autonomous systems, coupled with the use of Special Operations Forces and good intelligence, will be enough to defeat most adversaries on the “battlefield” of the future. Indeed, drones offer the elixir of antiseptic engagement: limited collateral damage, limited casualties, and limited duration. But the adversary gets a vote.

By elevating armed drones and autonomous systems at the apparent expense of our ground forces, we are undermining our ability, and our flexibility, to respond to future, unexpected and unanticipated types of military engagement. It is what cognitive scientists call “motivated error”: our defense leadership, members of Congress, and our national security establishment become bonded by a common, intense desire that a particular military approach to a gnarly problem be proven effective. The result is a great danger of highly unrealistic planning and assumptions, an unwillingness to learn from counterexamples, and an avoidance of developing any “Plan B” should “Plan A” fail.

There are, no doubt, powerful, converging political and defense motivations among some in the senior U.S. civilian and military leadership to prepare for future conflicts by embracing a swift, technologically advanced, ground-force-light combat strategy. The Intelligence Community can warn the senior leadership when it thinks such faulty assumptions are diverging from what our intelligence is telling us; but fundamentally, the Intelligence Community cannot do anything when powerfully motivated senior leaders become wedded to a certain genre of technology (UAVs and autonomous systems) and find these distortions more attractive than dealing with reality.

The Romans used to say that if you want peace, prepare for war. It seems we are preparing for best-case scenarios and the war “we want,” as opposed to the one we are more likely to get. As the line often attributed to Plato goes, “only the dead have seen the end of war.” And as Stephen King, the great horror-thriller writer, once wrote, “God punishes us for what we cannot imagine.”
