10 April 2014

THE AUTONOMY QUESTION AND THE FUTURE BATTLEFIELD

April 7, 2014 · by Fortuna's Corner
The Autonomy Question: Where Should Humans Step Aside And Let Machines Take Over?


Rebecca Grant has an online article in the April 2014 Air Force Magazine with the title above. She writes that “remotely piloted aircraft such as the MQ-9 Reaper and the RQ-4 Global Hawk are manned by squadrons of pilots and sensor operators on the ground.” “Five or ten years from now, however,” she writes, “that may no longer be the case, as full autonomy for air vehicles is well within the Air Force’s reach. According to USAF officials, artificial intelligence and other technology advances will enable unmanned systems to make and execute complex decisions required for full autonomy — sometime in the decade after 2015.”

“Advances in information management, vehicles, and weapons have opened the door to highly complex applications of autonomy — with far less human intervention in the mission timeline. Threat is a driver too: Technical advances in autonomy can improve reaction time and chances for mission success in a contested or denied airspace,” she adds. “The Pentagon says full speed ahead. In November 2012, Deputy Secretary of Defense Ashton Carter issued new guidelines on autonomous weapons development. The guidelines authorized combatant commanders to incorporate more weapons systems with autonomy into operational missions.”

“The intent,” she writes, “was to pursue operational advantages and ‘allow commanders and operators to exercise appropriate levels of human judgment in the use of force,’ according to the policy directive.” Two more thumbs up, she writes, came from the Under Secretary of Defense for Acquisition, Technology, and Logistics, Frank Kendall III, and the Vice Chairman of the Joint Chiefs, ADM James Winnefeld, when they released an updated unmanned systems roadmap in 2013.

“Autonomy in unmanned systems will be critical to future conflicts that will be fought and won with technology,” the roadmap said, she notes. Autonomy refers to what the machine can do by itself. The concept started out as a way to reduce the workload of human operators by transferring partial operations to a machine process, e.g., an airplane’s autopilot mode.

Dark And Light

“Autonomy technologies stand to make a major difference in the contested battlespace, but they will be contested in the public debate as well. Increasing levels of autonomy stir controversy when they touch on deep-seated fears and values surrounding the use of force. At issue is whether repositioning the elements of human control alters the concept of legitimate action. Discomfort persists. ‘Drones are a technological step that further isolates the American people from military action,’ said law professor Mary Dudziak to The New Yorker in 2009. Intriguingly, there is a vocal group on the other side, too,” she notes. “These scientists see autonomy as a means to reduce error and enhance the legitimacy of the use of force. While some decry the growth of autonomy, others have pointed out it can subtract human weaknesses from combat. Full-scale robots ‘would be unaffected by the emotions, adrenaline, and stress that cause soldiers to overreact, or deliberately overstep the rules of engagement,’ hypothesized a California Polytechnic State University study sponsored by the Office of Naval Research. These robots could even ‘act as objective, unblinking observers on the battlefield, reporting any unethical behavior back to command headquarters.’ Taken to the extreme, autonomy theoretically enhances legitimacy. ‘Future generations may come to regard tactical warfare as properly the business of machines and not appropriate for people at all,’ noted Thomas Adams in a 2009 article for the U.S. Army War College’s journal, Parameters, reprinted in 2011.”

“A consensus on the proper roles for autonomy is lagging behind the technical possibilities. Does a pre-emptive attack against a missile launch site by an autonomous system fit the criteria? Would having human commanders set the mission parameters skate under the barrier, or does the human input have to take place within a specified time? The point is that sanctioning autonomy only as a defensive weapon will soon be too small a fig leaf. Questions about offensive weapons cannot be avoided,” Ms. Grant argues.

“One way ahead,” she argues, “could be to subject autonomous systems to blue-suit evaluation and discipline.” Writing in 2002, an Air Force Research Laboratory (AFRL) team took on the challenge of setting up autonomy metrics. The great insight was this: “We are designing algorithms, agents if you will, to replace pilot decision functions. Machines replace humans, so why not look at the human effectiveness community for metrics?” The AFRL team pointed to the OODA (observe, orient, decide, and act) loop as an obvious choice for the Air Force. But the team’s insight is broader. There’s every chance to keep ethics and efficiency in the loop.
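To make the OODA framing concrete, here is a minimal sketch of an OODA-style decision cycle for an autonomous agent. Every function name and the trivial threat rule are illustrative assumptions of mine, not drawn from the AFRL metrics work; the point is only to show where "pilot decision functions" become discrete, inspectable steps that can be measured or handed back to a human.

```python
# Hypothetical OODA-style decision cycle. All names and the simple rule
# set are illustrative assumptions, not from the AFRL work cited above.

def observe(sensor_reading):
    """Observe: collect raw data from the environment."""
    return {"contact": sensor_reading}

def orient(observation, known_threats):
    """Orient: interpret the observation against prior knowledge."""
    return observation["contact"] in known_threats

def decide(is_threat):
    """Decide: pick an action; a human-on-the-loop check could sit here."""
    return "alert_operator" if is_threat else "continue_patrol"

def act(decision):
    """Act: execute the decision and report it."""
    return decision

def ooda_step(sensor_reading, known_threats):
    """One full pass through the loop."""
    observation = observe(sensor_reading)
    is_threat = orient(observation, known_threats)
    return act(decide(is_threat))
```

Because each stage is a separate function, metrics (latency, error rate, escalation frequency) can be attached per stage, which is roughly the kind of decomposition the AFRL team's human-effectiveness analogy suggests.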

At the end of the day, “the real dilemma is not the current level of autonomous systems,” she writes. “The next applications of autonomy could greatly decrease human crew intervention in the mission timeline.”

In 2012, the Defense Science Board (DSB) completed a study on autonomy commissioned by the Deputy Secretary of Defense. The starting point was that autonomy is here to stay. “Unmanned vehicle technologies, even with limited autonomous capabilities, have proven their value to DoD operations,” according to the report, “The Role of Autonomy in DoD Operations.” The study then raised the issue of finding the appropriate cognitive level for handoffs between human control and software autonomy.

Forming Up

All current unmanned DoD systems are remotely operated; they can default to true automation “only briefly” and in “extreme circumstances, such as a lost-link condition,” the report stated. Making this observation and distinction “is important because our community vernacular often uses the term ‘autonomy’ to incorrectly describe automated operations,” the report said.

Ms. Grant argues that “the debate on autonomy is likely to heat up, as the near future holds both technological advances and mission requirements that will keep the spotlight on this” domain. The ability to “talk” among unmanned systems, dynamic tasking, activating based on target movement, going dormant when there is no activity, and camouflaging itself like a chameleon in hostile or denied areas are just some of the capabilities for unmanned systems that aren’t far away. Teams, or “swarms,” of multiple vehicles coordinating movement with little to no human intervention “is an alluring concept of operations,” notes Ms. Grant.
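The "swarm" idea can be sketched with a classic decentralized cohesion rule: each vehicle steers toward the average position of the group, with no operator command per step. This toy rule and its parameters are my own illustrative assumptions (a simplification of flocking/consensus algorithms), not anything from Ms. Grant's article or the roadmap.

```python
# Hypothetical sketch of decentralized swarm cohesion: each vehicle moves
# a fraction of the way toward the group centroid every step, so the team
# converges with no human input per step. Purely illustrative.

def step(positions, gain=0.1):
    """Advance every vehicle toward the swarm centroid by `gain`."""
    n = len(positions)
    cx = sum(x for x, _ in positions) / n
    cy = sum(y for _, y in positions) / n
    return [(x + gain * (cx - x), y + gain * (cy - y)) for x, y in positions]

# Three vehicles starting far apart converge after repeated steps,
# without any operator commands in the loop.
positions = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
for _ in range(50):
    positions = step(positions)
```

Real swarm work layers collision avoidance, task allocation, and lost-link behavior on top of rules like this; the sketch only shows why "little to no human intervention" is plausible once the coordination rule is onboard.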

Bottom Line

Ms. Grant concludes, “to be sure, there are still many technical hurdles to clear as the use and employment of autonomous systems increases. Certain key enablers must be available in order to realize the full benefits, according to DoD. The list includes mission planning that is easy to change; guaranteed precision navigation and timing; better cross-cueing by sensors, both onboard and off-board; and the major issue of how and when to disseminate data from autonomous systems engaged in a battle. Efficient use of bandwidth for data transmission is another major concern. Add in contested environments, false targets, and an information-savvy foe, and the need for autonomous information processing could grow by leaps and bounds.”

Interesting thoughts from Ms. Grant. Pete Singer, author of “Wired for War: The Robotics Revolution and Conflict in the 21st Century,” persuasively argues that an “amazing transformation is taking place on the battlefield.” Remote-controlled drones are doing everything from taking out terrorists in Afghanistan, to conducting intelligence, surveillance, and reconnaissance missions around the globe, to delivering supplies and pre-positioning military equipment.

“Science fiction on the battlefield is becoming more of a reality,” he observes. “Something big is happening on the battlefield today; and maybe in the history of humanity itself.” And, he says, “we need to remember we’re” at the stage of the Model-T Ford when comparing where we are with respect to drone technology. “Tens of thousands of robots on the battlefield is where we’re headed in future wars,” he notes. “In 25 years, if Moore’s Law holds true, those robots will be close to a billion times more powerful in their computing than today. The things we used to see in science-fiction movies now need to be discussed in the halls of power,” he argues.
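Singer's "billion times" figure depends entirely on the doubling period assumed for Moore's Law. A quick back-of-the-envelope check (the doubling periods below are my assumptions, not from Singer or the article) shows how sensitive the claim is:

```python
import math

def growth_factor(years, doubling_months):
    """Compounded capability growth under a Moore's-law-style doubling."""
    return 2 ** (years * 12 / doubling_months)

# The classic 18-month doubling compounds to roughly a 100,000x gain
# over 25 years, not a billion-fold one:
factor_18 = growth_factor(25, 18)

# A billion-fold (1e9) gain in 25 years would instead require a doubling
# roughly every 10 months:
implied_months = 25 * 12 / math.log2(1e9)
```

Either way, the compounding is dramatic; the sketch just makes explicit which assumption carries the weight of the claim.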

“When historians look back at this time period, they may well call this a Revolution in War timeframe, maybe even more profound than the invention of the atomic bomb in the 1940s. Every previous signature period in warfare: the English longbow, the invention of gunpowder, aircraft, precision-guided bombs, etc., all changed the nature and character of war. Robotics and the use of autonomous systems appear to fit into this same category.

“The use and employment of autonomous systems by the military is making the decision to engage militarily easier, and blinding us to the true costs of war. The ability to watch more but experience less puts a wrinkle between those prosecuting the war and those on the receiving end. How does the use of armed drones against al-Qaeda play out in the battlefield of ideas?” he asks. What is the message we think we are sending with these machines, versus the message actually received by the adversary?

As one Lebanese journalist put it to Mr. Singer a few years back, the use of armed drones to do our “dirty work” was “just another sign of the cold-hearted, cruel U.S. and Israeli” character: “They are cowards, because they send out machines to fight us. They don’t want to fight real men, because they are afraid to fight. So, we just have to kill a few of their soldiers to defeat them.”

Call me a skeptic when it comes to autonomous systems. It is clear that these machines will be extremely beneficial in all aspects of our lives, as well as on the battlefield and in the realm of intelligence collection. But technology isn’t going to win wars. While future advances in autonomous capabilities and their employment on future fields of conflict may prove decisive in certain battles, it is character, ingenuity, risk-taking, courage, and the warrior ethos that will remain the key ingredients to victory on the battlefields of the future. If we lose our warrior ethos, and those aspects of human emotion that make us both “love” and “hate” conflict as well as understand its terrible “price tag,” then we will inevitably lose the war. Lots to think about. V/R, RCP
