
14 June 2018

Modifying Situational Awareness: Perfect Knowledge and Precision Are Fantasy

John Q. Bolton

Army Mission Command, significantly influenced by German concepts of Mission Orders, Auftragstaktik, Schwerpunkt, and the Truppenfuehrung (the Wehrmacht’s WWII field manual), emphasizes subordinate initiative within the framework of commander’s intent.[1] Combined with the Army Operating Concept, Mission Command reflects a now-codified common-sense approach to command in a complex environment. However, fully implementing Mission Command within the Army remains a challenge on both conceptual and practical levels. Conceptually, leaders fail to understand how to develop the mutual trust Mission Command requires while subordinates resent any oversight as micromanagement.[2] Practically, Army systems inhibit Mission Command by demanding precision and instantaneous results.

Whether a Prussian/German system is appropriate for an army serving a multi-ethnic, diverse democratic society is another debate; this paper is concerned with our modern army’s fascination with statistics, numerical precision, and “Information Dominance.” The Army’s devotion to analytics, particularly as demonstrated by Digital Mission Command Systems (DMCS) like the Command Post of the Future (CPOF), places undue emphasis on data and inhibits the exercise of Mission Command as described in doctrine.[3] This leads to overemphasizing data and systems to the detriment of analysis and context. Using DMCS as a panacea, rather than as a means to enhance Mission Command, we expect our digital systems to derive precision from an imprecise, complex world, which inevitably causes frustration and failure. Combined with American educational heuristics, our systems do not prepare us for battlefield chaos.

This paper analyzes how the Army’s bureaucratic mindset, educational heuristics, and focus on big data negatively affect the development of Situational Awareness.[4] It argues that the bureaucratic mindset common throughout the Army and resident in DMCS presumes an ability to quantify the world based on faulty deterministic assumptions. After illustrating the challenges associated with DMCS, this paper concludes by describing an alternative framework Soldiers and leaders can use to understand their operational environment and gain Situational Awareness.

Bureaucratic Mindset

Machines don't fight wars. Terrain doesn't fight wars. Humans fight wars. You must get into the mind of humans. That's where the battles are won. - John Boyd

While the Army espouses Mission Command, its systems for managing, tracking, and commanding are overwhelmingly bureaucratic. Sometimes this bureaucracy makes sense, but on the whole it is pernicious to leader development. For example, except at the local level, officer assignment choices are very limited. The personnel system prescribes career paths, which may actually curtail critical thinking across a career.[5] It also reduces, as a matter of convenience, officers to a series of data points—to be interchangeably managed by a revolving series of career managers.

Precision and exact numbers are bureaucratic tenets. Although ADRP 6-0 acknowledges that human exchanges, not data, are critical to success, a bureaucratic mindset still permeates both doctrine and operations.[6] Additionally, “the fact remains that the Army’s staff training, exercises, and evaluations are based on [adhering] to processes and doctrine rather than attain[ing] rapid and decisive results.”[7]

This paradigm inhibits rapid decision-making by forcing micromanagement onto organizations yearning for Mission Command. The resulting cognitive dissonance creates resentment because it destroys the trust that GEN Dempsey called “the moral sinew that binds our force together.”[8] Like adherence to deterministic theories, Army pathologies foster a “fear of uncertainty and a squeamish aversion to risk, each of which is anathema to a true mission command philosophy.”[9] Conversely, building implicit trust, while requiring time, can build self-actuating teams based on a shared understanding.[10]

Education vs. Reality: The Natural World Doesn’t Bend to Our Will

The need to quantify and codify everything reflects a pernicious trait of American education. Americans habitually break everything down into parts, assuming that the parts act as composite elements working together. We assume we can quantify everything. Americans routinely ignore confirmation bias and imprint our methods onto adversaries who do not man, train, or equip forces the same way.[11] Our metrics focus on what matters to us, not the enemy.

The American military focuses on equipment and troops when the enemy may employ civilians and homemade bombs; we develop hierarchical network charts when the enemy operates along tribal and family circles. This is a tenet of the American Way of War.[12] In Vietnam, analog computers would confidently declare a village 35% pacified, data that, even if somehow accurate, reflected a startling lack of understanding about how local conditions and human actors relate.[13] Now, in Afghanistan, we conduct assessments based on remote sensing, third-party accounts, and, often, conjecture in order to validate assumptions (or desires).

Americans leave school accustomed to physical models largely developed in the late 19th century. Newton gives us simple rules: force is mass times acceleration; gravity is the attraction between point masses. These rules and models are simple, easy, and wrong; our education presumes a determinism that does not exist. Models work well for mechanical systems because we control the environment, reducing chance and friction. But with human systems we don’t have this luxury. We may seek to “operationalize big data,” but doing so typically requires environments with predictable conditions and well-defined rules (think Moneyball), not the chaos of combat.
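In the notation most students carry out of school, the two “simple rules” above each fit on a single line:

\[ F = ma, \qquad F = \frac{G\,m_1 m_2}{r^2} \]

where the symbols are the familiar textbook quantities (mass, acceleration, the gravitational constant, and the distance between two point masses). The tidiness of these expressions is precisely what makes them seductive as mental models for messier, human systems.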



Certainty rarely exists in the real world, particularly against a thinking, adaptive enemy shrouded by the fog of war. The natural world reveals how quickly simplicity becomes complex and how friction confounds analysis. A simple spring-mass system has a linear solution, solvable at the high-school level. Adding just one more element to the system, however, creates a much more difficult problem because the interactions between elements are now complex. Likewise, while 17th-century physicists developed ways to predict the motion of two bodies such as the Earth and the Moon, adding the Sun creates a problem with no general closed-form solution. While computers can compute accurate numerical results, interpreting them requires human expertise.
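As a worked illustration of how quickly that tractability disappears, the unforced spring-mass system mentioned above reduces to one linear equation with a closed-form answer:

\[ m\ddot{x} + kx = 0 \quad\Longrightarrow\quad x(t) = A\cos\!\Big(\sqrt{k/m}\;t + \phi\Big) \]

with the amplitude A and phase φ fixed by the initial position and velocity. Couple a second mass to the first and the algebra becomes markedly harder (the motion must be untangled into normal modes); add the Sun to the Earth-Moon pair and no general closed-form solution exists at all, leaving only numerical approximation.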

The real world, to say nothing of thinking adversaries, does not conform to our designs. It more often resembles a fractal. Fractals result from repeated simple equations, creating phenomenal shapes. Zooming in on a fractal reveals unending complexity.[14] With fractals—like reality—as we seek to understand more, we are certain about less.
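The point about repeated simple equations is easy to make concrete. The sketch below (Python, purely illustrative; the endnote for this passage points to the Mandelbrot set) iterates the single rule z → z² + c and prints a coarse text rendering of the set. The grid spacing and the 100-iteration cap are arbitrary choices, not anything specified in the article.

```python
MAX_ITER = 100  # arbitrary cap on how long we iterate before declaring a point "bounded"

def escape_time(c: complex, max_iter: int = MAX_ITER) -> int:
    """Count how many iterations of z -> z*z + c stay bounded (|z| <= 2)."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return n
    return max_iter

# Coarse text rendering: '#' marks points that never escape within the cap.
for im in range(20, -21, -2):
    row = ""
    for re in range(-40, 21):
        row += "#" if escape_time(complex(re / 20, im / 20)) == MAX_ITER else " "
    print(row)
```

Shrinking the window onto any part of the boundary and re-running the loop reveals still more structure, which is the unending complexity the paragraph describes: one trivial rule, endlessly intricate results.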


A common theme in complex systems is their sensitivity to initial conditions. Small differences create large, unpredictable results. Because American education teaches simplified models of the world, we become frustrated when things “unfold in an irregular, disorderly, unpredictable manner even though some of our best minds try [to make them] more regular, orderly, and predictable.”[15] But non-linearity, chaos, and unknowns combine to make clear that “general friction will persist more or less undiminished in future war regardless of technological developments.”[16]
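That sensitivity is simple to demonstrate. The minimal sketch below uses the logistic map, a standard textbook example of chaos (not one the article itself names): two trajectories that start one millionth apart track each other briefly and then diverge completely.

```python
def logistic(x: float, r: float = 4.0) -> float:
    """One step of the logistic map x -> r * x * (1 - x); r = 4 is chaotic."""
    return r * x * (1 - x)

a, b = 0.200000, 0.200001  # initial conditions differing by one millionth
for step in range(1, 41):
    a, b = logistic(a), logistic(b)
    if step % 10 == 0:
        print(f"step {step:2d}: a = {a:.6f}  b = {b:.6f}  gap = {abs(a - b):.6f}")
```

By roughly the twentieth step the two runs share nothing but their governing equation, which is the practical meaning of “small differences create large, unpredictable results.”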

The Army is accustomed to demanding specifics regardless of real-world complexity. Plans often lack context (particularly cultural context) and expect precise numerical results.[17] So while Army planners speak of synchronization and the simultaneity of effects, the environment inevitably frustrates both. We can synchronize mechanical clocks; people are more difficult. In this context, the issues with Army DMCS become clear.

Mission Command Systems: Our Computers Lie to Us and We Like It

We know how cruel the truth often is, and we wonder whether the delusion is not more consoling. - Henri Poincaré

Battlefield intelligence is fundamentally inductive: we see only bits of the enemy, small units or small effects. This forces us to synthesize the enemy’s intentions from composite parts and actions, all of which are unclear. But DMCS are deductive: they start from a big picture and work toward smaller details. DMCS force us to define the broad conditions and, critically, assumptions about the enemy before we even see him. As a result, we frame assumptions implicitly, without evidence. This framing restricts our conceptual ability and limits our imagination with regard to the enemy’s capacity, intentions, and actions.

We will require knowledge of foreign languages, cultures, religious beliefs, and above all history—precisely what technocrats ignore because such knowledge cannot be quantified and measured. What matters most in war is what is in the mind of one's adversary.[18] – Williamson Murray, Military Historian

Data emerging on screens drives immediate action, not analysis, because it comes across authoritatively. But this is exactly the opposite of what complex situations require, even though Army doctrine proposes a linear progression from data to understanding. That methodology is problematic because it assumes data is precise, accurate, obtainable, and useful. FM 6-22 Army Leadership (2006) recommends leaders spend time analyzing situations to determine what the real problem is. Leaders should examine a “problem in depth, from multiple points of view,” without settling on the first answer that comes to mind.[19] Data may create a picture, but it does not generate understanding, just a false sense of knowing. Understanding is more important; developing technology faster than we develop people is dangerous.[20]


Figure: Achieving Understanding/Awareness as Defined by Army Doctrine.[21]

Though the Army has always loved data, it evolved into an obsession in the 1990s, after victory in the Gulf War and emerging technology caused some to believe that we could achieve Information Dominance, in effect knowing everything. Military leaders, defense analysts, and even some scholars let hubris get the best of them, believing that technology had rendered “history, culture, and the traditional understanding of war irrelevant;” serious scholars echoed this ahistorical judgment.[22] They ignored history and proposed that new technology had created a Revolution in Military Affairs (RMA). According to LTG McMaster, “Concepts with catchy titles such as ‘Shock and Awe’ and ‘Rapid, Decisive Operations’ promised fast, cheap and efficient victories in future war.”[23] One of the strongest RMA advocates was Vice Chairman of the Joint Chiefs Admiral William Owens, who proposed systems that would somehow eliminate the fog of war.[24] Owens echoed the failed technology-driven policies of the McNamara Defense Department, which created boondoggles like the F-111 and Igloo White, to say nothing of the hubris that escalated the war in Vietnam.[25] During the RMA peak of the 1990s and 2000s, the Army poured billions into the Future Combat System, CPOF, and other systems, some of which were canceled and all of which were or are less than advertised.[26]

Combat operations are always a gamble, and we need to rely on the gamblers, not the dice.[27] – COL Mike Pietrucha

In reality, these systems play to our biases, declaring “Situational Awareness” when we only know the positions of our own forces with certainty. No matter their actual effectiveness, these DMCS speak with authority, giving false confidence that “the system we are using is the most efficient.”[28] DMCS concepts rely on the presumption that we can “eliminate the fog of war and obliterate friction with the ‘seamless’ application of some new technology.” As simplifying heuristics that fail in the real world, our DMCS predispose us to frustration when confronted with the “reality of warfare when it shows up, shrouded in smoke, beset by friction, and showered in uncertainty.”[29] This emphasis is wrong: the Army is focusing on unproven or undeveloped technology when it should be focusing on training our “people with dynamic scenarios that will reveal both their knowledge of the processes and their willingness to think beyond the checklist.”[30] Focusing on technology inhibits Mission Command.

Even if achievable, Information Dominance was always a misleading goal. Knowing the battlefield does not necessarily translate into success against an active enemy because merely possessing information “is not actually an indication of superiority over an adversary; information is not so much an end in itself as one means among others.”[31] Our systems rarely addressed the pitfalls of too much data. According to COL Mike Pietrucha, an Air Force strategist, “Machines may help categorize what is possible, which is a long way from determining what is correct. Warfare is not an optimization problem.”[32] What matters is our ability to train agile and adaptive leaders who succeed regardless of technology.

While the Army rolled through Iraqi defenses in both 1991 and 2003, technology only exacerbated the differences between American and Iraqi forces; what won the day was competence.[33] Rapid success cemented the supposed preeminence of American forces, but this was more the exception that proved the rule than the herald of a new form of warfare. The static positions adopted in Iraq afterward deepened an addiction to data. Operating from fixed sites with unlimited bandwidth against an overmatched enemy entrenched a reliance on connectivity that still challenges Army units.[34] Once the enemy adjusted to American systems, rhythms, and limitations, our technical superiority didn’t count for much as troops found themselves fighting an ambiguous, lethal enemy hiding among the people.

Conclusion

It’s very difficult to dispense ignorance if you retain arrogance. – GEN Sam Wilson

How do we respond to a battlefield where error, incompleteness, entropy, quantum uncertainty, and human fickleness combine with non-linear, complex systems to create unknowing? To paraphrase President Reagan, “it’s a simple answer after all”: the answer lies in the principles of Mission Command, particularly “building teams through mutual trust” and “creating shared understanding.”[35] Mission Command is not a checklist method. It relies on acceptance of an imperfect, unclear world.[36] It requires substantial trust and understanding between echelons: personal, substantive trust. Current systems cannot replicate this implicit trust, and may often destroy it.[37] Implicit guidance and trust, though harder to develop, can enable unit action much more quickly than even the best digital systems.

The Army has forgotten that conflict is chaos. Uncertainty is warfare’s prevalent characteristic. The Army must structure systems and relationships to foster implicit guidance and initiative, rather than collect and demand data. Relying on DMCS and rigid paradigms paralyzes leaders when the displayed information doesn’t correspond with reality. Like a physics student encountering real-world friction for the first time, we may fail to translate our education into real-world usefulness.

The Army must develop a broader conception of Situational Awareness, one that allows for fog and friction and leaves room for our understanding to change based on conditions, not preconceptions. Making Soldiers’ perception broader and more deliberate will increase the Army’s capability to deal with uncertainty and disorder. Situational Awareness is understanding that allows us to rematch our perceived understanding with events, a continuous reorientation process.[38] Commanders create and sustain shared understanding through collaboration and dialogue within their organizations to facilitate unity of effort. Situational Awareness is therefore a fluid understanding of the environment, reflecting less the discrete knowns than the deeper facets of the enemy and the human terrain. The Army must stop insisting on precision information at the expense of broader understanding. In short, Army training and systems must be comfortable with not knowing and with acting without perfect knowledge.

Digital systems can only augment this process, not replace it. The Army should look to “fix” DMCS by eschewing bloated software for traditional, faster, and cheaper analog methods, augmented only when DMCS provide clear benefits. Unit training must focus on preparing Soldiers for complex environments where they will make choices with imperfect information and only vague instructions.[39] Leader training must require officers to build teams and give clear guidance so subordinates can act without instruction. We must continually “rematch our mental/physical orientation with [the] changing world so that we can continue to thrive and grow in it.”[40] Through broad observations and continuous reorientation by astute leaders schooled in the principles of Mission Command, we can discern the enemy’s intentions and accustom ourselves (and our plans) to his actions, enabling success far beyond the promises of technical solutions.

The opinions expressed in this article are the author’s and not necessarily those of the U.S. Department of Defense or U.S. Army.

End Notes

[1] Daniel J. Hughes, “Abuses of German Military History,” Military Review 66, no. 12 (December 1986): 66-76; US Army, ADRP 6-0 Mission Command (Washington DC: US Army, May 2012), 1.

[2] Alan Hastings, “Combating Cynicism in the Ranks: The Need for Critical Thinking in Professional Dialogue,” August 27, 2017, https://www.thetacticalleader.com/blog/combating-cynicism-in-the-ranks

[3] John Bolton, “Overkill: Army Mission Command Systems Inhibit Mission Command,” Small Wars Journal, August 29, 2017, accessed October 31, 2017, http://smallwarsjournal.com/jrnl/art/overkill-army-mission-command-systems-inhibit-mission-command.

[4] Though Situational Understanding is the doctrinal term, Situational Awareness is more common.

[5] Benjamin Ray Ogden, LTC, USA, “Butter Bar to Four Star: Deficiencies in Leader Development,” Joint Force Quarterly 87, no. 4 (October-December 2017): 46-53.

[6] ADRP 6-0, 3.

[7] Thomas Rebuck, quoted by Best Defense from Mission Command: The Who, What, Where, When and Why, ed. Donald Vandergriff and Stephen Webber (CreateSpace: May, 2017), accessed September 24, 2017, http://foreignpolicy.com/2017/09/12/book-excerpt-why-the-u-s-army-cant-do-mission-command-even-when-it-tries-to.

[8] Martin Dempsey, “Mission Command” (Joint Chiefs of Staff, 3 April 2012), 6, accessed 18 July 2017, http://www.jcs.mil/Portals/36/Documents/Publications/missioncommandwhitepaper2012; Hastings, “Combating Cynicism in the Ranks.”

[9] Thomas Rebuck, quoted by Best Defense from Mission Command.

[10] Michael P. Ferguson, “The Mission Command of Islamic State: Deconstructing the Myth of Lone Wolves in the Deep Fight,” Military Review 63, no. 5 (September-October 2017): 68-77.

[11] Aaron B. O’Connell, “Our Latest Longest War,” in Our Latest Longest War, ed. Aaron B. O'Connell (Chicago: University of Chicago Press, 2017): 1-25.

[12] Russell Weigley, The American Way of War (Indiana University Press, 1977), 337.

[13] The Vietnam War, “This Is What We Do,” directed and written by Ken Burns and Lynn Novick, PBS, August, 2017.

[14] Images from the Mandelbrot Set.

[15] John Boyd, “Conceptual Spiral,” ed. Chet Richards and Chuck Spinney, November, 2011, accessed November 1, 2017, http://pogoarchives.org/m/dni/john_boyd_compendium/conceptual-spiral.

[16] Barry D. Watts, "Clausewitzian Friction and Future War" (Washington DC: National Defense University, October 1996), 112.

[17] Aaron MacLean, “Liberalism Does Its Thing,” in Our Latest Longest War, 85-103.

[18] Williamson Murray, “Clausewitz Out, Computer In,” The National Interest, June 1, 1997, accessed October 18, 2017, https://www.clausewitz.com/readings/Clause&Computers.htm.

[19] US Army, FM 6-22 Army Leadership (Washington DC: US Army, October 2006), 6-1.

[20] Dr. Nicholas Krohley, “Human Analysis in the Age of the Algorithm,” Modern War Institute Podcast. October 11, 2017, https://mwi.usma.edu/category/podcasts/.

[21] US Army, FM 6-0 Commander and Staff Organization and Operations (Washington DC: US Army, May 2014), 3-1.

[22] Murray, “Clausewitz Out, Computer In.”

[23] H.R. McMaster, “Thinking Clearly about War and the Future of Warfare – The US Army Operating Concept,” International Institute for Strategic Studies, October 23, 2014, accessed October 31, 2017, http://www.iiss.org/en/militarybalanceblog/blogsections/2014-3bea/october-831b/thinking-clearly-about-war-and-the-future-of-warfare-6183

[24] SourceWatch, “William A. Owens,” August, 2008, accessed October 31, 2017, https://www.sourcewatch.org/index.php/William_A._Owens; Murray, “Clausewitz Out, Computer In.”

[25] John Correll, "Igloo White," Air Force Magazine 87, no. 11 (November 2004): 56-61.

[26] Christopher Drew, “Conflicting Priorities Endanger High-Tech Army Program,” The New York Times, July 19, 2009, accessed October 31, 2017, http://www.nytimes.com/2009/07/20/business/20combat.html

[27] Mike Pietrucha, “Living With Fog and Friction: The Fallacy of Information Superiority,” War on the Rocks, January 7, 2016, accessed October 20, 2017, https://warontherocks.com/2016/01/living-with-fog-and-friction-the-fallacy-of-information-superiority.

[28] Anonymous, “Systems that Strangle,” The Military Leader, accessed October 31, 2017, https://www.themilitaryleader.com/systems-that-strangle.

[29] Pietrucha, “Living With Fog and Friction.”

[30] Anonymous, “Systems That Strangle.”

[31] Pietrucha, “Living With Fog and Friction.”

[32] Ibid.

[33] Stephen Biddle, “Victory Misunderstood: What the Gulf War Tells Us about the Future of Conflict,” International Security 21, no. 2 (Fall 1996): 139-179, accessed October 20, 2017, https://muse.jhu.edu/article/447442.

[34] Bolton, “Overkill.”

[35] ADRP 6-0, 2-1.

[36] David Caligari, “Trusting Imperfection: Getting Mission Command to Succeed,” Grounded Curiosity, accessed November 1, 2017, http://groundedcuriosity.com/trusting-imperfection-getting-mission-command-to-succeed.

[37] Bolton, “Overkill.”

[38] John Boyd, “Destruction and Creation,” September 3, 1976, accessed November 1, 2017, http://pogoarchives.org/m/dni/john_boyd_compendium/destruction_and_creation.pdf.

[39] Pietrucha, “Living With Fog and Friction.”

[40] Boyd, “Destruction and Creation.”

John Bolton is the Deputy G3 for Train Advise Assist Command-East (4/25 IBCT (A)). He is a graduate of the Command and General Staff College’s Art of War Scholars Program and holds degrees from West Point and American Military University. His assignments include 1st Engineer Battalion and 1-1 Attack Reconnaissance Battalion with deployments to Iraq and Afghanistan. The views presented here are his alone and not representative of the U.S. Army, the Defense Department, or the U.S. government.
