
21 February 2021

Goodhart’s Law: Why the future of conflict will not be data-driven

 ZAC ROGERS

Data is the future. Few tropes receive more uncritical acceptance. Pick any rubric of military assessment, with PMESII (political, military, economic, social, information, infrastructure) and DIME (diplomatic, informational, military, and economic) as obvious examples, and you will find analysts, operators, commanders, and civilian authorities convinced that the key to future operational and strategic efficacy lies in harnessing data. Gathering, collating, analysing, and acting on data more effectively than the competitor is central to institutional expectations of the digital age in just about any contemporary field of endeavour, military or otherwise.

As conflict in the ‘grey zone’ has blurred the traditional boundaries between cooperation, competition, and conflict, compressing time, space, lines of effort, phases, and domains, we are increasingly certain that victory lies in building bridges across data streams. Lieutenant General John Shanahan, former director of the U.S. Department of Defense’s Joint Artificial Intelligence Center, has warned against being left behind by competitors in the race to mine military advantage from data. Prime Minister Scott Morrison echoed these assertions in his vision for Australia’s economic future. Few question the assumption that data is the key to unlocking strategic success.

But is it true?

‘Data’ is a recorded digital abstraction of a state of the world. It has a binary structure: bits represented as ones and zeros. For any state of the world to be recorded, stored, and transmitted by a digital instrument, it must be abstracted into this structure. First problem: reality is not digital. Nor is it running a digital operating system.
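
To make the abstraction concrete, here is a minimal sketch in Python of what recording entails, assuming an invented sensor range and bit depth: a continuous reading is forced into a fixed number of bits, and the residue is discarded.

    # A continuous reading must be squeezed into a fixed number of bits;
    # whatever falls between the representable levels is lost at capture.
    # The range and bit depth below are illustrative assumptions.
    reading = 21.73456901   # an analogue temperature, arbitrarily precise
    LEVELS = 256            # an assumed 8-bit sensor
    LO, HI = 0.0, 50.0      # assumed sensor range

    code = round((reading - LO) / (HI - LO) * (LEVELS - 1))  # the stored bits
    restored = LO + code / (LEVELS - 1) * (HI - LO)
    print(code, restored, reading - restored)  # 111, ~21.765, the residue is gone for good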

Whatever state of the world we are interested in, whether biological, social, political, economic, or psychological, it may in theory be describable as executing a digital operating system, but it is not one. Any resemblance to such an operating system is not evidence of its existence. Nature, uniquely, seems able to slide between binary, ternary, quantum, and analogue modes seamlessly. Humankind’s chief scientific and technological achievement so far has been to build machines with the rigidly digital operating systems we call computers.

Defence scientists and technologists excel at modelling states of the world. For that reason, they love data. Its structure renders states of the world as mathematically computable, which is to say, modellable. More data begets more models; more modelling begets the drive for more data. More data and better models, applied appropriately to contemporary forms of conflict, are expected to lead to better outcomes for the warfighter. We say we understand the limits of knowledge within complexity, but we gather and model nonetheless, seeking better understanding and, hopefully, better outcomes.

The second problem is that defence scientists are interested in states of the world which are not digital, and which, when recorded digitally, include an abstraction we cannot fully account for. So what do these models actually tell us? Defence scientists look for ‘invariances’ within mathematical models: opportunities to steer systems in favourable directions despite adversarial conditions like those expected on the battlefield. But the level of abstraction involved is problematic. Invariance is the ‘property of constancy despite changes in the conditions of measurement.’ Yet when we analyse big data for clues about how systems will act, the conditions of measurement are already distorted. Is there a flaw in the data-driven paradigm we might have overlooked? And by overlooking it as we accelerate towards a data-driven future, are we making it worse?

Goodhart’s Law

As originally formulated, Goodhart’s Law states that ‘any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.’ The anthropologist Marilyn Strathern put it in simpler terms: ‘when a measure becomes a target, it ceases to be a good measure.’
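
The collapse the law describes is easy to reproduce in a toy simulation. The Python sketch below assumes a latent ‘true’ value, a noisy metric correlated with it, and agents who can inflate the metric at some cost to the underlying value once the metric becomes a target; every coefficient is an illustrative assumption, not an empirical claim.

    import random

    random.seed(0)
    N = 10_000

    def pearson(xs, ys):
        # Plain Pearson correlation, stdlib only.
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in ys)
        return cov / (vx * vy) ** 0.5

    # Latent true value of each unit, never directly observable.
    true_value = [random.gauss(0, 1) for _ in range(N)]

    # Passive observation: the metric is the true value plus measurement noise.
    observed = [v + random.gauss(0, 0.5) for v in true_value]
    print(f"before targeting: {pearson(true_value, observed):.2f}")

    # The metric becomes a target: each unit diverts effort into inflating
    # the metric, at some cost to the underlying value it once tracked.
    gaming = [abs(random.gauss(0, 2)) for _ in range(N)]
    metric = [v + g + random.gauss(0, 0.5) for v, g in zip(true_value, gaming)]
    value = [v - 0.3 * g for v, g in zip(true_value, gaming)]
    print(f"after targeting:  {pearson(value, metric):.2f}")

On a typical run the correlation falls from roughly 0.9 to about 0.3: the observed regularity collapses under control pressure, exactly as the law predicts.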

In our case, the data obsession has turned our measuring methodology into the target of our efforts. Everything is aimed at generating, gathering, and analysing more data, a task we have long since handed off to machine learning and other forms of fast processing. But before we even reach the issues created by AI, which dominate so much of the contemporary discussion, a deeper problem with data has gone largely unseen.

Why does Goodhart’s Law matter?

Any measuring tool or methodology is inevitably incomplete. Kurt Gödel’s incompleteness theorems showed as much for formal mathematical systems. By allowing the measure to become the target of our efforts, that incompleteness gets magnified.

Being data-driven risks allowing everything we don’t understand about the state of the world we are measuring to creep in unseen. We won’t anticipate these creeping effects because, in sum, they are not in the data, and the data is the only place we think to look. This is the warning Goodhart’s Law carries, and it seems prescient in any number of contexts. In military and strategic affairs, it is profound.

Data is the language of machines; it is the conduit through which machines witness the world. This points to the context in which data gathering and analytics belong: assigned and specified problems, in which the abstraction inherent in digital rendering does not compound uncertainty. That is where data-driven planning and analysis belong.

Imagine a fleet of vehicles with sensors gathering data about wear and tear on critical components. The degree of wear in a brake pad or drive belt can be captured in abstract binary data without losing critical information or distorting the context of that state of the world. Gathered and analysed at scale, the captured data can express correlations between states of the world that make data-driven predictive maintenance of those vehicles more efficient and accurate than existing methods.
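
A minimal sketch of that case, using an invented wear log and an assumed linear wear model, shows why it works: the measured quantity and the quantity we care about are one and the same.

    # Hypothetical wear log for one vehicle: (odometer km, pad thickness mm).
    wear_log = [(10_000, 11.2), (25_000, 9.5), (40_000, 7.9), (60_000, 5.6)]

    # Least-squares fit of thickness against distance driven.
    n = len(wear_log)
    mean_km = sum(km for km, _ in wear_log) / n
    mean_mm = sum(mm for _, mm in wear_log) / n
    slope = (sum((km - mean_km) * (mm - mean_mm) for km, mm in wear_log)
             / sum((km - mean_km) ** 2 for km, _ in wear_log))
    intercept = mean_mm - slope * mean_km

    REPLACE_AT_MM = 3.0  # assumed service threshold
    km_at_limit = (REPLACE_AT_MM - intercept) / slope
    print(f"schedule replacement near {km_at_limit:,.0f} km")  # about 83,000 km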

The same cannot be said for many other types of system. In general, and contrary to popular philosophies like those depicted in IF THEN: How the Simulmatics Corporation Invented the Future, systems in which information about states of the world involves human reflexivity do not present assigned and specified metrics.

The contrast with vehicle components is easy to see. The state of wear on a brake pad can be expressed on a scale of 1–5 without losing or distorting too much. The state of anger, fear, frustration, loyalty, pride, or enmity within a group of people cannot be expressed in a similar way without introducing radical distortion. Averaging out the data to produce a probability metric, so that a mathematical model can be built, does not tell us about the state of the world we are observing. It tells us about the state of our data, and the state of our methodology.
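
A small worked example makes the distortion visible. The scores below are invented for illustration; the point is only that two very different group states collapse onto the same summary statistic.

    from statistics import mean, stdev

    # Hypothetical 'hostility' scores on a 1-5 scale for two groups.
    calm_group = [3, 3, 3, 3, 3, 3]   # uniformly indifferent
    split_group = [1, 1, 1, 5, 5, 5]  # half loyal, half hostile

    print(mean(calm_group), mean(split_group))    # 3 and 3: identical averages
    print(stdev(calm_group), stdev(split_group))  # 0.0 vs ~2.19: the hidden split

The average reports the two groups as identical; everything that matters about the second group lives in the distribution the averaging threw away.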

Warning signs

In the parallel context of commerce, these data-driven tools and methods are being exposed as dubious. The attention-harvesting ad-tech market is heading for collapse, likely wiping out billions of dollars in stock-market valuations and destroying an entire business model.

Behind the commercial, political, and legal drama, the knowledge fields that inform and legitimise these practices receive little public scrutiny. But it has long been known that psychometrics was ‘born to be abused.’ And the science of human attention is not settled. In military and strategic affairs, where some of the same tools and methods are being experimented with and studied within defence science communities, a warning beacon is flashing red.

A common misunderstanding is that humans and machines can be integrated in ways that optimise the best features of both. On closer inspection, this assumption relies on data and modelling as the bridge. But when we mix the contexts in which humans and machines witness reality without acknowledging the epistemic gap between them, uncertainty enters the frame unbridled.

If data is to have a productive future in the art and science of human conflict, it will be because we learn the appropriate ways to keep it in its lane. That lane may be narrower than many think. To exploit its utility, data-driven modelling must be woven together with an expanded approach to knowledge in and of human conflict. The problem highlighted by Goodhart’s Law is that the digital age tends to delimit our approaches, not expand them.

When Clausewitz evoked the trinity in matters of war – reason, passion, and chance – he observed that war could never be made fully subordinate to policy, owing to the interplay of these seminal elements. Nothing offered by data and modelling alone alters this axiom. In fact, the risk is that they introduce even greater uncertainty for the warfighter. Technology creates as many wards of disorder as it does masters.

About the Author: Dr Zac Rogers is Research Lead at the Jeff Bleich Centre for the US Alliance in Digital Technology, Security, and Governance at Flinders University of South Australia. His research combines a traditional grounding in national security, intelligence, and defence with emerging fields of social cybersecurity, digital anthropology, and democratic resilience. Find him on Twitter and LinkedIn.
