4 April 2016

READ MY MIND: WHY IT’S HARD TO SEE THINGS FROM THE ENEMY’S POINT OF VIEW

MARCH 31, 2016

Earlier this year, we marked the 25th anniversary of the Gulf War and the beginning of 25 years of continuous U.S. military operations in the Middle East. Many forget how this involvement began: Saddam Hussein invaded Kuwait because he thought that he could get away with it. Hussein, however, was wrong. An international coalition led by the United States pushed his forces out of Kuwait. Was the problem a failure of communication? And if so, was it possible to credibly communicate to Hussein that the United States would respond with force if he invaded Kuwait?

While some would argue that better cultural understanding would have made the difference, it is likely that the strategist will always struggle to get inside the enemy’s head. The challenge for any analyst or practitioner of strategy lies in coping with the reality that strategy nonetheless demands that we try.

Strategy and Other Minds


As records collected by the National Defense University indicate, Western and Iraqi perceptions of the Gulf War differed dramatically. In those records, Hussein argues that, merely by obtaining a cease-fire, he had won the war. However, whether there was a failure to clearly communicate U.S. intentions to Hussein remains controversial. One interpretation is that the George H.W. Bush administration had not reached an internal consensus about what to do about Hussein, and American diplomatic communication reflected this. Another argument is that the Bush administration was far more concerned with mollifying Hussein’s grievances than credibly deterring him. Because of a paucity of documentary evidence, this question is unlikely to ever be conclusively resolved.

These types of problems, many argue, reflect dramatic levels of cultural and regional ignorance within the United States government. That much cannot be disputed; the United States government, due to a variety of mostly self-inflicted failures, cannot mobilize the necessary knowledge and expertise to handle modern security problems. We should nonetheless doubt that all of the cultural expertise in the world will help us figure out what our enemies are thinking. To be clear: Cultural understanding is of the utmost importance. But when we move beyond the Hussein example, we can see that cultural understanding is not a silver bullet.

The fallacy to avoid is the notion that, if only we collected more information and had more culturally literate experts, we would not be surprised — or at the very minimum, would be surprised far less than we are today. There are many reasons to doubt the utility of more information and more experts, some of which lie in the nature of perception and misperception in international politics. One particular problem is that dictators like Hussein have incentives to misrepresent or conceal relevant information. The perfect political manipulator should be difficult if not impossible for opponents and potential opponents to understand, explain, and predict. Hussein, after all, may have sealed his own fate by simultaneously attempting to project strength to his Iranian adversaries and denying that he had weapons of mass destruction (WMD). It is an exaggeration to say that post-2003 interviews reveal that he wanted the world to think that he had WMD. But the Iraqi dictator was undoubtedly engaged in a futile balancing act between the need to deceive the Iranians about his true military capabilities and the equally pressing need to reveal enough of the truth about those capabilities to the rest of the world to stave off an American invasion.

It should go without saying that even without all of this calculation and complexity, the closed and secretive nature of the Iraqi regime makes any kind of political or military analysis of what a dictator might be up to highly difficult. Even the best intelligence assessments of regime intentions and decision-making often amount to little more than crude guesses.

The problem thus only partially involves cultural and regional knowledge. Lawrence Freedman argues that the capacity for mentalizing — making decisions based on the goal, the environment, the opponent, and what we believe the opponent believes about us — is a key element of strategy. To truly put ourselves in the shoes of a dictator like Hussein, we have to somehow reason about how Hussein considers all of these matters (and more). And in this case, the challenge may be less about considering every relevant factor than about knowing the right place to stop. We can learn much about this problem from what psychology, cognitive science, and artificial intelligence say about why thinking about complex social situations and the minds of others is so difficult.

Just Folks: Folk Psychology and Higher-Order Belief

Philosophy, cognitive science, and artificial intelligence all study the complexity of knowledge and belief. Much of this literature is rooted in folk psychology, the idea that we have crude internal mental models of ourselves and the people we interact with. Folk psychology is pervasive in the everyday language we use to make sense of the world.

For example, when Afshon Ostovar states that Iran “gave back the sailors,” he is engaging in the common folk psychology practice of regarding a group (Iran) as an agent. Surely not all of the Iranians — which would include every Iranian from a baby cooing in the delivery room of a Tehran hospital to Foreign Minister Javad Zarif himself — are engaged in the process of giving back the detained American sailors. Nor does Ostovar mean that both Iranian elites and Iranian college students are “dealing with the aftermath of a historic diplomatic agreement.”

Even when Ostovar describes the narrower and less heterogeneous Iranian Revolutionary Guard Corps (IRGC), the same pattern crops up. The IRGC “[does] not want a moderation in foreign policy,” and “sees [the Persian Gulf] as an active combat zone.” What does it mean for a group to “see” something? Can a group (as opposed to its elites or its rank and file) “want” something? Hence, folk psychology crops up all the time, both as a simplification that we use in everyday speech and discourse, and as a legal concept that regards certain kinds of group agents (corporations and states) as being equivalent to and/or understandable as individuals.

Informally, much of folk psychology deals with the way in which we can predict and explain behavior. Sometimes we make recourse to what someone else “believes” in order to do so. If Country X “believes” that Country Y believes that X is hostile, X might have good reason to expect Y to act based on that belief. Of course, the fact that we can have beliefs about what other people believe suggests a somewhat recursive quality to human thinking. And this is where theories about higher-order beliefs come in. Interpersonal interaction likely involves sophisticated reasoning about other people’s reasoning.

Any kind of military decision, for example, is a wager about what the enemy will do. But it is hard to predict what the enemy will do without taking into account what the enemy believes we will do, and so on. This can get pretty complex, and the challenge — given that all of this amounts to a guess about what we cannot really know until after the fact (if ever, in many cases) — is to make it tractable in our everyday reasoning.
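To make the recursion concrete, here is a minimal sketch of one way such nested beliefs can be represented in code. The Belief class, the country labels, and the propositions are illustrative assumptions of this gloss, not anything drawn from Freedman or the Gulf War record; the point is simply that each added level of “X believes that Y believes that…” is another layer of nesting a strategist has to reason over.

```python
# Minimal, illustrative sketch: a belief whose content can itself be a belief.
# The class and the example propositions are hypothetical, chosen only to show
# how higher-order beliefs nest.

from dataclasses import dataclass
from typing import Union


@dataclass
class Belief:
    """An agent's belief; its content is either a plain proposition (a string)
    or another Belief, which is what makes the structure recursive."""
    agent: str
    content: Union["Belief", str]


def order(b: Belief) -> int:
    """Count the nested levels of 'believes that'."""
    return 1 + (order(b.content) if isinstance(b.content, Belief) else 0)


# First order: Country Y believes that Country X is hostile.
first = Belief("Y", "X is hostile")

# Second order: Country X believes that Country Y believes that X is hostile;
# the kind of belief X might act on when guessing whether Y will strike first.
second = Belief("X", first)

print(order(first), order(second))  # 1 2
```

The recursion bottoms out only when we decide to stop nesting, which is exactly the tractability problem the author is pointing to.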

“Men should be what they seem”

Perhaps the best way to understand the problem is to look at what we need in order to understand a relatively simple story. Following one writer on childhood cognition and learning, let’s take Shakespeare’s Othello as an example. We all “know” the plot of Othello — a virtuous soldier (Othello) is manipulated into killing his beloved wife Desdemona by a jealous and scheming false friend (Iago) and his accomplices. But what do we need to know in order to “know” the narrative? Much more than we think, as it turns out. Othello, as a play, is multilayered in the narrative demands it places on the audience. At one point, the traitorous Iago says to Othello, “men should be what they seem,” an ironic statement given that Iago is a liar attempting to manipulate a friend into seeing him as trustworthy.

Even in this simple example, we see varying orders of belief, such as what Othello and Iago believe and don’t believe and what we, the audience, believe and don’t believe. In general, with the play’s narrative we can begin at the lowest order of belief and count each time we need another level of belief to understand it. The reader will notice that everything from second-order beliefs to fifth-order beliefs in the example involves understanding, at a minimum, a character’s thoughts about another character’s thoughts. It is the fact that men are not “what they seem” that makes such complex cognitive models, and models of models, necessary.
Desdemona is made to appear to Othello as deceptive and traitorous (first-order belief).
Othello’s enemies, who Othello mistakenly believes to be his friends, have planted these false beliefs (second- and third-order beliefs).
The audience knows the truth, understands the plot against Othello by his false friends, and can visualize the conspirators’ mindsets (fourth- and fifth-order beliefs).
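One way to make this layering concrete is to encode each vantage point as a nested (believer, content) pair and count the layers, as in the rough sketch below. The encoding and the depth counting are my own illustrative assumptions and only approximate the orders listed above; the takeaway is just that the audience has to keep a deeper stack of minds in view than any single character does.

```python
# Rough, illustrative sketch: each layer is a (believer, content) pair, where
# content is either a plain proposition or another such pair. The helper simply
# counts nesting depth; the mapping to the "orders" above is approximate.

def levels(view) -> int:
    """Number of nested (believer, content) layers; a bare proposition is zero."""
    return 1 + levels(view[1]) if isinstance(view, tuple) else 0


# Othello's planted, false belief about Desdemona.
othello_view = ("Othello", "Desdemona has betrayed me")

# Iago's manipulation: Iago intends that Othello hold that false belief.
iago_view = ("Iago", othello_view)

# The audience's vantage point: we track what Iago is doing to Othello's
# beliefs and know the planted belief to be false.
audience_view = ("Audience", iago_view)

for name, view in [("Othello", othello_view),
                   ("Iago", iago_view),
                   ("Audience", audience_view)]:
    print(name, levels(view))  # 1, 2, and 3 layers respectively
```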

The reader will at this point likely note that we, the audience, are not only shown what Othello (and other characters) cannot observe, but also have privileged access to the characters’ beliefs, desires, and intentions to an unrealistic degree. After all, it is understandable that Othello would not see Iago’s villainy when another Shakespearean supervillain, Richard III, infamously lays bare his entire set of diabolical plans to the audience in monologues like his “winter of discontent” speech. His victims, however, can only make crude guesses as to what he is up to.

Conclusion

In any difficult and uncertain situation, it is tempting to believe that we could do better if we had more information and better contextual understanding. The problem is that decision-makers often cannot know beforehand (and much of the time even afterwards) what the likes of Saddam Hussein were “really” thinking. Reasoning about other people’s reasoning also carries nontrivial risk. Once analysts begin to get into beliefs, beliefs about beliefs, and beliefs about beliefs about beliefs, they may find themselves losing their own grounding in reality. To return to the Othello example, if all of the world is a stage, the complexity of reasoning about it should not be underestimated.

Given the complexity of many social situations, one wonders how people are able to avoid Hamlet-like indecision as they ponder the implications of implications of implications. This complexity is also why we ought to be very cautious about the idea that the problem with American strategy is that it is not textured and layered enough. Rather, the crux of the problem often lies in the difficult choice of when to stop adding factors and when to keep looking for additional ones in analyzing the enemy. Men should be what they seem, but much of the time they aren’t. So how do we figure out what they are?

What more human intelligence, historical context, and cultural expertise can do is keep us focused on what matters. If all of the world is a stage, cultural and regional expertise can help us identify the important players, plots, and subplots. That won’t necessarily make our guesses and conjectures about what the enemy is thinking any less flawed or crude. But it can help discipline them and make them more useful to policymakers.

In sum, we may never be able to read the minds of our adversaries. However, strategy nonetheless demands that we try.

Adam Elkus is a PhD student in Computational Social Science at George Mason University and a columnist at War on the Rocks. He has published articles on defense, international security, and technology at CTOVision, The Atlantic, the West Point Combating Terrorism Center’s Sentinel, and Foreign Policy.
