Beatrice Heuser
In security policy- and strategy-making, as in all decision-making, decisions can be tainted by false assumptions about an adversary's (ir)rationality or (il)logic, and by equally false assumptions about our own rationality and logic. Beatrice Heuser reflects on the biases and pitfalls of public policymaking, and offers some considerations for positive change.
All policymaking follows – or should follow – an initial, insightful and comprehensive analysis of a situation that requires decisions to be made. Such analyses are often marred by false assumptions about the situation, its causes or the reasoning of other actors. Even the best analysts cannot help but perceive a situation, an action or events as they unfold through the prism of their own assumptions and biases. In the context of security policy- and strategy-making, these often include false assumptions about an adversary's (ir)rationality or (il)logic. We also tend to see ourselves as completely rational and logical, yet our own irrational biases get in the way of good analysis, and then of good decision-making. While some biases may not be noxious, what we consider in this essay are biases that prevent good foreign- and security-policy analysis by skewing our view of what is happening and/or what options are available for responding appropriately.[1] Especially when it comes to identifying what is happening in another government, numerous biases keep us from seeing clearly what is going on – quite apart from what that government tries to hide. Taking cues from psychologists, we can draw up a list of such noxious biases found in public policymaking.
Chickens, Turkeys and Swans
Picturesquely, some biases have been illustrated by references to animals. One is the unwarranted optimism that something that has not gone wrong for many years will continue not to go wrong in the future. Analysts may suffer from excessive optimism, like new settlers on a fertile slope that turns out to be the side of a volcano, or in a green valley that is in fact the flood plain of a river prone to burst its banks in a wet year. Disaster may or may not strike during an analyst's time in a particular job, but in other contexts such a wager is even more dangerous. The British philosopher Bertrand Russell warned against the fallacy that a good pattern will hold forever by invoking a chicken that believes its farmer-owner to be benevolent because he has fed it regularly over the months – a metaphor translated by Nassim Nicholas Taleb into that of a trusting turkey.[2] But the day of slaughter will come, and in some circumstances its timing can be predicted – in the case of the Thanksgiving or Christmas slaughter of poultry, with considerable accuracy.