
10 September 2022

Ambiguous Ethical Situations and the Letter “A”

Daniel Pace

As a leader in the Special Forces, I frequently chew on how my organization makes ethical decisions, particularly when we are working in morally uncertain environments. What concerns me most is the potential disparity between what I think is ethical and what the folks executing my guidance think is ethical when I am not around. Units I have served in have tried to address this issue through large auditorium briefings from the JAG or Chaplain. Most of us on the ground-pounding side of the Army aren’t a very theoretical lot, so briefings on Just War Theory or the Hague Convention tend to produce dozing audiences, and the question-and-answer sessions at the end frequently settle on “you’ll know it when you see it” as the answer to the ever-present question: “how will I know if what I’m doing is immoral or illegal?” Unfortunately, the way I see it, the way the 15-6 officer sees it, and the way the guy who took the action sees it rarely line up, resulting in undesirable consequences for everyone involved.

While chewing on this problem and thinking about how to improve moral agency in my unit, it occurred to me that the issue at the unit level isn’t necessarily that I need to improve the quality of my troops’ moral education, but that I need to ensure we have a similar enough understanding of what moral and immoral decisions look like that I can trust them to execute on my behalf. For operational purposes, the disparity in our opinions matters more than the specifics of either of our interpretations.

We are probably never going to bring most of the folks around to reading Aristotle or Kant, but we can at least get the majority of the unit on the same page by building an analogy off our favorite ethical answer (“you’ll know it when you see it”) and by making a few changes to the way we train.

Let’s start with the analogy. Take the letter “A” depicted below in figure 1.

[Figure 1: a standard, clearly formed letter “A”]

Aside from a few philosophers, nobody is going to debate that the figure above is an “A.” How do people know it’s an “A”? Well, they know it because they see it. They have an entire lifetime of cumulative experience that tells them this is an “A.” It’s something we can all agree on.

Now let’s tweak the “A” just a bit, like this:

[Figures 2-4: the same “A” progressively distorted, each version a bit further from the original]

Is figure 4 still an “A”? If this figure were depicted alone (rather than in a series of “A”s), would you still think it was an “A”? Sometimes our brains can trick us on these issues, and things that don’t look like “A”s at all can seem very “A”-like if they’re surrounded by other, more well-defined “A”s. What if we applied the old team-room ethical standard to this “A”? If you saw this “A” standing tall on the cover of the New York Times, would it look like an “A” to you?

Let’s take one more figure:

[Figure 5: a shape that bears little resemblance to an “A”]

Now certainly, figure 5 is not an “A,” right? We can all agree that, aside from any “A”-ness it derives from its companions in this article, this figure has no business being called the letter “A.”

Discussions on morality are frequently similar to the above line of reasoning. Getting a consensus on the extreme ends of moral decision making is often easy, and disagreements rarely arise about blatantly moral or immoral actions. The trick is ensuring everyone is on the same page with the decisions in the middle. Moral philosophers have been trying to solve that problem for thousands of years, so it is unlikely this paper will do so. Opinions and perspectives are too diverse, and people do not agree on whether there is an objective answer to the problem at all, much less what that answer might be.

It is, however, possible to reduce the disparity in a smaller population, let’s say a military unit getting ready to deploy to combat, and reducing that disparity will significantly improve the unit’s ability to operate effectively by building trust between commanders and subordinates, reducing the need for disruptive investigations, and reducing the likelihood of what an old boss referred to as “catastrophic losses in the information domain.”

The key to reducing this disparity is incorporating moral decision making into military training. We do a bit of it now, particularly during large-venue events like the ones provided by the national training centers, but by the time the unit is at JRTC/NTC/CTC, a serious disparity in ethical decision making is just going to lead to someone getting fired. Too many people are watching, and too much is on the line professionally, for people to behave as they would when nobody is looking.

Units need to incorporate moral scenarios much earlier and more often to generate the desired results. Training events need to be morally (and not just tactically) challenging to ensure Commanders know which of their subordinate leaders see the same “A”s they do, and which ones need a bit more attention to get on the same page. JAGs, the de facto ethicists on most staffs since we have largely replaced “should I do this?” with “can I legally do this?” in the regiment, must be heavily incorporated as well to reduce the seams between the future investigators and the future investigated[1]. Finally, these events should be as free-play and risk-free as possible to encourage people to make the decisions they would actually make, rather than the decisions they think the boss wants to see. Commanders can work through ethical disparity in subordinates, but they must be aware of it to do so effectively.

By taking the above steps, units can significantly reduce the operational risk presented by ethical disparity in the organization. Disciplinary action can be reduced, and unintentional illegal or immoral activity can be averted. During pre-deployment training, when some crusty Master Sergeant stands up and tells the new guy that “he’ll know it when he sees it” about some random ethical question, leaders can nod confidently with the knowledge that what he really means is “we’ll know it when we see it.”
