
23 April 2017

Psyched Out: Using Narrative Power to Exploit Cognitive Flaws

Jon Herrmann

Somewhere, likely in many nations, adversaries are training to target the psychological vulnerabilities of American leaders. This article offers a possible structure that adversaries could use to exploit cognitive flaws and American cultural effects on decision-making. It is intended as a lens through which to view information, with an eye to better understanding the potential effects of information and information operations (IO) in traditional conflict, unconventional warfare, and civil-military operations. Readers are welcome to use this outline to evaluate whether current events suggest adversaries are using something like this model to promote decisions that are bad for American interests but good for America’s adversaries.

One caveat, generalizability, requires mention. This article’s applicability rests in its focus on process, so the rubric can be used in many situations. This work follows the advice of a respected international relations expert: “If the focus remains on the similarities of process, rather than outcome, and sufficient attention is paid to circumstances that make the relevant conditions more similar than different, generalizability becomes possible.”[1]

The intent to exploit cognitive flaws to manipulate leaders’ decisions is not new. What may be new is the power of old narratives based on new neuroscience. Advances in neuroscience allow psychologists to examine the brain at work, and the modern innovation is the increasing acceptance of biologically-based cognitive flaws outside psychology. Rational-choice theorists in many fields have been slow to accept the importance of cognitive flaws; even today, many studies of international relations assume that a nation is a rational, unitary actor. But what if decisions are made by leaders, rather than nations? Foreign leaders have been targeted by IO,[2] which implies that American leaders are also targets. Would IO campaigns target U.S. leaders? And would understanding cognitive flaws empower those adversary campaigns?

Exploiting cognitive flaws by using a loss aversion/cognitive ease (LACE) model may empower operations that enable strategic goals. A basic definition of a strategic narrative is “a means for political actors to construct a shared meaning of international politics and to shape the perceptions, beliefs, and behaviour of domestic and international actors.”[3] (Emphasis added) Whether an action is defensive or offensive is often a matter of perspective. Persuading a potential adversary to see an action as defensive, and therefore remain passive, is an example of shared meaning created by a narrative. Examples from Vietnam to Somalia demonstrate that adversaries don’t need to defeat the U.S. in the field. To win without fighting, a combatant targets an adversary’s will (to paraphrase Sun Tzu). “War is messy and at its heart is about deterring, and, ultimately, destroying one’s opponent’s political will and ability to wage warfare – directly (militarily) or indirectly (politically, legally, economically, or in terms of information).”[4]

Below is a hypothetical step-by-step model adversaries might be using against U.S. leaders:

Step 1: Believe in the Power of Information

1a: Americans are historically reluctant to intervene in foreign affairs.

A narrative may dissuade American leaders from intervention in foreign conflicts. Americans may be reluctant to intervene in overseas conflicts,[5] based in part on a tradition of nonintervention. That reluctance bolsters a common feeling of ambivalence. An ambivalent person often makes the easy choice, particularly if the decision is seen as unimportant. Completely well-informed, rational decisions are rare, and the likelihood of an attentive decision is lower still when such a decision requires significant effort. Rather than expend effort, most people follow an existing process without deep consideration. This gives an adversary seeking to dissuade American leaders from military intervention an immediate edge.

1b: Americans, like most people, hold opposing views on most issues.

Further, people typically hold opposing views on most subjects, and military intervention is a complex subject, easy to support under some conditions and oppose under others. Zaller and Feldman’s “ambivalence axiom”[6] “…means that most people hold opposing views on most issues, and these views can push them to decide a particular issue in either way.” Ambivalence sets the conditions for a narrative to sway a leader.

That’s especially true if a leader doesn’t see an issue as critical (as defined by the leader). Krosnick notes that attitudes about salient issues are steadier over time.[7] This implies that leaders are harder to persuade on core issues than on peripheral issues. If a leader sees intervention as peripheral, then the leader is likely more flexible on intervention.

1c: How an issue is presented can be powerful.

Framing, the way one presents an issue, makes a difference that an IO campaign can exploit. Different presentations of a question can change decisions. In a classic example, cancer patients whose doctors offered treatment options changed their decisions based on small alterations in treatment descriptions. Treatments described by survival rates were more often accepted than the same treatments described by mortality rates, even with identical facts. Framing changed life-or-death decisions.[8] An IO campaign to dissuade leaders from military intervention could use framing by highlighting potential deaths in a conflict. This is particularly powerful in light of loss aversion (see below).
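
To make the framing equivalence explicit, a worked restatement may help (the numbers here are illustrative of the frame pairing, not figures reported from the study):

$$P(\text{survive}) = 1 - P(\text{die}), \qquad \text{so a ``90\% survival'' frame and a ``10\% mortality'' frame describe the identical treatment.}$$

Only the reference point changes: one frame counts lives kept, the other counts lives lost, and loss aversion makes the second loom larger.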

Framing is strongest when an event starts to receive notice. The greater effectiveness of early framing benefits aggressors, because the aggressor knows an event is coming. The aggressor can therefore ready a persuasive frame for early release. “In pre-conflict scenarios, the potential impact is highest through influencing perceptions, informing decision-making, and mobilizing internal and external support. (…) In this time-frame many audiences are impressionable… and have not yet seriously started to question attribution and content in any major way.”[9]

To make the most of the power of information, one simple set of tools might focus on the LACE model: Loss Aversion and Cognitive Ease.

Step 2: Make the Wrong Choice Easy for American Leaders (Cognitive Ease)

2a: Leaders, like everyone else, usually seek the easiest acceptable answer, not the best answer.

Nobel laureate psychologist Daniel Kahneman noted that “A general ‘law of least effort’ applies to cognitive as well as physical exertion. The law asserts that if there are several ways of achieving the same goal, people will eventually gravitate to the least demanding course of action.”[10] Leaders, under time pressure and frequently exhausted by weighty decisions, seem susceptible to this law. This is especially true when leaders are hungry[11], and long hours without meals are all too common for senior leaders.

2b: A strong narrative can present a poor choice as an easy, acceptable answer.

A strong narrative could exploit these flaws. Strong narratives seem cognitively easy; rational choices are not easy. Rational choice theories present decisions as a sequential process of gathering information, rating options, and deciding. A variant on rational choice, bounded rationality (or “satisficing”), is a similar process, with less information and fewer options.[12] “…Simon’s familiar notion of satisficing, or bounded rationality, where a person is likely to stop searching for additional information or choices once she has found an acceptable option.”[13] People give decisions with greater consequences more effort, and presumably make better decisions.

More recent work on intuitive decision-making, particularly under stress, demonstrates that decision makers don’t typically act rationally. Instead, they consider options sequentially and select the first workable option, not the rationally best option.[14] Since considering military intervention is stressful, the sequential model seems more likely to apply. A narrative may exploit intuitive decision-making, especially when waiting appears acceptable.
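
A minimal sketch may clarify the distinction (illustrative code only; the option names, payoffs, and “acceptable” threshold are invented for this example, not drawn from Simon’s or Klein’s work):

```python
from typing import Callable, Iterable, Optional, TypeVar

T = TypeVar("T")

def rational_choice(options: Iterable[T], utility: Callable[[T], float]) -> T:
    """Classic rational choice: score every option and return the best."""
    return max(options, key=utility)

def satisfice(options: Iterable[T], acceptable: Callable[[T], bool]) -> Optional[T]:
    """Bounded rationality (satisficing): take options in the order they are
    encountered and return the first one that clears the 'good enough' bar."""
    for option in options:
        if acceptable(option):
            return option
    return None  # no acceptable option was found

# Toy example: the order in which options come to mind is exactly what
# an adversary's narrative can try to shape.
payoff = {"wait for clarity": 0.2, "limited strikes": 0.6, "full intervention": 0.5}
options = ["wait for clarity", "limited strikes", "full intervention"]

print(rational_choice(options, payoff.__getitem__))    # -> limited strikes
print(satisfice(options, lambda o: payoff[o] >= 0.2))  # -> wait for clarity
```

The toy run shows the leverage point: exhaustive evaluation picks the highest-payoff option regardless of ordering, while satisficing returns whatever acceptable option comes first. An adversary who can shape which option comes to mind first, or where the “acceptable” bar sits, can steer a satisficing decision maker without ever winning the argument on the merits.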

2c: Inaction is easier than action.

Actions that might create regret generate emotional uneasiness that people unconsciously seek to avoid. Framing experiments show that people regret a bad outcome more when they see it as the result of an action than of inaction.[15] Therefore, the bias towards waiting increases the power of a narrative promoting inaction.

2d: Following Standard Operating Procedures (SOP’s) is easier than critical thinking.

One common cognitive flaw is a tendency to rely on standard operating procedures (SOP’s), such as an SOP to wait for clarity. This may be operationally (or politically) sound thinking. “Because adherence to standard operating procedures is difficult to second-guess, decision makers who expect to have their decisions scrutinized with hindsight are driven to bureaucratic solutions- and to extreme reluctance to take risks.”[16] A narrative could exploit this tendency by creating a damned-if-you-do scenario. On one hand, a rapid intervention is politically risky and may be a flawed decision, riddled with biases and heuristics. On the other hand, taking too long to decide expands the “Observe-Orient-Decide-Act” cycle, or “OODA loop,”[17] and may present the leader with a fait accompli. “In the absence of reliable information, a state or organisation cannot act in an appropriate manner, leaving it behind the curve (or with a slower OODA loop).”[18] Following an SOP offers the alluring combination of potentially wise practice, political protection, and cognitive ease.

2e: Repetition creates cognitive ease in more than just SOP’s.

Another reason SOP’s are attractive is the power of repetition. We tend to reuse decisions that have worked in the past. Any information heard repeatedly can influence us, especially if its source is forgotten. “If you cannot remember the source of a statement, and have no way to relate it to other things you know, you have no option but to go with the sense of cognitive ease.”[19] The feeling that we’ve heard something before generates cognitive ease; hence the propaganda dictum that any lie, told often enough, becomes the truth.[20] Hearing what we have heard before leads us to accept it more readily. Put another way, “A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth.”[21]

2f: When other issues distract leaders, easy answers are even more readily accepted.

The human mind can be overtaxed, with few items filling working memory.[22] Kahneman notes that overwork creates a form of option-blindness, stating that “Intense focusing on a task can make people effectively blind, even to stimuli that normally attract attention.”[23] In one example, titled “The Invisible Gorilla,” psychologists Chabris and Simons demonstrated that mentally overtaxed observers failed to notice obvious phenomena right in front of them because they were focused on something else.[24] Similarly, if leaders are engaged thinking about another problem when a decision arises, they are more likely to take the cognitively easy answer, especially if that answer is low-risk, such as inaction.

2g: Leaders decide based on what they remember (availability heuristic; accessibility axiom).

When asked to make judgments, people often try to remember similar past situations. An adversary seeking to shape policy might plan to make key memories more readily available. If successful, decisions would be more likely based on the adversary’s preferred memories. “(T)he accessibility axiom states that accessibility (what is most available to people in their minds at the time they are asked) depends on what the person has most recently thought about.”[25]

For example, if an adversary wants to deter American intervention, then the adversary’s messages may revisit failed prior interventions, such as Vietnam or Mogadishu. Highlighting failed interventions can make those memories more available, decreasing the cognitive effort of comparisons to bad interventions, while increasing the effort required to think of analogies to the Gulf War or the Munich Conference.[26]

2h: People find it easier to remember negative things when their mood is negative.

Repetition makes ideas more memorable. Even indirect references may subconsciously affect decisions.[27] By adding other factors to repetition, an adversary can make concepts even more memorable. Memory is tied to emotion, such that happy memories are easier to recall when we’re happy, and memories of fear are easier to recall when we’re fearful. By using this effect (mood-congruent memory[28]), an adversary can use a memory that triggers loss aversion (e.g., Vietnam) to generate anxiety. Anxiety then makes other memories from anxious situations more accessible. This “availability cascade” can be further reinforced by “echo chamber” or “groupthink” effects of associating mainly with like-minded people holding similar opinions (in person or online).

Step 3: Make the Wrong Choice Look Wise for American Leaders (Loss Aversion)

Taking the information operation a step further, an adversary may seek to make the adversary’s preferred option seem both easy and wise. One way to do so is to use loss aversion to make intervention appear to be a poor choice. Loss aversion is the tendency for people to strongly prefer avoiding losses over acquiring equivalent gains.[29] Using this principle, a credible threat of harm has an edge. To balance that threat, rewards would have to be approximately twice as desirable for equal motivation, a difficult standard for intervention.
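
Prospect theory, from the Tversky and Kahneman work cited above, formalizes this asymmetry. A standard form of the value function, with commonly cited parameter estimates (the equation and values come from the broader prospect theory literature rather than from this article), is:

$$v(x) = \begin{cases} x^{\alpha}, & x \ge 0 \\ -\lambda(-x)^{\alpha}, & x < 0 \end{cases} \qquad \alpha \approx 0.88, \quad \lambda \approx 2.25$$

Because the loss-aversion coefficient $\lambda$ is roughly two, a prospective loss must be offset by a gain about twice as large before a choice feels neutral: the “approximately twice as desirable” standard noted above.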

3a: Inaction seems more acceptable when positioned between more extreme options.

An adversary can also present more extreme options to frame inaction as reasonable. Simonson and Tversky demonstrated that most people possess an aversion to extreme options.[30] By adding more extreme alternatives, an adviser can guide a leader towards a middle option that, without those additions, would have appeared unacceptable.[31] If an adversary suggests that the U.S. support the adversary’s action, American leaders can more readily accept inaction as the “middle ground” between support for and opposition to that action.

A narrative might include a vague threat of significant casualties through conventional war, extended unconventional war, or even a nuclear exchange. Even military staffers know this tactic. Pillar notes, “There is a time-honored technique, familiar to veterans of policymaking in the U.S. Government, for ostensibly giving the boss a choice of options but in effect pre-cooking the decision. That is to present three options, which can be aligned along a continuum of cost or risk or whatever, and to list the middle one as the one that the option-preparers want to have chosen. Often this option is indeed chosen; as presented, it appears to be the most balanced and reasonable one, avoiding excesses of the alternatives on either side. But the appearance is an artifact of how the issue and the choices are framed. The whole framework may be skewed…. Similar dynamics apply not just to manipulation of options papers but also to public debate about foreign policy…. A commonly felt sense of what is extreme and what is reasonable may derive mostly from the framing….”[32]

3b: Vivid presentation of retaliation as likely and severe, yet unpredictable in its specifics, can make the negative extreme option more credible and psychologically effective.

Three added elements can make the extreme option even less acceptable: certain retaliation, uncertainty, and vivid examples.

Presenting the appearance of a fanatical adversary population makes retaliation seem very likely, supporting the extreme option. (Many adversaries present some adherents as so fanatical that adversary leaders must struggle to prevent them from targeting the U.S. This perception allows adversaries to appear helpful while doing nothing, claiming that they are doing their best to keep fanatics from attacking.) Deterrence is made effective by the certainty and severity of punishment. An adversary can create the appearance of a group seeking justification to attack America (physically, electronically, economically, or informationally). If this appearance is persuasive, the adversary bolsters deterrence by making retaliation seem certain. Social media increases this capability. Leaders now rely on social media to gather information, especially in dangerous areas with little traditional media. Social media manipulation can create the credible appearance of a large, fanatical group to generate fear of certain retaliation.

A variant of this tactic seeks the best of both worlds from an adversary’s perspective. To maximize loss aversion, an adversary wants U.S. leaders to envision sure, brutal retaliation for any U.S. intervention. But to minimize possible gains (worsening the cost-benefit analysis from the U.S. perspective), the adversary also wants to make their action seem irrelevant to most Americans. One tactic might be to present risks (when possible) through channels observed by American leaders considering intervention. If a story makes headlines, then American leaders may see it as important, warranting intervention. If it is not in headlines, but still seen by key leaders in vivid detail, then the adversary can both minimize the intervention incentive from the public, and raise the expected danger to the leadership. An adversary might communicate through lower-readership channels like the comments section of professional military or foreign affairs journals, for example. Combining this tactic with repetition can create a “conventional wisdom” that an adversary’s action is of little interest to America, but intervention would be extremely hazardous.

To make the extreme option even more powerful, an adversary can make it more vivid. Clear, vivid images and stories are more compelling than dry statistics. Cognitive ease plays into this aspect, as well. It’s easy to envision a vivid image, so the threat of death accompanied by a graphic video of a beheading, or reference to a brutal long-term ground war or nuclear exchange, generates a picture that is difficult to overcome. Statistics, like probabilities of success in briefings or spreadsheets, are comparatively weak. “(C)ertain events may be simply easier to imagine than other events, and yet this ease of imaginability may have little to do with actual frequency. Events that have already occurred, for example, are much easier to imagine than events that have never occurred. For this reason, it remains easier to plan for the last terrorist attack, for example, than the next one. Similarly, it is easier to imagine what a conventional war would look like than to imagine the full-scale catastrophe of a nuclear war.”[33]

People are generally uncomfortable with uncertainty, so a vague threat that creates uncertainty may be more effective than a clearer threat. A clear threat can be analyzed, while a vague threat is harder to analyze. Further, leaders (seeking cognitive ease) will be reluctant to try to grapple with a vague threat. They may unconsciously avoid the problem of analyzing the threat by avoiding the action that would precipitate the threatened response.

Leaders already distracted with many other problems requiring mental effort are at risk of believing this threatening scenario. “(When critical thinking) is otherwise engaged, we will believe almost anything.”[34] Distraction can also come in the form of legitimacy questions, such as “What gives the U.S. intervention authority, anyway?” Distraction may also manifest as too many options, particularly if several draw support on social media, creating the appearance of a confusing groundswell for a variety of (often mutually-exclusive) policy choices.[35] In the face of such confusion, the adversary presents a reasonable narrative, easy to believe and risky to disbelieve. Disbelief also requires effort, which is contrary to cognitive ease. The easy, safe decision therefore is typically inaction.

Step 4: Make the Wrong Choice the Lasting or Repeated Choice

Once the initial decision is made against intervention, it becomes difficult to intervene in similar future situations for several reasons. First, keep in mind that the previously noted factors are likely still in play, if not more potent.

Taken together, these are significant, but once the initial decision has been made, added factors appear: difficulty explaining change, difficulty accepting change when contrary to existing beliefs, and cognitive rigidity under stress, making any change more difficult.

First among these is the challenge of explaining a change if a decision to intervene comes under consideration. Leaders seeking cognitive ease would be hard pressed to justify why this situation merits intervention when prior, similar situations did not. Expending the mental effort (and political capital) to justify this change of position is likely not palatable unless the new situation is obviously critical.

Therefore, a leader may be motivated to not see a good reason to intervene (because intervention would be difficult and dangerous). Further, the leader would be supported by an existing coalition that has also already chosen not to intervene. When there is little motivation to examine the case for intervention, it’s unlikely leaders would consider that case. Kahneman notes that even intelligent people fail at basic intelligence-related tests when they lack motivation.[36] Put more succinctly, “It’s very difficult to make a man understand something when his livelihood depends on him not understanding it.”[37] The inaction norm is reinforced if “everyone” knows the risks. “We know that people can maintain an unshakable faith in any proposition, however absurd, when they are sustained by a community of like-minded believers.”[38] If one considers political leaders to be single-minded seekers of reelection,[39] then their focus is on winning the next election. If intervention might endanger their political survival, it would be unlikely that political leaders would support intervention even if military leaders advocated for it.

Beyond motivated reasoning, there remains the more basic fact that most people do not re-examine their prior decisions, goals, or the likelihood of achieving those goals. Rather, they continue prior policies and plans. “(L)eaders can make omissions in surveying objectives. Especially in situations where priorities have long been established, decision makers do not always reevaluate their goals, or what the likelihood is of obtaining them, before planning a response.”[40] This is particularly true if leaders established beliefs about prior events as a category, and view new events through the lens of pre-existing beliefs. “(L)eaders often engage in selective processing of information at hand. Decision makers tend to give more weight to information that confirms their preexisting beliefs and attitudes[41]… psychologists showed that new information does not necessarily force individuals to revise their opinions about topics that inspire high degrees of emotional commitment.”[42]

Finally, any change can challenge individuals under stress. “Stress in and of itself leads to a particular kind of cognitive rigidity wherein discrepant new information tends to be rejected, thus restricting creative problem solving…. Because a person’s attention span is narrowed under conditions of stress, an individual under stress cannot focus for long on one thing and often has difficulty considering more than one option at a time. As time horizons become foreshortened, the long-term implications of a crisis might not be sufficiently considered or taken into account.”[43] A senior leader faces difficulty in changing prior decisions, especially those that counter existing beliefs and might require justification.

Conclusion

America’s adversaries want to win, but know that they cannot oppose American forces directly. Indirect strategies, such as persuading U.S. leaders not to intervene, are the new weapon of choice. Concepts like the gray zone and hybrid warfare amply demonstrate the use of IO to influence nonintervention. America’s history and culture can dissuade intervention, but adversaries may also leverage cognitive flaws to create persuasive narratives within strategic IO campaigns. Two key cognitive flaws fall under the LACE model: Loss Aversion and Cognitive Ease.

The LACE model includes capitalizing on ambivalence by framing an event as unimportant to American national interest, particularly compared to other current events. Framing intervention as difficult to analyze and inaction as easy and acceptable (or even wise) undermines the will to act. An adversary can further increase the influence of their narrative by leveraging American SOP’s, repetition, and distraction. The narrative is still further supported by the vivid presentation of certain, brutal opposition to intervention with no clear limit to its duration or type. Such vivid stories also link to memorable instances of past interventions gone bad for greater effect. Adversaries can even craft more extreme options to frame inaction as wise in comparison, particularly if prior inaction caused no apparent harm and changing policy would be costly. Taken together, an adversary IO campaign that used a strong narrative to exploit these cognitive flaws could be a serious challenge to American will to intervene in a foreign conflict.

The means to counter an adversary effort to exploit cognitive flaws are unclear. Would persuasion in support of national will that did not exploit these flaws be sufficient to counter adversary IO targeting these flaws? If not, what practical issues (to say nothing of ethical issues) might be involved in implementing equivalent informational tactics? A host of questions require answers, and this short work can only offer a small fraction of those answers. Whatever work is to be done categorizing (and potentially countering) such information campaigns, understanding a basic structure like the LACE model may be a good place to start.

The views expressed are those of the author and do not reflect the official policy or position of the Department of the Army, Department of the Air Force, Department of Defense, or the United States Government.

End Notes

[1] McDermott, Rose. 2004. Political Psychology in International Relations. Ann Arbor: University of Michigan Press.

[2] Nissen, Thomas Elkjer. 2015. #The Weaponization of Social Media: @Characteristics of Contemporary Conflicts. Copenhagen: Royal Danish Defense College. p. 67, describes the targeting of Georgian president Saakashvili.

[3] Miskimmon, A., et al. “Forging the World: Strategic Narratives and International Relations,” Oct 2011, p. 3, newpolcom.rhul.ac.uk/npcu-blog/2012/1/17/strategic-narratives-working-paper-published.html

[4] Nissen, p. 33

[5] For an example, see the survey research of Trevor Thrall for the Cato Institute regarding American preferences for a non-interventionist foreign policy.

[6] Zaller, J. and S. Feldman. 1992. A Simple Theory of Survey Response: Answering Questions versus Revealing Preferences. American Journal of Political Science 36 (3): 579-616.

[7] Krosnick, J. 1988. Attitude Importance and Attitude Change. Journal of Experimental Social Psychology 24:240-55.

[8] McNeil, B., S. Pauker, H. Sox, and A. Tversky. 1982. On the Elicitation of Preferences for Alternative Therapies. New England Journal of Medicine 306:1259-62.

[9] Nissen, pp. 100-101

[10] Kahneman, D. 2011. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux. p. 35

[11] For more on how hunger can erode willpower and promote poor decisions, see Baumeister, Roy F. et al., “Ego Depletion: Is the Active Self a Limited Resource?”, Journal of Personality and Social Psychology, 1998, Vol. 74, No. 5, 1252-1265; Danziger, S., Levav, J., and Avnaim-Pesso, L., “Extraneous Factors in Judicial Decisions,” Proc. Natl Acad. Sci. USA (2011); and similar research.

[12] Simon, H. 1982. Models of Bounded Rationality. Cambridge: MIT Press. p. 51. See also March, J. and H. Simon. 1958. Organizations. New York: Wiley.

[13] Simon, p. 51

[14] Klein, Gary. 1999. Sources of Power. Cambridge: MIT Press. See also Klein’s The Power of Intuition. NYC: Doubleday. 2004.

[15] Kahneman, p. 348: When gamblers playing blackjack were asked “Do you want to stand?” versus “Do you want to hit?,” their regret was consistently greater if they responded “Yes,” since that was seen as taking an action. “Regardless of the question, saying yes was associated with much more regret than saying no if the outcome was bad.”

[16] Kahneman, p. 204.

[17] The OODA loop (Observe, Orient, Decide, and Act) is a psychological concept adapted to military application by Col. John Boyd. Having a large OODA loop relative to an adversary typically means reacting too slowly to be effective in combat. See Boyd, John R. Destruction and Creation. U.S. Army Command and General Staff College (September 3, 1976) for further information.

[18] Nissen, p. 55

[19] Kahneman, p. 62

[20] Variously attributed to William James, Joseph Goebbels, Adolf Hitler, and V.I. Lenin.

[21] Kahneman, p. 62

[22] Commonly accepted to be 5-9 items per Miller, G.A. (March 1956). "The magical number seven plus or minus two: some limits on our capacity for processing information". Psychological Review. 63 (2): 81–97, but may actually be as low as four, as demonstrated by Cowan, Nelson (2001). "The magical number 4 in short-term memory: A reconsideration of mental storage capacity". Behavioral and Brain Sciences. 24: 87–185.

[23] Kahneman, p. 23

[24] Chabris and Simons, The Invisible Gorilla: How Our Intuitions Deceive Us; Random House, New York, 2009.

[25] McDermott, p. 39

[26] For more on the power of reasoning by analogy as a rationale for leaders’ decisions on war, see Steve Yetiv’s book, Explaining Foreign Policy: U.S. Decision-Making and the Persian Gulf War, JHU Press.

[27] Use of related ideas to prompt desired thoughts can be referred to as “priming.” For extensive examples of priming, see Meyer, D.E.; Schvaneveldt, R.W. (1971). "Facilitation in recognizing pairs of words: Evidence of a dependence between retrieval operations". Journal of Experimental Psychology. 90: 227–234; Vaidya, Chandan L.; Monti, Laura A.; Gabrieli, John D.E.; Tinklenburg, Jared R.; Yesevage, Jerome A. (1999). "Dissociation between two forms of conceptual priming in Alzheimer's disease". Neuropsychology. 13 (4): 516–24; Forster, Kenneth I.; Davis, Chris (1984). "Repetition Priming and Frequency Attenuation". Journal of Experimental Psychology: Learning, Memory and Cognition. 10 (4); and Matsukawa, Junko; Snodgrass, Joan Gay; Doniger, Glen M. (2005). "Conceptual versus perceptual priming in incomplete picture identification". Journal of Psycholinguistic Research. 34 (6).

[28] See McDermott, p. 157, for more on mood-congruent memory

[29] Tversky, Amos, and Daniel Kahneman. “Judgment under Uncertainty: Heuristics and Biases.” Science 185 (1974): 1124-31, and Tversky, Amos, and Daniel Kahneman. “The Framing of Decisions and the Psychology of Choice.” Science 211 (1981): 453-58.

[30] Simonson, I., and A. Tversky. 1992. “Trade-off Contrast and Extremeness Aversion.” Journal of Marketing Research 29:281-95

[31] Simonson and Tversky.

[32] Pillar, Paul R. “Military Force and the Fallacy of the Middle Way,” National Interest Online, 27 Oct 2016, p. 1, nationalinterest.org/blog/paul-pillar/military-force-the-fallacy-the-middle-way-18191

[33] McDermott, p. 65. For more on vivid threats of nuclear war, see Plous, S. 1989. Thinking the Unthinkable: The Effects of Anchoring on Likelihood Estimates of Nuclear War. Journal of Applied Social Psychology 19:670-91.

[34] Kahneman, p. 81

[35] See Nissen, p. 86, for how “trolls” can generate options for distraction purposes.

[36] Kahneman, pp. 45-46 (Note the experiments of Kahneman himself and Shane Frederick)

[37] A paraphrase of a remark commonly attributed to Upton Sinclair: “It is difficult to get a man to understand something, when his salary depends upon his not understanding it.”

[38] Kahneman, p. 217

[39] See Mayhew, David, Congress: The Electoral Connection, Yale University Press, 1974, and Bueno de Mesquita, Bruce, The Dictator’s Handbook, PublicAffairs, New York, 2011.

[40] McDermott, p. 120

[41] Lord, C., L. Ross, and M. Lepper. 1979. Biased Assimilation and Attitude Polarization: The Effects of Prior Theories on Subsequently Considered Evidence. Journal of Personality and Social Psychology 37:2098-2109. Nisbett, R. and L. Ross. 1980. Human Inference: Strategies and Shortcomings of Social Judgment. Englewood Cliffs, NJ: Prentice-Hall.

[42] McDermott, p. 123

[43] Zimbardo, P. and R. Gerrig. 1996. Psychology and Life. 14th Ed. NYC: HarperCollins College Publishers.
