8 September 2017

Banning Encryption to Stop Terrorists: A Worse than Futile Exercise

By Aaron Brantly

Should the US and other nations ban or undermine encrypted online communication tools in response to the use of such technology by terrorists? According to Aaron Brantly, calls for such a move may often follow attacks, but it would be a mistake. For instance, he suggests a ban 1) would not prevent terrorist actors from gaining access to such tools; and 2) would create vulnerabilities in online applications, exposing broader society to increased security risks.

Terrorist groups are increasingly using encryption to plan and coordinate terrorist acts, leading to calls for the banning or backdooring of encrypted messaging apps. This would be misguided. It would not remove the ability of terrorist groups to access this technology and may push them to communicate in ways that are more difficult for Western intelligence agencies to monitor. By creating vulnerabilities in online tools used by a very large number of Americans and other users around the world, it would also expose the broader society to increased security risks.

Calls for backdoors to be placed within popular peer-to-peer messaging applications that use end-to-end encryption, or for these apps to be banned, have become almost a ritual in the wake of terrorist attacks in the West.1 Although there is substantial evidence terrorists are increasingly using encryption in their communications in planning, providing instruction for, and coordinating attacks, these calls are misplaced and would likely make communications intercepts more difficult except in the very short term.

The debate over encryption and terrorism is complex and multifaceted. Yet at its core, the argument revolves around a decentralized and evolving marketplace that has grown substantially over the last 10 years. Prior to 1992, cryptographic technology was subject to the U.S. Department of State’s Munitions List within the International Traffic in Arms Regulations (ITAR). In 1992 and through the remainder of the 1990s, controls on encryption were increasingly reduced and moved from ITAR to the Commerce Department’s Export Control List.2 This shift was pivotal to the expansion of the internet because it made possible secure online commerce and the protection of communications and data. To understand the relationship between encryption and terrorism and the flawed logic of backdoors and banned services requires understanding the need for encryption in everyday life as well as the challenges that would be faced through the restriction of its implementation.

Throughout most of U.S. signals intelligence history, from World War II to the present, the study, development, and code-breaking of encrypted communications was conducted under the highest levels of security.3 The sale or transfer of information related to the development of cryptography was heavily restricted under ITAR, and its move to the Department of Commerce was largely spurred by a confluence of events. Most notably, in 1992 Congress passed the Scientific and Advanced-Technology Act,4 which allowed for public and commercial networks to connect with one another. This act and the increasing rise of commercial internet traffic necessitated the implementation of basic encryption to secure communications and commerce.5 Initial cryptographic implementations in the Netscape browser were differentiated between foreign and domestic software distributions. The intent was to separate and limit the spread of strong cryptography to foreign actors. These two tiers of cryptographic strength established a double standard that hindered international commerce and communications and soon fell by the wayside. Yet these initial expansions of cryptography into the broader internet led to an immediate backlash within the law enforcement and intelligence communities.

In testimony before the Senate Judiciary Committee on July 9, 1997, FBI Director Louis J. Freeh lamented:

“The looming spectre of the widespread use of robust, virtually uncrackable encryption is one of the most difficult problems confronting law enforcement as the next century approaches. At stake are some of our most valuable and reliable investigative techniques, and the public safety of our citizens. We believe that unless a balanced approach to encryption is adopted that includes a viable key management infrastructure, the ability of law enforcement to investigate and sometimes prevent the most serious crimes and terrorism will be severely impaired. Our national security will also be jeopardized.”6

So real was the fear associated with encryption that in 1993 the National Security Agency developed the Clipper Chip, a small computer chip that would have provided a backdoor into telecommunications on user devices, allowing readily accessible decryption under warrant.7 The prototype was never put into operation because a Bell Labs researcher, Matt Blaze, published a paper demonstrating fundamental flaws in the tool that would have resulted in significant security vulnerabilities, allowing for the interception of all traffic and data on the devices by third parties.8 This period, known as the first crypto wars, served as a precursor for today’s often truculent debates on the merits and pitfalls of encryption in modern society.

The terrorist attack by Khalid Masood in London in March 2017 and revelations that he used the popular messaging application WhatsApp prior to the attack prompted calls within the United Kingdom and the United States for “no hiding places for terrorists.”9 These calls were renewed after the bombing by Salman Abedi in Manchester in May 2017 and the attack on London Bridge and Borough Market in June 2017, after which British Prime Minister Theresa May stated, “We need to work with allied democratic governments to reach international agreements that regulate cyberspace to prevent the spread of extremist and terrorism planning.”10 In August 2017, British Home Secretary Amber Rudd called on companies to provide what, in effect, would be backdoors in messaging apps. “We want [technology companies] to work more closely with us on end-to-end encryption, so that where there is particular need, where there is targeted need, under warrant, they share more information with us so that we can access it,” she said.11 a

There is no doubt that the wide commercial availability of end-to-end encryption has provided terrorist groups with a powerful tool to plan and coordinate attacks in real time. It allowed the Paris and Brussels cell, for example, to enter into extensive contact with Islamic State operatives in Syria during the planning phases of their attacks. That being said, there has been a tendency to scapegoat encryption after attacks in which encrypted messaging apps were not, as far as is known, used in attack planning (for example, the San Bernardino plot) or were not used uniformly by plotters in the lead-up to an attack (the Paris attacks).b In fact, there is no significant evidence that any of the attacks perpetrated against the U.S. homeland would have been prevented had encryption been weakened, while there were opportunities to identify the cells behind some European attacks without backdoors in encrypted messaging.

Even though end-to-end encryption is proving to be a powerful new tool for terrorist groups, calls to weaken encryption or provide backdoors into it rest on two assumptions: that the market for encrypted messaging applications is closed, and that weakened encryption is in the best national security interest of the nation. Both assumptions are erroneous. Understanding why providing backdoors into encrypted messaging applications would fail to stop terrorists requires breaking apart these two assumptions.

The Communications Applications Market

The marketplace for messaging applications is diverse.c A basic Google search of messaging applications reveals more than 180 different applications spread across dozens of platforms. Thirty-one of these applications are distributed under general public licenses (GPL), meaning they are free and their source code may be copied and/or modified.

Put another way, the code behind end-to-end encryption is (and will likely always be) available to terrorist groups. Robert Graham previously discussed the difficulties of preventing innovation and adaptation of source code by terrorist organizations in this publication; his insights remain pertinent and problematic for those who seek to regulate encryption.12 In a 2014 report on the state of al-Qa`ida encryption, one group of tech analysts noted at minimum six iterations of encrypted communications platforms developed by al-Qa`ida and its affiliates.13 The development of ad hoc communications platforms or the co-optation of code from other platforms remains an ever-present problem and one that is likely to challenge efforts to implement backdoors or weaken encryption in any given platform.
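To illustrate how low the barrier is, the following is a minimal sketch, assuming the freely available open-source PyNaCl library (Python bindings to libsodium, one of many such libraries); it shows that the public-key authenticated encryption underlying many end-to-end encrypted messengers amounts to a handful of lines of code:

# Illustrative sketch using the open-source PyNaCl library (installable via `pip install pynacl`).
# It performs public-key authenticated encryption of the kind that underlies many
# end-to-end encrypted messengers.
from nacl.public import PrivateKey, Box

# Each party generates a key pair; only public keys need to be exchanged.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts a message that only Bob can decrypt.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at the usual place")

# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_key, alice_key.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"meet at the usual place"

Because libraries of this kind are published openly and mirrored widely, a ban or backdoor mandate imposed on any particular application would not remove the underlying capability.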

It should also be noted that many of the messaging applications that feature end-to-end encryption are not developed in the United States and would therefore not be subject to U.S. law. Although the United States could legally force Google Play and the App Store d to block the downloading of applications that do not conform to U.S. law, this is easily circumvented on phones using Google’s Android system. Programs such as the Berlin-headquartered Telegram can be downloaded onto Android phones directly from the developers’ websites as APKs e because Android’s entire concept is based on publicly sharing code to create compatibility. This can be done via VPNs f or proxy networks, g making it difficult to prevent.

The open nature of the operating systems on which these platforms run complicates any effort to mandate backdoors in encrypted messaging programs. Open platforms such as Android are largely non-excludable, meaning they allow users to download material from any website using compatible code, and any law or policy that restricts or weakens encryption in installed applications could be circumvented through the creation of new programs outside U.S. jurisdictions. U.S. regulators could, in theory, try to shut down rogue websites offering backdoor-less, end-to-end encryption, but this would likely result in a fruitless game of whack-a-mole across the dark web. Even forcing Google to provide backdoors within Android (the base operating system) or to restrict the open nature of its Android platform would not solve the problem, as the code for Android is already publicly available and modifiable.14 In other words, users could simply download a non-approved version of the Android operating system to run on their phones.

Generating a backdoor into today’s leading chat application should therefore not be seen as a long-term solution. Rather, all evidence suggests that terrorists would quickly shift to new platforms or innovate and create their own.

While backdoors would provide only a very short-term gain in combating terrorism, there would be significant long-term costs for U.S. companies. Google prides itself on providing a readily modifiable software platform that allows for users and companies to customize their experience. If this customizable experience were curtailed, it would close off portions of Google’s market. Mandating laws or policies on encryption would likely cause a shift in application and operating system markets by individuals who value their privacy as well as a shift by terrorists and others away from those products believed to be penetrated by intelligence and law enforcement agencies. This adaptation to market conditions is not hypothetical but is borne out in discussions within Islamic State and al-Qa`ida chatrooms and forums.15

A market shift away from U.S. companies would result in billions in lost revenues and undermine many of the core technical communities at the heart of the modern digital economy. Central to the digital economy is trust, and revelations about U.S. espionage severely degrade trust in American companies online.16 Financial damage done to U.S. firms is also not hypothetical as demonstrated by the losses suffered by American firms following the Edward Snowden leaks.17 These losses are doubly damaging as they open the door to foreign competitors to step in where U.S. firms are unable to compete due to regulation or revelations of close coordination with the U.S. government. They also potentially expose users to foreign products that might lack the same privacy and security mechanisms commonly included in U.S. products.18

A more effective approach than creating backdoors would be to intensify current efforts to intercept terrorist communications. Some tools offering end-to-end encryption can be cracked through a variety of means, including supercomputers. More practically, the targeted interception of communications of specific subjects of interest has been demonstrated in numerous instances against multiple types of actors who use robust encryption and digital operational security. Western intelligence agencies have the most sophisticated techniques available, but other actors have also proved able to extract information from the devices of individuals using encryption, including recently in Syria by the pro-Assad regime Group5, which used Android malware, PowerPoint files, and other file types to target opposition groups.19 While the use of end-to-end encryption denies access to intelligence and law enforcement agencies who seek to passively collect large-scale communications in transit, the use of targeted intelligence through the introduction of malware and the exploitation of software vulnerabilities and other weaknesses in both the hardware and software of devices likely leaves ample room for the collection of data on the endpoints of communications, which can be helpful in preventing terrorist attacks.20 In other words, there are powerful tools available to intelligence services to remotely extract messages from smartphones after a particular messaging app (for example, WhatsApp) has decrypted them.

Another argument for not moving toward backdoors is that by fostering close relationships with developer communities and monitoring within the context of acceptable legal parameters, intelligence agencies and law enforcement can keep terrorist adversaries within a more readily monitorable ecosystem.

Innovative techniques to access the content of encrypted messaging apps have been demonstrated by the German Federal Criminal Office (BKA) in instances of countering neo-Nazi groups and jihadi terrorists through the use of custom software tools and the creative manipulation of device registration and login security credentials.21

Even in cases where it is not possible or practical to access the content of conversations, the relevant data associated with those conversations is likely to be accessible. This relevant data includes initial registration information, such as phone numbers, email addresses, or usernames, and metadata associated with the transit of data from source to destination through a subpoenable party within U.S. jurisdictions.22 While these clues might not provide the time and place or plan of attack, they can help a vigilant intelligence and law-enforcement community develop patterns of behavior and network analysis that ultimately may achieve many of the same goals of backdooring encryption without the side effects of platform flight and market disruptions to U.S. firms.23
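As a rough illustration of how far metadata alone can go, the sketch below uses the open-source networkx graph library with invented records and account names; it builds a contact graph from message metadata (no content) and ranks the most connected accounts, a simple starting point for the kind of network analysis described above:

# Hypothetical sketch: building a contact network from message metadata alone
# (no message content), using the open-source networkx graph library.
import networkx as nx

# Invented metadata records: (sender account, receiver account, timestamp).
metadata = [
    ("user_a", "user_b", "2017-05-01T10:02"),
    ("user_a", "user_c", "2017-05-01T10:05"),
    ("user_b", "user_c", "2017-05-02T18:30"),
    ("user_d", "user_a", "2017-05-03T07:15"),
]

graph = nx.Graph()
for sender, receiver, _timestamp in metadata:
    graph.add_edge(sender, receiver)

# Degree centrality highlights the most connected accounts, a simple
# starting point for pattern-of-life and network analysis.
centrality = nx.degree_centrality(graph)
for account, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(account, round(score, 2))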

Lastly, there is a false assumption embedded within calls for backdoors and weakened encryption: that intelligence agencies and law enforcement would suddenly have instant access to that data. If there were only a few targets of interest, this would likely be true. Following a reasonable legal process of seeking a warrant, law enforcement could keep potential terrorists under surveillance. Yet the reality is that the volume of information collected far exceeds the ability of intelligence agencies and law enforcement to track each individual case to the level of detail required to listen to every communication or message transmitted. Additionally, U.S. law limits what the intelligence community, including the NSA, can collect overseas, with the constraints centering on “valid targets” relevant to national security.h

It should also be noted that even with encryption backdoors, terrorist attacks will still get through because not all terrorists use encryption to communicate their plans. There is no evidence the Boston bombers, the San Bernardino attackers, or the Orlando shooter had any communications with any third-party actors or terrorist groups overseas that would have alerted law enforcement to specific plans to attack.

Breaking Encryption and National Security

Jihadi terrorism is a significant threat, yet it is a threat that must be contextualized within the entire scope of the nation’s security needs. Code, the logical construct within which encryption is implemented, is deliberate: it is written to achieve specific goals and objectives. Even without backdoors being mandated by governments, implementing encryption robustly in messaging applications or other programs is difficult and often fails.24 And even the best encryption, poorly implemented, is insufficient to secure communications in digital networks.
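A classic example of that point, sketched below with the open-source Python cryptography package, is AES used in ECB mode: the cipher itself is sound, but the chosen mode of operation leaks the structure of the plaintext, because identical plaintext blocks always encrypt to identical ciphertext blocks:

# Sketch of a classic implementation failure: sound cipher (AES), unsound usage (ECB mode).
# Uses the open-source `cryptography` package (pip install cryptography).
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(32)                      # 256-bit AES key
plaintext = b"ATTACK AT DAWN.." * 4       # four identical 16-byte blocks

encryptor = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
ciphertext = encryptor.update(plaintext) + encryptor.finalize()

# Identical plaintext blocks produce identical ciphertext blocks, so the
# structure of the message is visible even though the key is never exposed.
blocks = [ciphertext[i:i + 16] for i in range(0, len(ciphertext), 16)]
print(len(set(blocks)), "distinct ciphertext block(s) out of", len(blocks))  # prints: 1 ... 4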

From a technical perspective, the addition and manipulation of code necessary to create backdoors undermines security.25 Programs often contain superfluous code that seemingly has no purpose, and it is these areas of inexactitude that malicious parties target. The better a program is written and the more closely it conforms to a secure development lifecycle, the less likely it is to contain weak spots.26

Encryption facilitates a range of social goods that are critical to the functioning of modern society.27 Encryption underpins the modern financial system and makes it possible to engage in secure commerce. It protects medical records,28 personal records,29 corporate secrets, and intellectual property.30 Encryption is critical to securing the communications of private citizens and businesses. Whether we recognize it or not, encryption is quite literally embedded in our daily lives.

Encryption provides several valuable functions that its deliberate manipulation would endanger.31 First, and perhaps least importantly, encryption provides confidentiality. The clear majority of ire directed against encryption implementations in applications centers on the ability of encryption to ensure the confidentiality of communications. Second, encryption ensures the integrity of data transmitted. Integrity is of limited concern for popular messaging applications, as a dropped line can still convey the context of a conversation (the voice might come through slightly scrambled or a bit of text might be missing), but it is of immense value for financial and medical records or other data types that rely on accuracy. Third, encryption ensures authenticity. This is vital both in communications and in data transmission. The ability to authenticate data ensures its trustworthiness. Finally, encryption allows for non-repudiation. This essentially means it establishes that the user of a given product is who they say they are and that the data being transmitted was in fact sent by them and not by an untrusted party.
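To make these properties concrete, the sketch below (again assuming the open-source PyNaCl library; the messages and scenario are invented) shows authenticated encryption detecting tampering with a message in transit and a digital signature rejecting an altered order:

# Sketch of the properties described above, using the open-source PyNaCl library.
from nacl.secret import SecretBox
from nacl.signing import SigningKey
from nacl.utils import random
from nacl.exceptions import CryptoError, BadSignatureError

# Confidentiality and integrity: authenticated symmetric encryption.
box = SecretBox(random(SecretBox.KEY_SIZE))
sealed = box.encrypt(b"transfer $100 to account 42")
tampered = sealed[:-1] + bytes([sealed[-1] ^ 0x01])   # flip one bit "in transit"
try:
    box.decrypt(tampered)
except CryptoError:
    print("tampering detected: integrity check failed")

# Authenticity and non-repudiation: digital signatures.
signer = SigningKey.generate()
signed = signer.sign(b"sell 100 shares")
signer.verify_key.verify(signed)                      # succeeds: the order came from the key holder
try:
    signer.verify_key.verify(signed.message + b"0", signed.signature)
except BadSignatureError:
    print("forged or altered order rejected")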

The backdooring of encryption undermines each of these core functions and potentially opens avenues for legal challenge should a case ever be prosecuted. Beyond providing a space for potential legal challenges to intercepted communications, the deliberate weakening of encryption knowingly introduces vulnerabilities into systems that already struggle to establish security. First, a conversation that fails to maintain confidentiality might result in the loss of valuable intellectual property, medical information, or other communications vital to personal, corporate, or other forms of security. Second, the loss of integrity might lead to the manipulation of communications, causing individuals to make, for example, financial transactions or medical decisions based on invalid information, with damaging outcomes. Third, authenticity ensures that individuals are less susceptible to the manipulation of identity within a communication chain; this prevents such common problems as identity theft. Lastly, and particularly important to financial transactions, is non-repudiation. It is immensely important to know that the person requesting some action (for example, a stock trade) is the person asking for it and that they cannot claim at some point in the future that their communications were manipulated.

The Core of the Terrorism and Encryption Debate

The debate over encryption has been presented as a tradeoff between a narrow national security imperative and the United States’ broader national security interests: Is the possible prevention of terrorist attacks against the homeland sufficient to deliberately undermine the code that underpins the national strength of the United States?

The framing of the debate in this way depends on the premise that backdoors would provide a security dividend when it comes to counterterrorism. This article has argued that in all but the very short term, the reverse may be true because terrorists may move onto parts of the internet that are more difficult to monitor.

But even if the premise of a tradeoff were accepted and even if the deliberate weakening of encryption were guaranteed to prevent all future terrorist incidents in the United States, the debate would still be worth having as it involves a tradeoff that includes balancing the needs of U.S. firms, individuals, and national security.

Those who argue there is a broader national security imperative for encryption include former NSA and CIA Director General (Ret.) Michael Hayden, who has stated “American security is better served with unbreakable end-to-end encryption than it would be served with one or another front door, backdoor, side door, however you want to describe it.”32

Americans expect their intelligence and law enforcement professionals to be perfect. To achieve perfection, the intelligence and law enforcement communities rightly request every capability available. As General (Ret.) Hayden has noted, “the American people expect the CIA to use every inch we’re given to protect her fellow citizens.”33 In a public forum conversation between Hayden and Chris Soghoian on ethics at West Point in the spring of 2015, Hayden said “we will only go far as the American people allow us, but we will go all the way to that line.”34 Yet the line on encryption is not black and white. To speak the language of software engineers, providing the ability to access all encryption for one use case (preventing terrorist attacks) has consequences in virtually all other use cases. Nor does the ability to access communications imply that all terrorists will be stopped.

Although encryption is an easy punching bag in the wake of terrorist attacks, weakened encryption is not the panacea it is made out to be. There will continue to be challenges faced by intelligence and law enforcement professionals in stopping terrorists if encryption is weakened, and those challenges could become even greater if terrorists retreat from online ecosystems which are easier to monitor.

Terrorism remains a problem and a challenge to liberal democracy, but undermining the digital security of society without improving the capability of security services in a sustained way to detect terrorist activity is a worse than futile exercise.

Substantive Notes

a The attacks in San Bernardino, Orlando, Paris, and Brussels also spurred discussions on encryption. Russell Brandom, “How San Bernardino changes the FBI’s war on encryption,” Verge, March 29, 2016; Pamela Engel, “The Orlando attack exposes the biggest blind spot in the US strategy against ISIS,” Business Insider, June 19, 2016; David Kravets, “FBI Is Asking Courts to Legalize Crypto Backdoors Because Congress Won’t,” Ars Technica, March 1, 2016; Michael Birnbaum, Souad Mekhennet, and Ellen Nakashima, “Paris attack planners used encrypted apps, investigators believe,” Washington Post, December 17, 2015.

b No evidence has come to light that encryption apps were used by the Orlando attacker, Omar Mateen, in planning his attack. Encrypted messaging applications were used by the cell behind the Paris and Brussels attacks while preparing for the attacks. However, during the 24 hours leading up to the Paris attack and during the attacks, several of the attackers communicated by cell phone calls and text messages. In total, 21 phone calls and two text messages were exchanged in the 24 hours before the Paris attack between the Samsung phone used by the Bataclan attackers and a cell phone geolocated in Belgium. Glyn Moody, “Paris terrorists used burner phones, not encryption, to evade detection,” Ars Technica, March 21, 2016; Paul Cruickshank, “Discarded laptop yields revelations on network behind Brussels, Paris attacks,” CNN, January 25, 2017; Paul Cruickshank, “The Inside Story of the Paris and Brussels Attack,” CNN, March 30, 2016; Scott Bronstein, Nicole Gaouette, Laura Koran, and Clarissa Ward, “ISIS planned for more operatives, targets during Paris attacks,” CNN, September 5, 2016.

c Although there are certain platforms (WhatsApp in particular) within the general population that maintain market dominance, this dominance is a fluctuating phenomenon. “Messaging apps are now bigger than social networks,” BI Intelligence, September 20, 2016.

d Google Play is the name of the applications store run by Google. App Store is the name of the applications store run by Apple.

e Android Package Kit (APK) is the package file format used by the Android operating system for distribution and installation of mobile apps and middleware.

f A virtual private network (VPN) extends a private network across a public network and enables users to send and receive data across shared or public networks as if their computing devices were directly connected to the private network.

g A proxy server is a server (a computer system or an application) that acts as an intermediary for requests from clients seeking resources from other servers.

h This was most clearly evidenced by “The NSA Report: Liberty and Security in a Changing World,” which found that while the United States had substantial collection capabilities, it utilized those capabilities within a rule-of-law structure as outlined broadly within Appendix B of the report. These constraints included specific procedures for the targeting and collection of both U.S. and foreign persons under FAA 702. Further constraints on NSA targeting and collections fell under Executive Order 12333. Richard A. Clarke, Michael J. Morell, Geoffrey R. Stone, Cass R. Sunstein, and Peter Swire, The NSA Report: Liberty and Security in a Changing World, The President’s Review Group on Intelligence and Communications Technologies (Princeton, NJ: Princeton University Press, 2014).

Citations

1 Teri Robinson, “Comey says encryption stymies law enforcement, calls for ‘hard conversation,’” SC Magazine, March 8, 2017.

2 International Traffic in Arms Regulations, rev. April 1, 1992, available at epic.org.

3 Stephen Budiansky, Code Warriors: NSA’s Codebreakers and the Secret Intelligence War Against the Soviet Union (New York: Knopf, 2017).

4 42 U.S.C. § 1862(g).

5 C. L. Evans, “U.S. Export Control of Encryption Software: Efforts to Protect National Security Threaten the U.S. Software Industry’s Ability to Compete in Foreign Markets,” North Carolina Journal of International Law and Commercial Regulation 19:3 (1994): pp. 469–490.

6 Statement of Louis J. Freeh, Director of Federal Bureau of Investigation, “Encryption, Key Recovery, and Privacy Protection In the Information Age,” United States Senate, 105th Congress, 1997.

7 Steven Levy, “Battle of the Clipper Chip,” New York Times, June 12, 1994.

8 Matt Blaze, “Protocol Failure in the Escrowed Encryption Standard,” in Proceedings of the 2nd ACM Conference on Computer and Communications Security, 1994.

9 Alan Travis, “Call for Encryption Ban Pits Rudd Against Industry and Colleagues,” Guardian, March 26, 2017.

10 “Read Prime Minister Theresa May’s Full Speech on the London Bridge Attack,” Time, June 4, 2017.

11 Mark Bridge, “Amber Rudd demands ‘back door’ access to encrypted messaging apps,” Times (London), August 1, 2017.

12 Robert Graham, “How Terrorists Use Encryption,” CTC Sentinel 9:6 (2016).

13 “How Al-Qaeda Uses Encryption Post-Snowden (Part 1),” Recorded Future, May 8, 2014.

14 See https://source.android.com.

15 Aaron Brantly, “Innovation and Adaptation in Jihadist Digital Security,” Survival 59:1 (2017): pp. 79–102.

16 Mary Madden, “Public Perceptions of Privacy and Security in the Post-Snowden Era,” Pew Research Center, November 12, 2014.

17 Claire Cain Miller, “Revelations of N.S.A. Spying Cost U.S. Tech Companies,” New York Times, March 21, 2014.

18 Sean Gallagher, “Chinese company installed secret backdoor on hundreds of thousands of phones,” Ars Technica, November 15, 2016.

19 “Iranian Actor ‘Group5’ Targeting Syrian Opposition,” Securityweek.com, August 4, 2016.

20 Brian Barrett, “The CIA Can’t Crack Signal and WhatsApp Encryption, No Matter What You’ve Heard,” Wired, March 7, 2017.

21 Sebastian Lipp and Max Hoppenstedt, “Exklusiv: BKA-Mitarbeiter Verrät, Wie Staatshacker Illegal Telegram Knacken,” and “3,5 Gründe, warum der BKA-Hack gegen Telegram illegal ist,” Motherboard.Vice.com, December 8, 2016; Florian Flade, “Bei WhatsApp und Co. muss der Staat selbst zum Hacker werden,” Die Welt, June 19, 2017.

22 See 18 USC §2703 – Required disclosure of customer communications records.

23 Amit Sheth, Boanerges Aleman-Meza, I. Budak Arpinar, Chris Halaschek, Cartic Ramakrishnan, Clemens Bertram, Yashodhan Warke, David Avant, F. Sena Arpinar, Kemafor Anyanwu, and Krys Kochut, “Semantic Association Identification and Knowledge Discovery for National Security Applications,” Journal of Database Management 16:1 (2005): pp. 33–53.

24 See https://www.eff.org/node/82654.

25 Harold Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore, Matthew Green, Susan Landau, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, Michael Specter, and Daniel J. Weitzner, “Keys Under Doormats: Mandating insecurity by requiring government access to all data and communications,” Computer Science and Artificial Intelligence Laboratory Technical Report, Massachusetts Institute of Technology, July 6, 2015.

26 “What is the Security Development Lifecycle?” Microsoft.

27 “Encryption: Five Major Benefits,” Kaspersky Lab, March 19, 2013.

28 “Meeting HIPAA Requirements with Federal Information Processing Standard (FIPS) Encryption,” Cisco, 2012.

29 Nuala O’Connor, “Our personal security is our national security,” Hill, March 1, 2016.

30 C. Johnson and D.J. Walworth, “Protecting U.S. Intellectual Property Rights and the Challenges of Digital Privacy,” United States International Trade Commission, No. ID-05, 2003, pp. 1–32.

31 Linda Pesante, “Introduction to Information Security,” CERT, Software Engineering Institute, 2008, pp. 1–3.

32 Paul Szoldra, “Ex-NSA Chief Thinks the Government Is Dead Wrong in Asking Apple for a Backdoor,” Business Insider, February 25, 2016.

33 Michael V. Hayden, Playing to the Edge: American Intelligence in the Age of Terror (New York: Penguin Books, 2016), p. 120.

34 Author’s notes of the Hayden-Soghoian debate, “Privacy vs. Intelligence,” United States Military Academy, April 20, 2015.
