12 January 2014

2014 as the Year of Encryption: A (Very) Brief History of Encryption Policy


By James Andrew Lewis
JAN 10, 2014

Some people are calling 2014 the “Year of Encryption,” the year encryption becomes the tool that lets companies and individuals protect their data. Perhaps this time it will be true.

The United States once controlled the export of encryption as if it were a weapon. This was done to protect NSA. When encryption's use was limited to a small number of banks and governments, this restriction made sense. It was also easier to control encryption when it was a piece of hardware (like the Enigma machine) than when it became software that could run on any computer.

The internet changed this. Computers had been safe because they were not connected. Locking the door to the room where the computer was kept made data secure. The people who built computers or wrote software did not think about security. With the internet, all of these unprotected and unsecurable machines became connected. Denying physical access no longer protected data because remote access over the internet was possible.

NSA, now much maligned, saw the problem in the early 1990s. NSA has two missions. The first is to secure the data of U.S. agencies and companies from foreign espionage. The second is to preserve the capability to collect intelligence from foreign entities. U.S. policy was to find a solution that made the internet more secure and preserved NSA capabilities. NSA developed an elegant (although expensive) solution: the Clipper Chip.

Clipper was a hardware solution. If the chip was installed on a new computer, it would encrypt the data and only allow it to be read by authorized persons who had the “key” to the encryption. One problem with Clipper was that it would have added hundreds of dollars to the price of every computer or cell phone. Another problem was that NSA knew the secret of Clipper and, with court authority, would be able to read any encrypted data.
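For readers who want to see the mechanics, here is a deliberately simplified sketch of the key-escrow idea behind Clipper. It is a loose analogy, not the actual Clipper design, and it uses Python's third-party cryptography package purely for illustration:

```python
# A simplified sketch of key escrow (not the real Clipper design).
from cryptography.fernet import Fernet

escrow_key = Fernet.generate_key()     # held by the government escrow agent
session_key = Fernet.generate_key()    # fresh key for this communication

# Traffic is encrypted with the session key, and a copy of that session key,
# encrypted under the escrow key, travels alongside it.
ciphertext = Fernet(session_key).encrypt(b"meet at noon")
escrow_field = Fernet(escrow_key).encrypt(session_key)

# The intended recipient reads the traffic with the session key...
assert Fernet(session_key).decrypt(ciphertext) == b"meet at noon"

# ...but whoever holds the escrow key can also recover the session key and
# read the same traffic, which is exactly what made the scheme unattractive
# to buyers.
recovered = Fernet(escrow_key).decrypt(escrow_field)
assert Fernet(recovered).decrypt(ciphertext) == b"meet at noon"
```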

(My first meeting at the White House on Clipper was in 1992. All agencies present, including Commerce, supported Clipper. Since I knew a little about programming and economics, I thought to myself, this will never work. I went back and told my boss, Dick Clarke, that Clipper was impracticable. Dick took on “decontrolling” encryption and pushed to give people access to encryption software. This led the then-Director of Central Intelligence to go to the President and plead, successfully, for him to deny Dick’s proposals.)


An expensive product that protected you from everyone but NSA, unsurprisingly, had little market appeal. Clipper was not the answer. The new Clinton administration, eager to see wide adoption of the internet, had created two White House groups to oversee policy: an Electronic Commerce Working Group chaired by Ira Magaziner (among its legacies is the creation of ICANN) and a Secure Public Networks Working Group, co-chaired by OMB and NSC staff. This working group’s task was to find a way to secure the internet without damaging either law enforcement or intelligence equities. This task occupied us for four years. The solution, we came to believe, lay in the widespread adoption of encryption.


By this time it was clear that encryption would be software rather than hardware, that it would be commercially produced, and with the boom in computers for business use, there could be a mass market for encryption products. The Secure Public Networks group focused on Public Key Infrastructure (PKI) as a solution. PKI provides a secure way for people to exchange crypto keys so that only authorized individuals can look at data, whether it is stored on a computer or travelling between two computers. But PKI had many problems.
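As a rough sketch of what public-key key exchange buys you (a toy example, again assuming Python's cryptography package rather than any product of the era): data is encrypted under a symmetric key, and that key is “wrapped” with the recipient's public key, so only the holder of the matching private key can recover it.

```python
# A minimal sketch of public-key "key wrapping", the core idea behind PKI.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.fernet import Fernet

# The recipient generates a key pair; the public half can be published freely.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# The sender encrypts the data with a fresh symmetric key...
symmetric_key = Fernet.generate_key()
ciphertext = Fernet(symmetric_key).encrypt(b"quarterly financials")

# ...and wraps that symmetric key with the recipient's public key.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = public_key.encrypt(symmetric_key, oaep)

# Only the private-key holder can unwrap the key and read the data.
recovered_key = private_key.decrypt(wrapped_key, oaep)
assert Fernet(recovered_key).decrypt(ciphertext) == b"quarterly financials"
```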

First, FBI and NSA still wanted copies of the keys to be stored somewhere that they could access under a court order. We searched for a “Trusted Third Party” to hold the keys, but found that no company wanted its most sensitive data accessible by a third party. There is no such thing as a “Trusted Third Party” for truly valuable data.

Second, we found that commercially available encryption products were woefully deficient. In 1995, NSA reviewed forty best-selling encryption products. None of them were without flaw. Some companies claimed to use “56-bit keys” (key length is an indicator of encryption’s strength) but either by design or through error, many programs only used 28-bit keys, which even the FBI could break. Random number generators are a crucial component of encryption programs, but they are hard to create. Some companies used defective random number generators. Others gave up entirely and just used a list of numbers from which their program would randomly select. An inadequate random number generator makes encryption easier to break, and if these commercial products did not challenge NSA, they would also not challenge leading foreign intelligence agencies.
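A back-of-the-envelope illustration of both problems, in Python (the trial rate is an assumption, chosen only to show the scale of the difference):

```python
import random
import secrets

# Key length: a brute-force attacker simply tries keys until one works. At an
# assumed 10 million trial decryptions per second, a 28-bit key space is
# exhausted in seconds, while a 56-bit key space takes centuries at that rate.
for bits in (28, 56):
    keyspace = 2 ** bits
    seconds = keyspace / 10_000_000
    print(f"{bits}-bit key: {keyspace:,} possible keys, ~{seconds:,.0f} s to exhaust")

# Randomness: Python's `random` module is deterministic and reproducible from
# its seed -- fine for simulation, fatal for key generation, which is roughly
# the mistake the weak 1990s products made.
random.seed(1995)
weak_key = random.getrandbits(56)    # predictable to anyone who guesses the seed
strong_key = secrets.randbits(56)    # drawn from the operating system's CSPRNG
```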

Encryption programs are hard to write and hard to make work. Software has gotten better every year since then, but there are still bugs. If the U.S. had tried to hold software to some standard of excellence, it would have killed the market, just as if it had tried to hold the cars of the 1920s to the automobile safety standards we have today: no car would ever have been built. Eventually these design flaws will disappear, but not anytime soon.

The secure internet problem was so acute that it was (in Washington parlance) escalated to the Deputies Committee (DC) of the National Security Council, subcabinet officials who, in that administration, met only on the most pressing problems. The DC was chaired by the President’s Chief of Staff, who was willing to take on hard decisions for the President. The Secure Public Networks working group, then co-chaired by a very experienced OMB official and a brilliant NSA officer on detail to the NSC, supported the DC, teeing up issues and providing information.

And encryption and internet security required unavoidably hard decisions. If the U.S. let its companies sell encryption, NSA would lose major collection capabilities (this had happened recently after another change in technology and had taken time and money to recover). FBI could lose the ability to wiretap criminal communications (CALEA, the Communications Assistance for Law Enforcement Act, and what eventually became the Patriot Act, both measures designed to preserve wiretap capabilities in the face of technological change, set the context for law enforcement). Technology changes for commercial reasons; government operations must keep up. The difficulty of this is routinely overstated, but signals intelligence and wiretapping exist in a dynamic technological environment that must be matched by equally dynamic efforts to maintain intercept and collection capabilities.

The source of dynamism in this debate was the largely unexpected appeal of computers and the internet. The pace of adoption exceeded expectations. Demand was insatiable. Few new users knew or cared about the vulnerabilities. Those who did know were at risk because they could not buy strong American products.

(A major U.S. aerospace company called me in 1997 to say that they needed encryption for their foreign subsidiaries and since U.S. companies couldn’t sell it, they had found a website in the Netherlands offering a freeware version of PGP (freeware was much more common and legitimate in the 1990s – this was the Netscape era). PGP is a good product, but the foreign intelligence agency that had put this freeware on the web had made sure that it could read whatever the program encrypted. Making encryption available created risks for national security, but denying it also created risk. There were also commercial risks that threatened U.S. software companies in the booming information technology market.)

The politics of the DC found Commerce pushing for freer access to encryption products, with NSA and FBI opposed. Justice, Defense, and the White House were initially inclined to the security side but eventually became neutral. The DC was the scene of some very tense debates. In one, NSA’s Deputy Director said that there would be bodies in ditches if we decontrolled encryption. This was not hyperbole. We may like to pretend that issues are clear cut, but serious policymaking requires choosing among imperfect options and seeking policies or programs to mitigate risk.

The complexity and newness of the issue disrupted the normally staid hierarchy of the interagency process. The DCI, a cabinet member, appeared at a Working Group meeting in 1996 to say that we should release encryption. This was the moment that the gridlock broke. A flabbergasted FBI representative pointed out that this would damage intelligence collection. The DCI responded “that’s my problem, I’ll take care of it.”

(In 1996, an NSA Deputy Director and I occasionally had to accompany this DCI to meetings with Ira Magaziner on internet policy. He would carry a book on elliptic curve cryptography under his arm. At the time, we thought this was a bit showy, but the recent revelations about NSA suggest otherwise. Some programs are years in the making.)

A similar departure from protocol occurred when the Attorney General (AG), also a cabinet member, showed up for several DC meetings to argue for making encryption widely available. The interagency politics of encryption policy had been that NSA and FBI argued against release, Commerce argued for. DOD, Justice, and the White House began by being sympathetic to the status quo but gradually became neutral. When the DCI, the AG, and DOD changed positions, the White House decided to end the encryption debate.

The reasons for the change in policy were straightforward. The Deputies weighed the risk to intelligence collection against the need to protect American companies and consumers. They weighed the commercial risk to U.S. companies in a dynamic market where the U.S. did not have a monopoly on encryption. The future would see (we all thought) the widespread use of encryption on the internet whether we liked it or not, so holding back our companies would not produce any security benefit. The Deputies talked about back doors and rejected them. The most important reason for this was that it was certain that major foreign opponents would find and exploit any back door. There was also a desire to avoid tainting U.S. products. The DCI’s belief that he could maintain collection gave the Deputies confidence that the U.S. could overcome the collection problem without restrictions or backdoors.

This was in many ways an ideal outcome. The vocal opponents of encryption restriction thought they had won. Companies could sell without government interference. And the U.S. was still able to collect what it needed. Where the policy failed was in securing the net.

The Clinton Administration created internet policy, from ICANN and e-Commerce law to PDD-63 on critical infrastructure. We are still working off the policies of the 1990s, but encryption is the least successful part of internet policy. We assumed that if we released encryption from control, people would use it and the internet would be secure. We were wrong.

Most people didn’t use encryption when it became available. We did not expect this. They underestimated risk. Maybe the current revelations will change that. Encryption was “kludgey,” slowing computers and annoying users. Perhaps as processor speeds have improved, this is no longer a problem. Encryption was hard to implement, but if it becomes a “cloud” service requiring no user input, this may not be a problem. We learned that ease of use is more important than assessment of risk in determining whether people will use encryption, but major changes in how people use the internet may have given encryption a second chance.

There will still be problems. Software will still have flaws and implementation will be uneven. People make mistakes. This creates openings that a well-resourced attacker can exploit, but it will be harder to get in. Greater use of encryption will also run afoul of the business models that drive internet commerce. If you compare how much privacy users had in 1996 and how much they have now, privacy has been “engineered out” of the internet (what privacy advocates were doing while this happened is a mystery). It could be “engineered” back in, and companies now seem willing to do this. The move to the mobile internet creates opportunities for exploitation. When you download an app onto a mobile device, do you know who wrote it or what it is sending back (or to whom)? A free app that also steals encryption keys is in the realm of the possible.

This is not the complete story. It does not talk about the battles over authentication of identity (another policy problem still not rectified) or how NSA dealt with the problems of the internet (Snowden’s leaks give a distorted picture). It points out, however, where policy went wrong at the start of the internet age. Encryption is not a silver bullet and it is not unbeatable, but greater use would make data more secure.

The U.S. now faces similar decisions on how to balance intelligence and national security with privacy and trust. It’s likely that the decision this Administration makes will be the same, that the security of American companies and citizens is more important than the cost to intelligence, and new solutions to the collection problem will have to be found. It would be nice, fourteen years later, if encryption policy actually worked and the internet became more secure. Maybe this time it will happen.

James Andrew Lewis is a senior fellow and director of the Strategic Technologies Program at the Center for Strategic and International Studies in Washington, D.C.
