29 February 2016

Apple is fighting the wrong encryption case

February 25 

Protesters demonstrate outside an Apple Store in Los Angeles, California, on February 23, 2016, objecting to the U.S. government's attempt to force Apple to create a backdoor into the iPhone. (Mark Ralston/AFP/Getty Images)

Apple chief executive Tim Cook is such a respected figure that it’s easy to overlook the basic problem with his argument about encryption: Cook is asserting that a private company and the interests of its customers should prevail over the public’s interest as expressed by our courts. 

The San Bernardino, Calif., encryption case was the wrong one to fight. Apple doubled down Thursday by asking a federal court to vacate its order that the company create a tool to unlock the iPhone of shooter Syed Rizwan Farook. But if a higher court ultimately requires Apple to do that, as seems likely, the company will be seen by privacy advocates at home and abroad as having been rolled by the U.S. government. Foreign governments may demand similar treatment. Neither outcome is in Apple’s interest. 

David Ignatius writes a twice-a-week foreign affairs column for The Washington Post. 

Cook’s stand has added to his luster as a tech hero. But the case, unfortunately, could be a lose-lose for U.S. technology companies, eroding both privacy protections and global market share. 

Cook’s Feb. 16 “Message to Our Customers” was somewhere between a civics lesson and a sales pitch. “Smartphones, led by iPhone, have become an essential part of our lives,” Cook began. He said he wanted to protect Apple users from what he characterized as an FBI “backdoor” that “would undermine the very freedoms and liberty our government is meant to protect.” 

The Justice Department reacted indignantly. “Apple has attempted to design and market its products to allow technology, rather than the law, to control access to data,” the government argued in a Feb. 19 motion. FBI Director James B. Comey also blasted Apple, arguing that “corporations that sell stuff for a living” should not be allowed to set the balance between privacy and safety. 

Comey told the House Intelligence Committee on Thursday: “This is the hardest question I’ve ever seen in government.” 

Cook’s insistence that the FBI is seeking what could become a universal “backdoor” raises several interesting questions. First, Cook has argued that if Apple creates a special tool for Farook’s phone, it could be used by governments or hackers to crack any other iPhone. But Stewart Baker, a former National Security Agency lawyer and a leading writer on encryption issues, cites an Apple security paper that suggests Apple has plenty of ways to prevent the tool from being used without its permission or on phones other than Farook’s. 

“Apple’s new security architecture requires that any Apple update, including one written for law enforcement, must be uniquely tied to each individual phone that gets it,” Baker said in an email. “The phone can’t download an update unless it’s been digitally signed by Apple and then ‘locked’ to an individual phone.” 
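To make Baker’s point concrete, here is a minimal, purely illustrative sketch of how an update can be cryptographically bound to one device. It is not Apple’s actual code: the key name, function names and device IDs are hypothetical, and it uses a symmetric HMAC as a stand-in for the public-key signatures that real firmware signing relies on. The idea it illustrates is simply that a signed payload naming a specific device is useless on any other device.

import hashlib
import hmac

# Hypothetical vendor key; in reality the vendor signs with a private key it
# never releases, and devices verify with the matching public key.
VENDOR_SIGNING_KEY = b"vendor-signing-key"

def sign_update(update_image: bytes, target_device_id: str):
    # Bind the update to one device ID, then sign the combined payload.
    payload = target_device_id.encode() + b"|" + update_image
    signature = hmac.new(VENDOR_SIGNING_KEY, payload, hashlib.sha256).digest()
    return payload, signature

def device_accepts(payload: bytes, signature: bytes, my_device_id: str) -> bool:
    # Install only if the signature verifies AND the payload names this device.
    expected = hmac.new(VENDOR_SIGNING_KEY, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, signature):
        return False  # not signed by the vendor
    bound_id = payload.split(b"|", 1)[0]
    return bound_id == my_device_id.encode()

# An update built for one phone is rejected by every other phone.
payload, sig = sign_update(b"special-unlock-firmware", "PHONE-A")
print(device_accepts(payload, sig, "PHONE-A"))  # True
print(device_accepts(payload, sig, "PHONE-B"))  # False

Under those assumptions, a law-enforcement build of the software signed for Farook’s phone alone could not simply be loaded onto anyone else’s.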

Apple has also argued that if it unlocks Farook’s phone for the FBI, it might have to provide similar help whenever it gets a legal order from foreign governments, including repressive ones in Russia or China. But it’s not clear why this would be so. 

The access tool presumably would be kept in the United States. Any foreign government that wanted to use it could make a request through a mutual legal assistance treaty, which allows a U.S. judge to review a case and make sure the use fits proper legal standards. Critics say that the process is underfunded, slow and cumbersome. So spend more, speed it up — and keep proper judicial control. 

An intriguing, little-noted aspect of Apple’s argument is that the U.S. government shouldn’t make such a fuss about “going dark” with encrypted iPhones because it has so many other useful surveillance techniques. Here, I suspect Apple’s supporters are right. U.S. intelligence agencies have indeed devised new ways to analyze “big data” and find patterns that defeat the clever use of smartphone and email encryption by terrorist groups such as the Islamic State. 

The best rebuttal of Comey’s “going dark” argument came in a report issued this month by Harvard’s Berkman Center for Internet and Society, titled “Don’t Panic.” It notes that in a world of smart appliances and clouds of unencrypted data, people leave so much digital exhaust that it’s still possible for intelligence agencies to find and track dangerous targets. 

What’s the “net assessment” of the costs and benefits for U.S. national security in this debate? It seems clear that U.S. interests are served by a world in which there is pervasive use of iPhone-style encrypted smartphones that embody American values of privacy and free exchange of information. 

The Apple legal brawl could undermine this dominance. Comey’s broadsides against “going dark” may make consumers suspicious about U.S. technology. So, unfortunately, do Cook’s false claims about a “backdoor.”
