
16 April 2016

FBI Paid Hackers to Crack Encryption on iPhone Used by San Bernardino Terrorist, Report Says

FBI paid professional hackers one-time fee to crack San Bernardino iPhone
Ellen Nakashima
Washington Post, April 13, 2016
The FBI cracked a San Bernardino terrorist’s phone with the help of professional hackers who discovered and brought to the bureau at least one previously unknown software flaw, according to people familiar with the matter.

The new information was then used to create a piece of hardware that helped the FBI crack the iPhone’s four-digit personal identification number without triggering a security feature that would have erased all the data, the individuals said.

The researchers, who typically keep a low profile, specialize in hunting for vulnerabilities in software and then, in some cases, selling them to the U.S. government. They were paid a one-time flat fee for the solution.

Cracking the four-digit PIN, which the FBI had estimated would take 26 minutes, was not the hard part for the bureau. The challenge from the beginning was disabling a feature on the phone that wipes data stored on the device after 10 incorrect tries at guessing the code. A second feature also steadily increases the delay imposed between attempts.
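
The arithmetic behind those numbers is straightforward once the protections are out of the way: a four-digit PIN has only 10,000 possible values, so the FBI's 26-minute estimate implies well under a second per guess. The short Python sketch below illustrates that point; the per-attempt timing is simply derived from the 26-minute estimate, and the lockout schedule is a hypothetical illustration, not Apple's documented behavior.

# Back-of-the-envelope sketch, not Apple's implementation: the numbers below
# (per-attempt timing derived from the FBI's 26-minute estimate, and the
# illustrative lockout schedule) are assumptions for the sake of the arithmetic.

PIN_SPACE = 10 ** 4                      # four digits: only 10,000 possible codes
WIPE_AFTER = 10                          # auto-erase threshold described in the article

# The 26-minute estimate implies this per-guess cost once the protections are gone.
per_attempt_s = (26 * 60) / PIN_SPACE
print(f"~{per_attempt_s:.3f} s per guess, {PIN_SPACE} guesses -> "
      f"{PIN_SPACE * per_attempt_s / 60:.0f} min to exhaust the PIN space")

# With auto-erase on, an attacker gets at most 10 guesses: a 0.1% chance of
# hitting a randomly chosen PIN before the phone wipes itself.
print(f"Guesses before wipe: {WIPE_AFTER} ({WIPE_AFTER / PIN_SPACE:.1%} of the space)")

# Hypothetical escalating delays (failed attempts -> enforced wait, in seconds),
# loosely modeled on iOS-style lockouts; actual values may differ.
delays = {5: 60, 6: 5 * 60, 7: 15 * 60, 8: 15 * 60, 9: 60 * 60}
print(f"Illustrative lockout time over attempts 5-9: ~{sum(delays.values()) / 60:.0f} min")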

The bureau in this case did not need the services of the Israeli firm Cellebrite, as some earlier reports had suggested, people familiar with the matter said.

The U.S. government now has to weigh whether to disclose the flaws to Apple, a decision that probably will be made by a White House-led group.

The people who helped the U.S. government come from the sometimes shadowy world of hackers and security researchers who profit from finding flaws in companies’ software or systems.

Some hackers, known as “white hats,” disclose the vulnerabilities to the firms responsible for the software or to the public so the flaws can be fixed; they are generally regarded as ethical. Others, called “black hats,” use the information to hack networks and steal people’s personal information.

At least one of the people who helped the FBI in the San Bernardino case falls into a third category, often considered ethically murky: researchers who sell flaws — for instance, to governments or to companies that make surveillance tools.

This last group, dubbed “gray hats,” can be controversial. Critics say they might be helping governments spy on their own citizens. Their tools, however, might also be used to track terrorists or hack an adversary spying on the United States. These researchers do not disclose the flaws to the companies responsible for the software, as the exploits’ value depends on the software remaining vulnerable.
In the case of the San Bernardino iPhone, the solution brought to the bureau has a limited shelf life.


FBI Director James B. Comey has said that the solution works only on iPhone 5Cs running the iOS 9 operating system — what he calls a “narrow slice” of phones.

Apple said last week that it would not sue the government to gain access to the solution.

Still, many security and privacy experts have been calling on the government to disclose the vulnerability data to Apple so that the firm can patch it.

If the government shares data on the flaws with Apple, “they’re going to fix it and then we’re back where we started from,” Comey said last week in a discussion at Ohio’s Kenyon College. Nonetheless, he said Monday in Miami, “we’re considering whether to make that disclosure or not.”

The White House has established a process in which federal officials weigh whether to disclose any security vulnerabilities they find. It could be weeks before the FBI’s case is reviewed, officials said. The policy calls for a flaw to be submitted to the process for consideration if it is “newly discovered and not publicly known.”

“When we discover these vulnerabilities, there’s a very strong bias towards disclosure,” White House cybersecurity coordinator Michael Daniel said in an October 2014 interview, speaking generally and not about the Apple case. “That’s for a good reason. If you had to pick the economy and the government that is most dependent on a digital infrastructure, that would be the United States.”

But, he added, “we do have an intelligence and national security mission that we have to carry out. That is a factor that we weigh in making our decisions.”

The decision-makers, which include senior officials from the Justice Department, FBI, National Security Agency, CIA, State Department and Department of Homeland Security, consider how widely used the software in question is. They also look at the utility of the flaw that has been discovered. Can it be used to track members of a terrorist group, to prevent a cyberattack, to identify a nuclear weapons proliferator? Is there another way to obtain the information?

In the case of the phone used by the San Bernardino terrorist, “you could make the justification on both national security and on law enforcement grounds because of the potential use by terrorists and other national security concerns,” said a senior administration official, speaking on the condition of anonymity because of the matter’s sensitivity.

A decision also can be made to disclose the flaw — just not right away. An agency might say it needs the vulnerability for only a few months.

“A decision to withhold a vulnerability is not a forever decision,” Daniel said in the earlier interview. “We require periodic reviews. So if the conditions change, if what was originally a true [undiscovered flaw] suddenly becomes identified, we can make the decision to disclose it at that point.”
