Apple CEO Cook: Judge’s order to decrypt shooter’s iPhone is more than an encryption issue

18.02.2016
A court order for Apple to help the FBI carry out a brute-force attack on the iPhone used by shooters in last year’s San Bernardino terrorist attack would set a precedent with broad implications, experts say.

For one, it could mean that in the future, makers of encryption products might have to modify them to comply with similar orders if they can't otherwise access the encrypted data those products protect. That worries privacy advocates, who say encryption is important not only for protecting personal data but also for safeguarding transactions and industrial secrets.

On the flip side, if the order is overturned, it could leave law enforcement without a tool it desperately wants until a federal law passes that clearly spells out that product manufacturers must be able to meet the demands of such orders. Given the contentious nature of federal politics this year, that could be a long process.

MORE: Why the FBI’s request to Apple will affect civil rights for a generation

While the order to Apple comes amid a debate about whether it's wise to build backdoors into encrypted devices, it doesn't actually demand such a backdoor. Instead, it requires Apple to disable the anti-brute-force mechanism that wipes the phone clean after 10 failed passcode attempts.
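
To illustrate the idea in the abstract, here is a toy Python sketch of such a retry limit; the names and structure are hypothetical and are not Apple's implementation, just a model of the behavior described above.

# Illustrative toy model only (hypothetical names, not Apple's code):
# a counter that destroys the device keys after 10 failed passcode tries.
MAX_ATTEMPTS = 10

def make_lock(correct_passcode):
    state = {"failures": 0, "wiped": False}

    def attempt(passcode):
        if state["wiped"]:
            return "device wiped"
        if passcode == correct_passcode:
            state["failures"] = 0
            return "unlocked"
        state["failures"] += 1
        if state["failures"] >= MAX_ATTEMPTS:
            state["wiped"] = True   # keys destroyed; data unrecoverable
            return "device wiped"
        return "wrong passcode"

    return attempt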

With that mechanism out of the way, the FBI could mount an electronic attack, trying one passcode after another until it hits on the right one. That code, in combination with unique hardware characteristics of the phone, would generate the keys to decrypt the device. The phone would open and the data would be available. The overall security of the device would be undermined, but technically the integrity of the encryption itself would remain intact.
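
As a rough illustration (with a made-up device secret and a generic key-derivation function standing in for Apple's actual scheme), the idea can be sketched in Python:

# Illustrative sketch, not Apple's actual design: the passcode is stretched
# together with a hypothetical device-unique secret, so guesses have to be
# tried on the phone itself, one slow derivation at a time.
import hashlib
import itertools

DEVICE_UID = b"hypothetical-device-unique-secret"

def derive_key(passcode):
    # Slow, salted key derivation standing in for Apple's passcode entanglement.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100_000)

def brute_force_4_digit(target_key):
    # With the auto-erase and delay protections removed, all 10,000 four-digit
    # codes can simply be tried in order.
    for digits in itertools.product("0123456789", repeat=4):
        guess = "".join(digits)
        if derive_key(guess) == target_key:
            return guess
    return None

if __name__ == "__main__":
    target = derive_key("0007")          # stand-in for the phone's stored verifier
    print(brute_force_4_digit(target))   # -> "0007"

Because the derivation is deliberately slow (Apple has put the figure at roughly 80 milliseconds per attempt) and is tied to the device's own hardware, the guessing can't be offloaded to faster machines, which is why the FBI wants the protections on the phone itself switched off.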


“I’m not sure of what distinction you’re making,” says Bruce Schneier of Harvard’s Berkman Center for Internet and Society. “The FBI is asking Apple to reinstall a vulnerability they fixed.”

The phone in question is an iPhone 5c, which didn't have the brute-force protection initially, but that was put in place by Apple in 2014, Schneier says. The question is, he says, "Is Apple obliged to make vulnerabilities accessible to the government?"

If so, Schneier says, “It has implications for everybody, everybody’s devices – phones, computers, Fitbits, medical devices.”

Apple CEO Tim Cook says the company will challenge the order in court, in part because it will set a legal precedent with widespread adverse effects on privacy. “The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge,” Cook writes in a statement to Apple customers.

In requesting the order, the government lays out what it believes Apple can do to open the phone to a brute-force attack. In his response, Cook doesn't say Apple can't do it. In fact, he implies that it could, but he doesn't think it's a good idea, warning that software written to disable the anti-brute-force protections on this particular iPhone 5c could be used against other phones.

“In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession,” he writes. “The government suggests this tool could only be used once, on one phone. But that’s simply not true.”


The Electronic Frontier Foundation sides with Cook and is even more suspicious that the tool once written would be applied widely. “Even if you trust the U.S. government,” the EFF writes, “once this master key is created, governments around the world will surely demand that Apple undermine the security of their citizens as well.”

The government isn’t asking for a master key, and Cook also seems to misstate what the order requires. “Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation.”

MORE: White House: FBI is not asking Apple for a 'backdoor' to the iPhone

But the order itself says the software it’s asking Apple to write “will not modify the iOS on the actual phone.” The software created to facilitate access to the phone would be loaded into RAM, according to the method suggested by the FBI.

Cook says the FBI is misleading in its description of what it wants. “The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor,” he writes.

If the government wins the case, it will be the first time a court has forced a company to write new code to undermine the security of an existing product, says Edward McAndrew, a former Assistant U.S. Attorney, Cybercrime Coordinator and National Security Cyber Specialist now in private legal practice at Ballard Spahr LLP. "That's where we're on new ground," McAndrew says. "I've never seen a court say you need to alter a device so the government can hack it."

In the past, similar orders have required companies to obtain and turn over data, but in those cases the data was readily available and no new code was needed to crack the device, he says.

McAndrew says the pros and cons of the encryption issue cut both ways even within the government. On the one hand, law enforcement wants access to encrypted data relevant to investigations, but on the other hand, the Department of Health and Human Services imposes penalties on medical facilities that don’t adequately encrypt personal health information.

In the case of law enforcement, McAndrew says he has had personal experience of investigations that were hurt by the lack of access to encrypted data. And in the course of investigations, evidence sometimes turns up that helps head off planned crimes before they happen. That type of information could be on the San Bernardino iPhone, but there's no way to tell without decrypting it.

The government has used iCloud and other backup and syncing services in the past to get around the problem of encrypted devices, he says. The court papers in the San Bernardino case indicate the iPhone was backed up to the cloud, but apparently there was a time lag between the last sync and when the FBI got its hands on the phone.

The order tells Apple to provide "reasonable technical assistance" to do three things:

- bypass or disable the auto-erase function that wipes the phone after too many failed passcode attempts;
- allow the FBI to submit passcodes to the device electronically rather than typing them in by hand; and
- ensure that software on the phone doesn't purposely add extra delays between passcode attempts.

The FBI recommends how to do this: write software, to be turned over to the FBI, that loads into RAM and modifies neither iOS on the phone nor the data and system partitions in its flash memory.

The order calls for the software to be loaded either at a government facility or an Apple facility. If it’s at Apple, the company has to provide remote access to the phone via a computer so the FBI can conduct passcode recovery analysis.

If Apple comes up with an alternative way to accomplish the same goals, it can do so as long as the FBI agrees to it.

(www.networkworld.com)

Tim Greene
