Apple, the FBI and the ghost of the Clipper chip

March 1, 2016
Invading a user’s privacy was a bad idea when it was proposed in the 1990s via the ill-conceived Clipper chip, and it’s a bad idea now, no matter what name it’s given this time.

Back in the days of the Clinton administration, the government wanted to force electronics manufacturers to install a chip that would allow the government (with a court order, supposedly) to have unfettered access to any and all encrypted data on the device. It proposed to do this via a scheme called “key escrow,” wherein the cryptographic keys to the kingdom were stored by a presumably trusted escrow service and only turned over to the government upon lawful request in the form of a subpoena. After a long battle, the Clipper chip program was shot down, largely because the public saw it as too intrusive.
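To make the escrow idea concrete: the Clipper design split each device's unit key into two shares, held by separate escrow agents, that had to be recombined before anyone could decrypt anything. Here is a toy sketch of that kind of split in Python; the XOR-based sharing and the names are mine, purely for illustration, not the actual Clipper protocol:

    import os

    # Toy Clipper-style key escrow: split a device key into two shares
    # that XOR back to the original. Neither share alone reveals
    # anything about the key; both escrow agents must cooperate
    # (or both must be compromised).

    def split_key(key: bytes) -> tuple[bytes, bytes]:
        share_a = os.urandom(len(key))                       # random pad
        share_b = bytes(k ^ a for k, a in zip(key, share_a))
        return share_a, share_b                              # one per agent

    def recover_key(share_a: bytes, share_b: bytes) -> bytes:
        return bytes(a ^ b for a, b in zip(share_a, share_b))

    unit_key = os.urandom(10)            # Clipper used 80-bit unit keys
    share_1, share_2 = split_key(unit_key)
    assert recover_key(share_1, share_2) == unit_key

The two-agent split was supposed to be the safeguard: no single party, not even the government, could decrypt alone. The rest of this column is about why that safeguard never inspired much confidence.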

But since then, law enforcement officials, including the FBI's top brass, have never stopped moaning about the strength of modern cryptography. On current iOS hardware, for example, information is encrypted using the Advanced Encryption Standard (AES) with 256-bit keys. The 256-bit keys used to encrypt the file system and files are derived from, among other things, the user's passcode, which is protected by the device's Secure Enclave hardware and can be unlocked via the user's fingerprint. In short, on a modern iOS device that is locked with a strong passcode, there is no back door; the only way in is to have access to the keys themselves.
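As a rough illustration of what "derived from the user's passcode" means, here is a Python sketch using PBKDF2 as a stand-in. Apple's actual derivation is different and, crucially, runs inside the Secure Enclave entangled with a device-unique hardware key, which is what makes off-device brute force infeasible; the salt and iteration count below are made-up example values:

    import hashlib
    import os

    # Illustrative only: a PBKDF2-based stand-in for deriving a
    # 256-bit AES key from a passcode plus a per-device value.

    def derive_aes256_key(passcode: str, device_salt: bytes) -> bytes:
        return hashlib.pbkdf2_hmac(
            "sha256",
            passcode.encode("utf-8"),
            device_salt,
            100_000,      # iteration count: example value, not Apple's
            dklen=32,     # 32 bytes = 256-bit key
        )

    salt = os.urandom(16)    # stand-in for a device-unique value
    key = derive_aes256_key("correct horse battery staple", salt)
    assert len(key) == 32

The point of a derivation like this is that the encryption key never exists anywhere except as a function of the passcode: without the passcode (or the derived key itself), there is nothing to hand over.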

Today, as the Justice Department and the FBI seek Apple’s cooperation in unlocking an iPhone used by one of the San Bernardino terrorists, the specter of the Clipper chip seems to be haunting us. (Note: In the San Bernardino case, the terrorist used an older iPhone and, as I understand it, a four-digit PIN, so that situation differs from the more modern, general case I describe above.) Meanwhile, the past two decades have seen immense changes in the tech landscape.

Modern end-user devices go well beyond desktops and laptops to include smartphones and tablets. All of these devices typically can be remotely managed. For example, in Apple’s iOS world, a mobile device management (MDM) system can oversee a fleet of iPhones and iPads.

MDMs do this by installing a security profile onto each device they manage. Think of a security profile as a list of security policies and settings (for example, the user must use a strong password that is at least N characters long). What makes this all work is a hierarchical chain of digital signatures. In essence, the policies are stored in a tamper-evident container, and the iOS device will not circumvent the policies without the consent of the device owner, or the MDM owner, if that administrative privilege has been delegated to the MDM.
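Here is a stripped-down sketch of that tamper-evident idea, using Ed25519 signatures from the third-party Python `cryptography` package as a stand-in (real MDM profiles use a different format and a full certificate chain): the device applies a policy only if the MDM's signature over it verifies.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric import ed25519

    # Sketch of a tamper-evident policy container: the MDM signs the
    # policy bytes, and the device rejects any profile whose signature
    # fails to verify against the key it trusts.

    mdm_key = ed25519.Ed25519PrivateKey.generate()
    device_trusted_key = mdm_key.public_key()  # provisioned at enrollment

    policy = b'{"min_passcode_length": 8, "require_alphanumeric": true}'
    signature = mdm_key.sign(policy)

    def apply_profile(policy: bytes, signature: bytes) -> bool:
        try:
            device_trusted_key.verify(signature, policy)
        except InvalidSignature:
            return False   # tampered or unsigned: profile rejected
        return True        # signature checks out: policies applied

    assert apply_profile(policy, signature)
    assert not apply_profile(policy.replace(b"8", b"4"), signature)

Whoever holds the signing key at the top of that chain effectively controls what the device will accept. Keep that in mind for what follows.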

MDMs have become pretty commonplace in companies. Even users who bring their own mobile devices to work usually must consent to having their devices managed by the company. The company owns the network, so the company sets the security policies. We all get that, right?

I am talking about MDMs because a Clipper chip-style program in the 2010s would, in essence, turn over our electronic devices to a central government-run MDM. Not an MDM in the sense that the government would manage our security settings, but an MDM in the sense that it would have access to that tamper-evident seal and would be able to peek inside any device under its purview.

I’m no lawyer, but as a privacy-minded individual, my mind screams “unreasonable search” when I hear that sort of description. The Clipper chip idea was rightfully rebuffed by the tech community in the 1990s, and anything similar to it should be rebuffed now.

Why? Two reasons. For starters, the potential for misuse is simply too great. I have little faith that the system won’t be abused by the presumably good guys when it suits them. And what happens if the escrow service is itself compromised by the bad guys? Oh, but it’ll be properly protected, you say? You mean like the Office of Personnel Management (OPM) protected the background-check data for every government employee and contractor? That data had to have been considered highly sensitive as well, right? So, we’re to trust the same folks who failed to protect OPM’s sensitive data with all the keys to every device in the country? I don’t think so.

Second, I have to side with the U.S. Constitution here, in the form of its Fourth Amendment, which reads, “The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.”

Again, I’m no lawyer, but I have to believe that any scheme similar to the Clipper chip program would violate the right of the people to be secure in their effects — their electronic effects, that is.

Although the model I’ve described here isn’t what the Justice Department is specifically demanding regarding the San Bernardino terrorist case, there have been increasing calls for cryptographic back doors of late, and I’m a slippery-slope kind of guy. I figure that the government will seize upon any opening to expand its ability to access encrypted communications.

Before any of us should be compelled to turn over our keys, a warrant should be issued upon probable cause. I simply don’t see that happening in the Clipper chip or any other proposal I’ve heard to date from the U.S. government.

With more than 20 years in the information security field, Kenneth van Wyk has worked at Carnegie Mellon University's CERT/CC, the U.S. Department of Defense, Para-Protect and others. He has published two books on information security and is working on a third. He is the president and principal consultant at KRvW Associates LLC in Alexandria, Va.

(www.computerworld.com)

By Kenneth van Wyk
