FAQ: Everything we know so far about Apple's battle with the FBI

February 22, 2016
At this writing, Apple’s battle with the FBI over how much it can and should help in the investigation of the San Bernardino shootings is less than a week old. But it’s already explosive, to say the least. The government has accused Apple of being more concerned with marketing than the fight against terrorism, and Apple has drawn a line in the sand, saying that complying with the FBI’s request “would undermine the very freedoms and liberty our government is meant to protect.”

This fight isn’t going to be over anytime soon, so we’ll keep this FAQ updated as events unfold. If you have more questions—or want to respectfully debate the implications this case will have on privacy and security—please chime away in the comments and we’ll do our best to make everything about this confusing case as clear as possible.

The United States District Court for the Central District of California issued an order on February 16, giving Apple five business days to respond. Apple posted an open letter to customers on its website explaining its side of the case, prompting government attorneys to file a motion on February 19 disagreeing with Apple’s view of the situation, and asking the court to force Apple to comply.

A hearing is scheduled to take place in Riverside, CA, on March 22. Until then, the lawyers will file more motions, while the two sides also take their case to the court of public opinion. On Sunday February 21, FBI Director James Comey posted at Lawfare that we should "take a deep breath and stop saying the world is ending." Apple updated its open letter on Monday February 22 to add its own FAQ on privacy and security.

The iPhone 5c in question was used by San Bernardino shooter Syed Rizwan Farook, but it was his work phone, so it technically belonged to his employer, the San Bernardino County Department of Public Health. Farook also had a personal phone and a personal computer, but he physically destroyed both before the December 2 shooting. Farook was killed in a firefight with police.

In the course of its investigation, the FBI wants to examine the iPhone 5c for evidence. The DOJ’s court filing from Friday February 19 reads:

The San Bernardino County Department of Public Health did consent to the search, but the iPhone is locked with a passcode (reportedly a 4-digit PIN, not something more complex), and apparently the county didn’t use good mobile device management practices, because it doesn’t know the passcode and can’t access anything on the phone without it. From the same February 19 court filing:

That wasn’t what Apple was asked to do—Apple actually has no way of unlocking a locked iPhone. Apple does have a way to extract data from a device running iOS 7 or earlier, without having to unlock the phone. Apple has done this before for law enforcement with a proper court order—another filing by the government estimates at least 70 times.

But starting with iOS 8, the data on an iPhone is encrypted by default as soon as you enable the passcode feature. Since Farook’s iPhone 5c is running iOS 9, the only way to access the encrypted data it holds is to unlock the phone with the passcode. Since the owner of the phone (Farook’s employer) doesn’t know the passcode, and Apple doesn’t know the passcode, and Farook is dead, the FBI is stuck trying to crack the passcode through brute force.

The best defense iOS has against a brute-force attack is the Erase Data feature, which will wipe all the data on the iPhone after 10 failed passcode attempts. Farook’s iPhone reportedly uses a 4-digit PIN, which has only 10,000 possible combinations, so it wouldn’t take long to crack, but it would certainly take more than 10 tries.
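
For a rough sense of scale, here’s a back-of-the-envelope sketch in Python. It assumes the roughly 80 milliseconds of key-derivation work per passcode attempt that Apple’s iOS Security Guide describes, and it ignores the escalating delays and the Erase Data wipe that the FBI wants disabled:

```python
# Rough estimate of brute-forcing a 4-digit iPhone PIN, assuming ~80 ms
# of key-derivation work per guess (per Apple's iOS Security Guide) and
# no artificial delays or erase-after-10-failures limit.
PIN_SPACE = 10 ** 4          # 0000-9999: 10,000 possible 4-digit PINs
SECONDS_PER_ATTEMPT = 0.08   # ~80 ms of key derivation per attempt

worst_case = PIN_SPACE * SECONDS_PER_ATTEMPT
print(f"Worst case: {worst_case / 60:.0f} minutes")        # ~13 minutes
print(f"Average case: {worst_case / 2 / 60:.1f} minutes")  # ~6.7 minutes
```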

So the FBI’s request, and the court’s February 16 order, is for Apple to create a sideloadable SIF (software image file) of iOS that can run in the iPhone’s RAM without touching any other data on the device. The FBI wants Apple to sign that software so the iPhone—and only this iPhone—will run it. Once installed, the software would disable that Erase Data setting.

The FBI also wants to try passcodes as quickly as possible, so it wants Apple to disable the delay between failed passcode attempts and to allow passcodes to be entered electronically from a computer, either through the iPhone’s Lightning port or wirelessly, a feature that has never existed in a publicly shipping version of iOS. That’s a big deal—as Matthew Panzarino points out at TechCrunch, it’s asking Apple to introduce a new weakness into iOS.
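
To make concrete what that would enable, here is a purely hypothetical sketch. The submit_passcode() function is a stand-in for the electronic passcode-entry interface the order asks Apple to create; no such API exists in any shipping version of iOS:

```python
# Hypothetical sketch of the automated brute force the court order would
# make possible. submit_passcode() is a placeholder for the electronic
# passcode-entry interface (over Lightning or Wi-Fi) the FBI is asking
# Apple to add; it is not an existing API.
from typing import Optional


def submit_passcode(pin: str) -> bool:
    """Send one passcode guess to the device and report whether it
    unlocked. This is the requested capability, not a real function."""
    raise NotImplementedError("illustration only")


def brute_force_four_digit_pin() -> Optional[str]:
    # With the Erase Data wipe and inter-attempt delays disabled, a
    # computer can simply walk the entire 4-digit space.
    for candidate in range(10_000):
        pin = f"{candidate:04d}"      # "0000" through "9999"
        if submit_passcode(pin):
            return pin                # passcode found
    return None                       # not a 4-digit PIN after all
```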

It doesn’t seem like it—the FBI just doesn’t want to take any chances. From the February 19 filing, emphasis ours:

Apple posted an open letter to customers explaining its position. It reads in part:

That depends on whom you ask. For example, Bruce Schneier of Harvard’s Berkman Center for Internet and Society told our colleagues at Network World, “The FBI is asking Apple to reinstall a vulnerability they fixed.” He says the iPhone 5c didn’t initially have protection against brute-force attacks to guess the passcode, and that those protections were added in 2014 with iOS 8.

The government’s February 19 court filing definitely disagrees that it’s a backdoor, mostly because the order is written just for this phone.

But Apple believes that it is—that “master key” quote is right from Apple’s open letter.

It’s true that the DOJ says the FBI only wants to do this once. But the February 19 filing uses several other court cases as precedent to bolster its argument that Apple is being unreasonable to refuse this time. In both this San Bernardino investigation and a separate drug case in the state of New York, the government is saying that since Apple has helped before, it should be willing to help again.

So it’s a little weird that the FBI wants us to believe that once Apple builds this tool to help law enforcement brute-force a passcode, it wouldn’t be used again. Even if that particular software image file were never shared and promptly destroyed, the courts could use this case as precedent to order Apple to build it again.

The government claims that Apple can retain total control over the software, and even the device itself. Reads the February 19 filing, “the Order permits Apple to take possession of the subject device to load the programs in its own secure location, similar to what Apple has done for years for earlier operating systems, and permit the government to make its passcode attempts via remote access.”

But since Apple is being asked to create a tool for law enforcement to use, that tool would have to stand up to scrutiny if any evidence collected with it is ever used in court. Jonathan Zdziarski’s excellent blog post “Apple, FBI, and the Burden of Forensic Methodology” explains this really well. Zdziarski has extensive experience in iOS forensics, working with law enforcement and testifying as an expert in court.

He explains that tools used by law enforcement to collect evidence are legally known as “instruments,” and for evidence collected by such tools to be admissible in court, the court as well as the defense must have confidence that the tools are accurate and their results reproducible. New instruments—a breathalyzer, a speed-detecting radar gun, or a software tool like this one—have to be tested and validated by a third party such as NIST (the National Institute of Standards and Technology) or the NIJ (National Institute of Justice), and generally accepted by the scientific community. That’s why breathalyzer tests are admissible but polygraphs are not.

Zdziarski also explains how before iOS 8, when Apple could still extract unencrypted data from a locked device, this was seen as a lab service, not an instrument. In that case, Apple would have to demonstrate to the court (usually through expert testimony or an affidavit) that it had the expertise to run the test, but it could claim “trade secrets” to avoid detailing the exact methods. But when it’s law enforcement carrying out the method itself, the standard is different.

Now, just because evidence collected by use of this tool might not be admissible in court doesn’t make that evidence worthless. Law enforcement could learn something about Farook from his iPhone that it could then verify through other means that are admissible.

The February 19 filing lists the other methods the government and Apple discussed, and why they won’t work, in a footnote on page 18, paraphrased here:

But that third method (attempt an auto-backup to iCloud) is where it gets really weird. The iCloud password was reset remotely, shortly after the crime, by the owner, i.e. the county. The February 19 filing says, “that had the effect of eliminating the possibility of an auto-backup.”

As explained by Ars Technica, the way they tried to force a backup was to take the iPhone to a known Wi-Fi network, plug it in, and leave it overnight—which should trigger a backup to iCloud if auto-backups are enabled. But it didn’t work because the password had been reset so recently.

Not a full one. According to the February 19 filing, the FBI has Farook’s iCloud backups through October 19, about six weeks before the December 2 shooting. The filing states that the government found evidence in the iCloud account to indicate “that Farook communicated with victims who were later killed in the shootings.” (You’ll recall he killed his own co-workers.)

The filing also states:

Yes, the February 19 filing says that—they have service records from Verizon that show communications occurred, but those aren’t in the iCloud backup.

The problem with that argument: there’s no way to selectively back up to iCloud—it’s all or nothing. So if communications from July, August, September, and October are not in the October 19 iCloud backup, it would be pretty surprising to find them on the phone. One logical explanation is that Farook deleted them before October 19.

It’s kind of a mess. First, the February 19 filing mentioned that the owner (again, that’s San Bernardino County) reset the password for the Apple ID tied to the iPhone—Farook’s iCloud password, in other words. “The owner…was able to reset the password remotely, but that had the effect of eliminating the possibility of an auto-backup.”

That made it sound as though the FBI thought the county had screwed up, but the next day, February 20, the county’s Twitter account tweeted that the FBI had instructed the county to reset the password.

The FBI released a statement on February 21 to Ars Technica admitting that yes, it had ordered the password reset. But the FBI still maintains that the iCloud backup wouldn’t have everything the investigators would get if they could just get into the phone, which is why the court order was issued in the first place.

Apple has published a set of Legal Process Guidelines (PDF) that outline the process for law enforcement to request assistance from Apple, as well as what information Apple can provide.

They read in part:

However, the government’s February 19 court filing states in a footnote, “Apple has informed another court that it now objects to providing such assistance.”

There’s another case pending in New York, in which an iPhone 5s belonging to a suspected meth dealer is running iOS 7, but Apple still doesn’t want to help.

In a response filed in the New York case, Apple argues that “social awareness of issues relating to privacy and security, and the authority of government to access data is at an all-time high. And public expectations about the obligation of companies like Apple to minimize government access within the bounds of the law have changed dramatically.” So the time is right to reexamine the authority given to the government by the All Writs Act, Apple is arguing.

From that filing, it sounds like Apple just wants out of the iPhone-data-extraction business. The filing explains how, starting with iOS 8, Apple doesn’t have the technical ability to do what it once did, and that iOS 7 devices like this one “are becoming rare as they comprise less than 10 percent of the devices in the U.S.” Apple doesn’t want its engineers spending time doing the extraction or testifying in court about it, even though the company would be able to claim expenses.

After all, as the final reason argues, you can’t claim expenses for damage to the brand. “Forcing Apple to extract data in this case, absent clear legal authority to do so, could threaten the trust between Apple and its customers and substantially tarnish the Apple brand. This reputational harm could have a longer term economic impact beyond the mere cost of performing the single extraction at issue.”

Both this new order and the New York case use the All Writs Act of 1789. In fact, in the case going on in the Eastern District of New York, Apple is arguing that extracting data from a drug dealer’s iPhone 5s running iOS 7 is overly burdensome on manpower and resources, as well as an overly wide application of the All Writs Act. Matthew Panzarino at TechCrunch has a great explanation, and you can also read Apple’s filing questioning the AWA.

According to the February 19 filing in the California case, “The All Writs Act provides in relevant part that ‘all courts established by Act of Congress may issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.’” It’s kind of a catch-all, in other words: “As the Supreme Court explained, ‘the All Writs Act is a residual source of authority to issue writs that are not otherwise covered by statute.’”

The tests are whether the third party “is not so far removed from the underlying controversy that its assistance could not be permissibly compelled,” whether the order “does not place an undue burden” on the third party, and whether the assistance is “necessary to achieve the purposes of the warrant.” In the February 19 filing, the government argues that all three tests are met, and thus Apple should be ordered to comply.

If the February 16 order from Judge Pym stands after Apple’s appeal—the next hearing is scheduled for March 22—the company could appeal it to higher courts, potentially all the way up to the Supreme Court.

This case could prompt legislation in Congress too, according to California Senator Dianne Feinstein speaking on PBS NewsHour. Tim Cook and FBI Director James Comey have both been invited to appear before the bipartisan House Energy and Commerce Committee.

(www.macworld.com)

Susie Ochs
