iPhone Security

February 29, 2016

(This post has an update from August 4, 2016 at the bottom.)

Background

On December 2, 2015, Syed Rizwan Farook and his wife killed 14 and seriously injured 22 in a terrorist attack. (While we're on the subject of privacy, supposedly a reporter paid $1000 to get into their apartment.)

In its investigation, the FBI lawfully seized Farook's phone and reset its iCloud password on December 6. They were able to recover an iCloud backup of the phone from October 19. (Apple says this move "closed off the possibility of recovering information from it through ... automatic cloud backup".) But the FBI announced on February 9, 2016 that it was unable to get any information after the October 19 backup. The DOJ applied for an order to compel Apple to help them get access to the data, believing that "there may be relevant, critical communications and data around the time of the shooting that has thus far not been accessed" (page 23). (The DOJ applied under the All Writs Act of 1789, and there is plenty of discussion elsewhere about that.)

Drama

Apple CEO Tim Cook published a very public letter saying that Apple would not comply with the order.

In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

FBI Director James Comey replies:

The particular legal issue is actually quite narrow. The relief we seek is limited and its value increasingly obsolete because the technology continues to evolve. We simply want the chance, with a search warrant, to try to guess the terrorist's passcode without the phone essentially self-destructing and without it taking a decade to guess correctly. That's it. We don't want to break anyone's encryption or set a master key loose on the land.

...

I also hope all Americans will participate in the long conversation we must have about how to both embrace the technology we love and get the safety we need.

Obviously, there is a lot more drama/media frenzy over this, but let's get to the boring stuff instead. Two questions arise:
1. If Apple can decrypt the contents of my iPhone, is it secure?
2. Would creating the OS/firmware/backdoor compromise the privacy of other iPhones?

Security

Farook's phone is an iPhone 5c running iOS 9. It is locked with a passcode of unknown strength: 4 digits, 6 digits, or an arbitrary-length alphanumeric string. iOS 9 enforces escalating delays between failed passcode attempts. It is also possible that Farook enabled a feature where the phone wipes its encryption keys after 10 failed attempts. Brute-forcing the passcode by hand is out.
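The mechanics are simple enough to sketch. Here is a minimal illustration in Python of the escalating-delay-plus-wipe policy (the delay values and function names are my own stand-ins, not Apple's published schedule):

```python
# Illustrative sketch of the lockout policy: escalating delays between
# failed attempts, plus an optional wipe after 10 failures. The delay
# schedule here is my own stand-in, not Apple's exact values.

WIPE_AFTER_FAILURES = 10  # the optional "Erase Data" setting

def delay_seconds(failures: int) -> int:
    """Delay enforced before the next passcode attempt is allowed."""
    schedule = [0, 0, 0, 0, 60, 300, 900, 900, 3600]  # grows with failures
    return schedule[failures] if failures < len(schedule) else 3600

def on_failed_attempt(failures: int, erase_data_enabled: bool) -> str:
    failures += 1
    if erase_data_enabled and failures >= WIPE_AFTER_FAILURES:
        return "wipe the encryption keys (data is now unrecoverable)"
    return f"lock out attempts for {delay_seconds(failures)} seconds"
```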

Each phone has a unique ID (UID). The UID and passcode together are required to decrypt the device. According to the iOS security whitepaper,

The device’s unique ID (UID) and a device group ID (GID) are AES 256-bit keys fused (UID) or compiled (GID) into the application processor and Secure Enclave during manufacturing. No software or firmware can read them directly; they can see only the results of encryption or decryption operations performed by dedicated AES engines implemented in silicon using the UID or GID as a key. ... The UIDs are unique to each device and are not recorded by Apple or any of its suppliers.

More details can be found here (particularly interesting: "Has Apple ever provided brute-force passcode breaking services to law enforcement? Has iOS been modified to restrict or eliminate this attack? ... If asked, would they refuse?").

This means one cannot simply remove the flash storage and brute-force the passcode with specialized hardware – you would need to brute-force the 256-bit UID.
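To see why, here is a minimal sketch of the "entanglement" idea. This is my own illustration, not Apple's published key-derivation algorithm; the stub class stands in for the dedicated AES engine in silicon, whose key (the UID) no software can read:

```python
# Illustrative sketch, NOT Apple's actual key derivation. The point:
# if every round passes the key through an AES engine keyed with the
# device's UID, and the UID never leaves the silicon, the derivation
# can only run on the device itself -- dumped flash alone is useless.
import os
from hashlib import pbkdf2_hmac
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

class HardwareAESEngine:
    """Stand-in for the silicon AES engine: software may request
    encryptions but can never read the UID key itself."""
    def __init__(self):
        self._uid = os.urandom(32)  # 256-bit UID, fused at manufacture

    def encrypt(self, block: bytes) -> bytes:
        enc = Cipher(algorithms.AES(self._uid), modes.ECB()).encryptor()
        return enc.update(block) + enc.finalize()

def derive_key(passcode: str, salt: bytes, engine: HardwareAESEngine,
               iterations: int = 10_000) -> bytes:
    # Start from the passcode, then repeatedly entangle with the UID.
    # The iteration count is what gets calibrated to ~80 ms per attempt.
    key = pbkdf2_hmac("sha256", passcode.encode(), salt, 1, dklen=16)
    for _ in range(iterations):
        key = engine.encrypt(key)
    return key

engine = HardwareAESEngine()
wrapped = derive_key("123456", os.urandom(16), engine)
```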

If Apple provides an OS/firmware/backdoor that bypasses the escalating failure delays and the 10-failure wipe, there are still other issues. The key used to encrypt the data is generated by a key-derivation function with an iteration count "calibrated so that one attempt takes approximately 80 milliseconds. This means it would take more than 5½ years to try all combinations of a six-character alphanumeric passcode with lowercase letters and numbers" (whitepaper page 12). Of course, if the passcode is 6 digits, the same math gives about 22 hours, and if it's 4 digits, about 13 minutes.
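Those figures are easy to reproduce from the 80 ms number alone; a quick back-of-the-envelope script:

```python
# Worst-case exhaustive search time at ~80 ms per attempt (the
# whitepaper's KDF calibration figure).
ATTEMPT_SECONDS = 0.08

for label, alphabet, length in [
    ("4-digit PIN", 10, 4),
    ("6-digit PIN", 10, 6),
    ("6-char lowercase alphanumeric", 36, 6),
]:
    total = alphabet ** length * ATTEMPT_SECONDS
    print(f"{label}: {total / 60:.1f} min = {total / 3600:.1f} h "
          f"= {total / (3600 * 24 * 365):.2f} years")

# 4-digit PIN: 13.3 min = 0.2 h = 0.00 years
# 6-digit PIN: 1333.3 min = 22.2 h = 0.00 years
# 6-char lowercase alphanumeric: 2902376.4 min = 48372.9 h = 5.52 years
```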

The FBI can't write such software on its own because iPhones will only boot to a kernel signed by the Apple Root CA (whitepaper page 5).
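Conceptually, the boot chain is just signature verification at each stage, with the verifying public key immutable in hardware. A toy sketch of the idea (Ed25519 here purely for brevity; Apple's real chain uses certificates rooted in the Apple Root CA):

```python
# Toy sketch of a secure-boot check: refuse to run any image whose
# signature doesn't verify against the public key baked into Boot ROM.
# Ed25519 is used for brevity; this is not Apple's actual scheme.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Apple's side: sign a kernel image with the closely held private key.
signing_key = Ed25519PrivateKey.generate()
kernel_image = b"...kernel bytes..."
signature = signing_key.sign(kernel_image)

# Device's side: only the public half is on the phone, and it's immutable.
boot_rom_key = signing_key.public_key()

def boot(image: bytes, sig: bytes) -> None:
    try:
        boot_rom_key.verify(sig, image)  # raises on any mismatch
    except InvalidSignature:
        raise SystemExit("refusing to boot: bad signature")
    print("booting verified image")

boot(kernel_image, signature)  # without the private key, the FBI
                               # cannot produce a `signature` that passes
```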

Can the UID be physically read off the silicon? Maybe, but the attempt risks destroying the chip, it's costly, and it assumes you already know where on the chip the UID is stored.

So your iPhone is secure if you believe that
1. the adversary does not have the resources to decap your phone and extract the UID,
2. the Apple code-signing private key is secure, or you have a strong alphanumeric passcode,
3. you cannot be compelled to unlock your own phone, or can remotely wipe it in time, and
4. your data cannot be retrieved from an iTunes or iCloud backup.

EDIT: LOL WHOOPS. SEE THE UPDATE AT THE BOTTOM.

Privacy

Apple's argument for refusing the order is twofold:
1. "it would be wrong to intentionally weaken our products with a government-ordered backdoor"
2. "the order would set a legal precedent that would expand the powers of the government"

Let's examine #1 first. On the letter's FAQ page, Apple writes:

In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks. Of course, Apple would do our best to protect that key, but in a world where all of our data is under constant threat, it would be relentlessly attacked by hackers and cybercriminals. As recent attacks on the IRS systems and countless other data breaches have shown, no one is immune to cyberattacks.

Apple is describing an OS, free of the restrictions covered above, that would work on any iOS device. The DOJ/FBI's application, however, says on page 10:

Importantly, the SIF ["Software Image File"] would be created with a unique identifier of the SUBJECT DEVICE so that the SIF would only load and execute on the SUBJECT DEVICE.

And this seems plausible. I see no reason why this theoretical OS can't be made to boot only on a specific device. It would be signed, like every other iOS release, under the Apple Root CA, whose private key theoretically only Apple holds.
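The mechanism would presumably resemble how Apple already personalizes restore images to a device's unique identifier: put the device ID under the signature, so the blob verifies only on that one device. A toy sketch (the field layout and the Ed25519 choice are mine, not Apple's format):

```python
# Toy sketch: sign (image || device_id) rather than the image alone, so
# the signed blob only verifies -- and hence only boots -- on one device.
# Illustrative only; Apple's actual personalization format differs.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

signing_key = Ed25519PrivateKey.generate()
public_key = signing_key.public_key()

def sign_for_device(image: bytes, device_id: bytes) -> bytes:
    return signing_key.sign(image + device_id)

def device_accepts(image: bytes, sig: bytes, my_device_id: bytes) -> bool:
    try:
        # Each device verifies against ITS OWN id, so the same blob
        # presented to any other device fails the check.
        public_key.verify(sig, image + my_device_id)
        return True
    except InvalidSignature:
        return False

sif = b"...iOS image without passcode limits..."
sig = sign_for_device(sif, b"SUBJECT-DEVICE-ID")
assert device_accepts(sif, sig, b"SUBJECT-DEVICE-ID")
assert not device_accepts(sif, sig, b"SOME-OTHER-DEVICE-ID")
```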

Speaking of the code-signing key, isn't that also a "master key, capable of opening hundreds of millions of locks"? If we are "in a world where all of our data is under constant threat" and "no one is immune to cyberattacks", how can we predicate the security of iPhones on the security of the code-signing key?

#2 is much more involved, but the short version is that I'm in agreement. So is Magistrate Judge Orenstein: in a ruling in a separate case in New York on February 29, 2016, he denied a government request (also under the All Writs Act of 1789) to unlock an iPhone (this will almost certainly be appealed by the government). In the opinion, he writes:

as the Court explained in N.Y. Tel. Co., the AWA [All Writs Act] does not empower a court to impose any burdens that are "unreasonable" – and it said nothing to suggest that only financial burdens could prove unreasonable.

... Nothing in the government's arguments suggests any principled limit on how far a court may go in requiring a person or company to violate the most deeply-rooted values to provide assistance to the government the court deems necessary.

To try to gauge that limit – and to see if one even exists – I deliberately asked the government at oral argument ... about whether a court could invoke the AWA to force a drug maker to supply lethal injection drugs notwithstanding the manufacturer's conscientious objection to capital punishment. ... [I]n its post-hearing submission, the government offers nothing more than deflection: "Resolution of the death penalty hypothetical would depend on the particular law, facts, and circumstances if such a case were to present itself." That is undoubtedly true, but ... unsatisfactory. If the government cannot explain why the authority it seeks here cannot be used, based on the same arguments before this court, to force private citizens to commit what they believe to be the moral equivalent of murder at the government's behest, that in itself suggests a reason to conclude that the government cannot establish a lack of unreasonable burden.

And later:

It would betray our constitutional heritage and our people's claim to democratic governance for a judge to pretend that our Founders already had that debate, and ended it, in 1789.

Ultimately, the question to be answered in this matter, and in others like it across the country, is not whether the government should be able to force Apple to help it unlock a specific device; it is instead whether the All Writs Act resolves that issue and many others like it yet to come. For the reasons set forth above, I conclude that it does not.

In a footnote, he adds:

Indeed, as FBI Director Comey observed in the context of the California action:

[W]e have awesome new technology that creates a serious tension between two values we all treasure: privacy and safety. That tension should not be resolved by corporations …. It also should not be resolved by the FBI …. It should be resolved by the American people deciding how we want to govern ourselves in a world we have never seen before…. I also hope all Americans will participate in the long conversation we must have about how to both embrace the technology we love and get the safety we need.

[...] That "long conversation" among "people deciding how we want to govern ourselves in a world we have never seen before" will of course be moot if courts presume to make that decision for the American people based on a perceived assignment of extraordinary authority by 18th Century legislators. Director Comey's salutary call for meaningful public debate can therefore be achieved only by recognizing that the All Writs Act does not serve as a mechanism for courts to give the executive branch authority it fails to secure from the legislature.

(Quoted ellipsis his, bracketed ellipsis mine.)

Conclusions

Is your iPhone secure? Meh. (But what security it does have is really impressive.) [EDIT: WHOOPS. SEE UPDATE.]

Is Apple being asked to irreversibly weaken the security of its customers? Not meaningfully.

Should I be able to assume that neither I nor my phone's manufacturer can be compelled to decrypt my data? Yes, that would be nice.

Would such orders be a reasonable interpretation of the All Writs Act of 1789? IANAL, but in my echo chamber the answer is a resounding "no".

Update (August 4, 2016)

Hah, just kidding! Some undisclosed third party helped the FBI get into the phone. FBI Director Comey hinted that they paid somewhere north of $1 million.

It's unclear whether the FBI purchased a one-time use of an exploit or the technical details of the exploit itself. From the first article:

If the government shares data on the flaws with Apple, "they’re going to fix it and then we’re back where we started from," Comey said ... "we’re considering whether to make that disclosure or not."

But a month later:

But the bureau told the White House last month that its understanding of how a third party hacked the phone was so limited that there was no point in undertaking a government review.

Comey said Wednesday that the bureau purchased only the tool, not the rights to the software flaw.