# The iPhones of terrorists

On December 2, 2015, Syed Farook and his wife, Tashfeen Malik, entered the banquet room of the Inland Regional Center in San Bernardino, California, wearing ski masks and carrying semi-automatic pistols and rifles. They shot and killed 14 people: parents, spouses, and children who will never return to their loved ones.

It is the right, and indeed the duty, of the FBI to investigate this horrendous crime and to collect information on the shooters and any of their connections and actions. Some of this information resides on an iPhone used by Farook, and the FBI is asking Apple to write and digitally sign code that will extract this information. As much as I understand the reasons behind this request, I believe the courts should not grant it, and I signed an amicus brief organized by the EFF supporting Apple’s position (see here for more briefs).

Why? First and foremost, I think that if not reversed, the court’s order granting the FBI’s request creates very perverse incentives. Companies will learn that there is no length to which the government won’t go to force them to break the security of their own products. As a corollary, the more secure they make their products, the harder they will have to work to break them at the government’s request.

Moreover, this particular request is that Apple digitally sign a piece of deliberately insecure code as an authentic software update. Even if it is possible to restrict this code to work only on this particular phone, the end result will seriously undermine the trust users all over the world have in the signatures of Apple and other companies. These digital signatures form the foundation of a trust ecosystem that we have come to rely upon and that makes all our devices and products more secure.

Finally, while I have zero expertise on this matter, I have my doubts about whether the government truly needs Apple’s help. Extracting the information the FBI is looking for is not a matter of breaking the iPhone’s encryption, but rather its tamper resistance (or else no piece of code could help). While the iPhone’s tamper-resistance protections may deter an identity thief with a screwdriver, I find it hard to believe that they are a match for the world’s greatest superpower. Indeed, from a quick search it seems that the iPhone is certified as compliant with FIPS 140-2 Level 1 tamper resistance. This is the lowest level of physical security. If Level 1 is strong enough to resist the best efforts of the U.S. government, then what are Levels 2, 3, and 4 for? Alien technology?

Update: If you want some technical information about how iPhone encryption works, see Matthew Green’s and David Schuetz’s blog posts (written after Apple upgraded its security but before this case). The bottom line is still the same: security relies on the physical tamper resistance of the phone’s microprocessor (and, in models newer than Farook’s iPhone 5C, the “Secure Enclave”), which contains the so-called UID or “unique salt” that is used (together with the user’s passcode) to derive the key encrypting the phone’s storage.
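To make the design concrete, here is a rough sketch of this kind of hardware-entangled key derivation. It is only illustrative: the UID value, the PBKDF2 parameters, and the salt construction are my assumptions, not Apple’s actual scheme. The point it demonstrates is that because the UID never leaves the chip, an attacker who copies the encrypted storage off the phone cannot even begin guessing passcodes without first defeating the hardware’s tamper resistance.

```python
import hashlib
import hmac

# Hypothetical per-device UID, fused into the chip at manufacture and
# (by design) never readable by software. Illustrative value only.
DEVICE_UID = bytes.fromhex("00112233445566778899aabbccddeeff")

def derive_storage_key(passcode: str, iterations: int = 100_000) -> bytes:
    # Entangle the user's passcode with the device UID. Since only the
    # chip itself knows DEVICE_UID, this loop can run only on the device,
    # which is what enables on-device rate limiting of passcode guesses.
    salt = hmac.new(DEVICE_UID, b"storage-key", hashlib.sha256).digest()
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations)

key = derive_storage_key("1234")
print(len(key))  # 32-byte key for encrypting the phone's storage
```

Note how even a weak 4-digit passcode yields a key that is useless to an off-device attacker: without the UID, the search space is the full 256-bit key, not the 10,000 passcodes.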

## 6 thoughts on “The iPhones of terrorists”

1. Anthony says:

I’m not sure your three reasons are convincing.

1. You suggest that “Companies will learn that there is no length the government wouldn’t go to to force them to break the security of their own products.” This is entirely hyperbole, really, since all the government did is precisely what you would expect them to do – and what they ought to do – namely go through the courts and let the independent judiciary decide. On the more general point, though, it hardly seems likely that companies will suddenly make their products insecure on the odd chance that at some point they will be asked to break that security.

Moreover, this seems to contradict argument 3 where you suggest that the security cannot be that hard to break anyway. So which is it? Is the phone so secure that companies would want to reduce the security to make it easier to break into; or so insecure that surely the FBI can get in all by themselves?

2. Do you trust Apple’s signed code? The starting point of this argument is that they are trustworthy in general and hence their signed code is trustworthy. Why, then, would their signed code suddenly become untrustworthy because they were publicly forced to sign security-breaking code? If the concern is that they would collude with the government to secretly sign security-breaking code, then you have no real trust of them already.

3. This is moving into conspiracism. If the FBI don’t need Apple’s help then why are they publicly going through the courts to get it? And besides, why is this a valid reason to object? The argument is “we shouldn’t make them help us do this because they can do it all by themselves already”? It seems a non sequitur.

1. 1. It’s not an “odd chance” – it is virtually certain that a very popular product that stores private information will be used by some bad people, and that the government will want access to it. Security is not a binary attribute. Apple made security improvements to the iPhone that mean extracting this information now requires it to work very hard and to compromise the integrity of its signing key. Other companies might think twice about making similar improvements, or may decide to put a plan in place (i.e., a backdoor) that enables them to easily and quietly extract information.

2. Trust is also not a binary attribute – different users trust products to different degrees. If people delay installing software updates because they don’t trust the vendor, there would be more unpatched and insecure devices out there harming all of us. Also, it’s hard to predict how governments around the world will react to this. I’d much rather have the precedent that one cannot force a company to sign code that it doesn’t stand behind.

3. Again, security is not binary. As far as we know, encryption can be truly impossible to break (in the sense of requiring $2^{128}$ or more operations). But to my (non-expert) knowledge, breaking tamper resistance is not a matter of being either possible or impossible, but a matter of cost. Perhaps it would be expensive for the FBI to do this, or it requires technology that only exists in other branches of government, or perhaps the FBI simply wanted to set a precedent to make such requests easier in the future. But then again I could be wrong, and maybe a commercial-grade iPhone has tamper resistance beyond the capabilities of the U.S. government – as I said, I am not an expert.
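The cost asymmetry above can be made concrete with some back-of-the-envelope arithmetic. The per-guess delay below is an assumption chosen for illustration (a hardware-enforced key derivation taking on the order of tens of milliseconds), not a measured figure for any particular phone:

```python
# Why the passcode space, not the encryption, is the weak point once
# tamper resistance falls. Assumed figure (illustrative, not measured):
SECONDS_PER_GUESS = 0.08  # hardware-enforced cost of one key derivation

def worst_case_hours(passcode_digits: int) -> float:
    # Exhausting every numeric passcode of the given length, on-device.
    return 10 ** passcode_digits * SECONDS_PER_GUESS / 3600

print(f"4-digit passcode: {worst_case_hours(4):.2f} hours")
print(f"6-digit passcode: {worst_case_hours(6):.1f} hours")
# A random 128-bit key, by contrast, needs ~2**128 guesses at any rate:
# infeasible no matter how fast the attacker's hardware is.
```

Under these assumed numbers, a 4-digit passcode falls in well under an hour and a 6-digit one in about a day – which is why everything hinges on whether the attacker can bypass the chip’s guess counting and rate limiting, i.e., on tamper resistance rather than cryptography.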

2. 💡 thx for taking a controversial stand on this controversial issue. have two recent blogs on this, the latest congress/ FBI/ apple/ silicon valley showdown! recent developments, summary, overview, topnotch links. apple has said they will argue this all the way to the supreme court. reminds me of the Clipper chip controversy from over 2 decades ago. but it seems like everything turned upside down in this country wrt cybersecurity after 9/11. here’s hoping we still have rights. strongly agreed with apple that encryption is a freedom of (digital) speech issue.

3. Boaz,

While agreeing with your thesis and your argument, there are two nitpicks that I’d like to bring up:

1. The phone in question did not belong to Syed Farook but rather to his employer, the San Bernardino County Public Health Department. As far as I understand, it makes no difference whatsoever to the legal argument (the dead person under investigation would have no right to privacy anyway).

2. FIPS 140-2 level 1 tamper-resistance is no tamper-resistance. (Similarly to GSM A5/0 encryption level which is no encryption.) Level 2 requires tamper-evidence, and meaningful tamper-resistance begins at Level 3. Basically, certifying a piece of hardware for FIPS 140-2 is a business decision mostly having to do with selling it to the US government that tells us little about its actual security features.

1. Thanks Ilya. I was aware of 1 but thought it was a superfluous detail; I was hoping it’s OK to say the phone “belonged” to him, since I didn’t say he “owned” it, but perhaps I should have been clearer.

Re 2, I am not an expert on tamper resistance and any light you can shed on this would be very useful. My intuitive sense is that, unlike encryption, in tamper resistance we don’t know of a way to ensure that the cost to the attacker is exponential in the cost for the honest parties to build the device. So, I am surprised that a commercial device (which wasn’t designed to protect state secrets) would be immune to government efforts.