You have most likely heard about the ongoing argument between Apple and the FBI, as well as the surrounding debate. Many people have explained why they support one position or the other, so there is little value in me adding yet another opinion piece explaining my rationale. Instead, I would like to shed some light on who has what information, what they can or cannot do to access it, and how the technology protecting it works.
It all started with the FBI asking Apple for help decrypting an iPhone in their possession, and Apple publicly denying that request. According to the FBI, they cannot unlock the iPhone because it only allows ten attempts before all storage is irreversibly erased.
Apple’s reasoning for not helping the FBI unlock the iPhone is simple. According to them, the only way of bypassing the attempt limit is deploying a new firmware on the device (it’s currently running iOS 9). Should they write such arguably dangerous software, they continue, they won’t be able to get rid of it afterwards, because one of two things will happen: Either the FBI will ask Apple to hand over the firmware, thereby granting them access to every iPhone in the world, or Apple will maintain custody of it, with the FBI asking them to deploy it to an iPhone every once in a while. Tim Cook’s opinion on that matter is that such software would be the digital equivalent of cancer, and its custodian would be subjected to increased exposure to hacking attempts by criminals, foreign governments, and other dubious parties.
This exchange poses multiple questions:
- Is installing a new firmware the only solution to unlocking the device?
- If so, can only Apple develop and/or deploy it?
Let’s start with the second question. Every iPhone’s hardware has a hard-wired public key: that of Apple’s root certificate authority, to which only Apple holds the private key. The hardware therefore only allows firmware installations that have been signed with Apple’s private key, which means that Apple is, in fact, the only entity able to install operating systems on iOS devices.
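The asymmetry described above can be sketched with textbook RSA. This is a toy model with tiny, insecure parameters, purely to illustrate the principle: the hardware only needs the public half (e, n) to verify a firmware image, while producing a valid signature requires the private exponent d, which never leaves the vendor. Apple’s actual scheme (key sizes, padding, the certificate chain) is not reproduced here.

```python
import hashlib

# Toy textbook-RSA parameters (illustration only; real firmware signing
# uses large keys and padded signatures, e.g. RSA-2048 with PKCS#1).
p, q = 61, 53
n = p * q    # public modulus, burned into hardware in this model
e = 17       # public exponent, also public
d = 2753     # private exponent, held only by the vendor

def digest(data: bytes) -> int:
    # Reduce a SHA-256 hash into the toy modulus range
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % n

def sign(firmware: bytes) -> int:
    # Only the holder of d can produce this value
    return pow(digest(firmware), d, n)

def boot_accepts(firmware: bytes, signature: int) -> bool:
    # The hardware knows only (e, n) and checks the signature against them
    return pow(signature, e, n) == digest(firmware)
```

With these definitions, `boot_accepts(fw, sign(fw))` holds for any firmware image, while any altered signature is rejected, which is the property that locks third parties out of installing their own firmware.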
According to Tim Cook, Apple would be unable to adequately protect the attempt-limit-disabled firmware, yet they are perfectly fine with storing their certificate’s private key. That means either hackers should theoretically be able to obtain that key already, or Apple would be able to maintain custody of that custom operating system after all.
Regardless, both parties seem to agree that Apple installing a new firmware is the only way of getting into the phone. An important distinction must be made here, though: unlocking a phone is not the same as getting access to the data it contains. Unlocking it implies taking the route through the operating system, and subsequently using the phone’s own interface to browse its data. If, on the other hand, only the data has to be obtained, it can be copied over to a different device and then explored from a regular computer.
In this case, the problem with the second approach is that the phone is encrypted. People who are familiar with Apple devices could think of other approaches to simply unlock the phone. For example, if the phone had Touch ID, one would assume that taking the owner’s fingerprint, molding it using conductive rubber, and pressing that on the home button would do the trick. However, there are two problems with this approach:
- Touch ID access is disabled after not using it for a certain period of time, and after a certain number of failed unlock attempts.
- The phone in question is an iPhone 5c, which doesn’t have Touch ID.
Thus, on the assumption of Apple’s continued non-cooperation, breaking the phone’s encryption would seem to be the only way onward. In order for any encryption to work, the key must never be stored on the same medium as the encrypted data. Specifically, the key is derived on-the-fly every time the user enters their passcode to unlock the phone, and then used by the operating system to decrypt the file system. Considering that the phone has a four-digit-passcode, there are ten thousand possible keys. A modern computer should be able to run through them within no more than a few hours.
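The on-the-fly derivation and the resulting off-device brute force can be sketched as follows. The salt and iteration count here are illustrative assumptions, not iOS’s real parameters; a real derivation would use a far higher cost factor, which is exactly why the guessing rate matters. The point the sketch makes is the article’s: with only four digits, enumerating every candidate is trivial.

```python
import hashlib

def derive_key(passcode: str, salt: bytes) -> bytes:
    # Hypothetical parameters: PBKDF2 with a low iteration count so the
    # example runs quickly; real derivations are tuned to be much slower.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 1_000)

def brute_force(target_key: bytes, salt: bytes):
    # A four-digit passcode admits only 10,000 candidates
    for i in range(10_000):
        candidate = f"{i:04d}"
        if derive_key(candidate, salt) == target_key:
            return candidate
    return None
```

Given a copy of the encrypted data and the derived key’s verification material, this loop finishes in seconds on a laptop at these toy parameters; slower, properly tuned derivation functions stretch that to hours, which matches the estimate above.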
This has a multitude of scary implications. First, the FBI is lying about being reliant on Apple’s help to get the phone’s data. Not only do they not need it, but most likely, another three-letter-agency, the NSA, has already cracked the same problem, albeit not for the phone in question.
Second, Apple is lying about being unable to maintain custody of the non-rate-limited operating system. If they are able to store the private key of their root certificate authority securely, storing the software should be just as easy (or just as hard, but still possible).
Third, the only thing stopping Apple from getting into anybody’s iPhone is the moral compass of their CEO. Both Apple and the government have ways of accessing our most private data, despite both claiming otherwise. This is inherently wrong, because the security of our data must not hinge on the phone manufacturing company’s CEO being benevolent. Our privacy remains an inalienable right only as long as it is manifested in cryptography.
Fortunately, Apple’s CEO is benevolent, and Apple is looking into ways of remedying these security holes. For starters, all iPhones since the iPhone 5s, that is, all iPhones with fingerprint scanners, have had a hardware-separated “Secure Enclave.” That is a coprocessor that handles all cryptographic operations, whose memory is encrypted, and with which the Touch ID sensor communicates over a write-only bus. It is responsible for key storage, it processes every unlock attempt, and after too many failed attempts it erases its memory. Its symmetric key is determined at manufacturing time based on its separate UID, and neither the key nor the Secure Enclave’s UID are known to Apple.
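The erase-on-failure behavior can be modeled in a few lines. This is a conceptual sketch, not Apple’s implementation: the class name is invented, and returning the UID-derived key directly stands in for the real process of unwrapping the file-system key.

```python
import secrets

class SecureEnclaveModel:
    """Conceptual model of the attempt limiter (not Apple's implementation)."""
    MAX_ATTEMPTS = 10

    def __init__(self, passcode: str):
        # Stand-in for the UID-derived key fused in at manufacture;
        # it never leaves the enclave in the real design.
        self._uid_key = secrets.token_bytes(32)
        self._passcode = passcode
        self._failures = 0
        self._erased = False

    def try_unlock(self, guess: str):
        if self._erased:
            raise RuntimeError("key material erased")
        if guess == self._passcode:
            self._failures = 0
            return self._uid_key  # stand-in for the unwrapped decryption key
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self._uid_key = None  # irreversible: the key is gone
            self._erased = True
        return None
```

Because the counter and the key live in the same isolated coprocessor, resetting the counter from the outside, without the enclave’s cooperation, is not supposed to be possible.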
And yet, the secure enclave does support software updates, separate from the operating system upgrades though they may be. Which means that with physical access to the device, Apple can still update it to eliminate the attempt limiter and thus allow itself to unlock the device.
To rob itself of this ability, according to rumors, the upcoming version of iOS (9.3 at the time of this writing) is supposed to disallow firmware upgrades on locked phones. That measure should eliminate the unlocking route once and for all.
Even so, the device can still very easily be decrypted externally, especially when only using a four-digit-passcode. A six-digit-passcode, which extends the set of possible keys to a cardinality of one million, can still be brute-forced within about a day. So how can we properly secure our phones?
The answer, which is based on the assumption that the phone has Touch ID, is quite simple. Instead of using digit-only passcodes, we should be using alphanumeric passwords on our phones. Considering that they rarely need to be entered due to the availability of Touch ID, the convenience of using the phone is hardly diminished, if at all. The security, on the other hand, goes up exponentially, and the math is very satisfying.
Disregarding the use of special characters, each character can be a lowercase or uppercase letter, or a digit. That is 62 possible values. A ten-letter-password thus has 62¹⁰ ≈ 8.4 × 10¹⁷ possible combinations, which, even with all the tools of a sophisticated government agency at one’s disposal, would take decades to crack (provided there are no usable heuristics).
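The keyspace arithmetic can be checked directly. The guessing rate below is an assumption chosen only for illustration, roughly matching a deliberately slow key-derivation function; the keyspace sizes themselves are exact.

```python
# Keyspace sizes for the passcode schemes discussed in the text
four_digit = 10 ** 4
six_digit = 10 ** 6
alnum_ten = 62 ** 10   # 26 lowercase + 26 uppercase + 10 digits, ten characters

# Assumed rate of 10 guesses/second against a slow derivation function;
# this figure is hypothetical, not measured.
rate = 10
year = 60 * 60 * 24 * 365

print(f"4 digits:  {four_digit / rate / 60:.0f} minutes")    # 17 minutes
print(f"6 digits:  {six_digit / rate / 3600:.0f} hours")     # 28 hours
print(f"10 alnum:  {alnum_ten / rate / year:.1e} years")     # 2.7e+09 years
```

The jump from days to billions of years is the exponential growth the text refers to: each added character multiplies the keyspace by 62, whereas each added digit multiplies it only by 10.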
In other words, stay safe!
PS: Regarding the title. “Dormant cyber pathogens” are a myth that should never be used outside the context of CSI: Cyber.