Monday, February 22, 2016

Solving the Apple/FBI iPhone Privacy Dilemma

The Solution (in Brief)


Apple is right: they can't build security products with backdoors that let the FBI gain access whenever it wants, as that would render the security of a device like an iPhone useless. The FBI is also right, in that it can have, at times, a legitimate need to examine the data of certain accused and convicted persons, to aid in securing our nation and protecting its citizens.

Both parties being right, compromise is required. The short answer is for Apple to record and securely store the passcodes iPhone owners create, and to require the FBI to obtain a search warrant for any passcode it wants. The rest of this post describes the rationale behind such a compromise, and why it can work, in more detail.

Disclaimer


I'm a computer scientist, not a lawyer. Enough said.

The Problem


Recently, the FBI and Apple have clashed over security and privacy. The FBI, in search of data related to the mass shooting in San Bernardino, CA (https://en.wikipedia.org/wiki/2015_San_Bernardino_attack), is understandably interested in gaining access to the content on the iPhone of one of the deceased suspects, Syed Rizwan Farook. Apple, for its part, prides itself on making encryption a part of the iPhone operating system, iOS. Users enable this encryption by securing their phone with a passcode. This article (https://support.apple.com/en-us/HT202064) describes how to set one up, and if you haven't secured your iPad or iPhone with a passcode, you really should do it now.

The FBI is, as I mentioned above, understandably justified in wanting to see what is on that phone. The FBI is only doing its job, trying to learn all it can about the suspect and his communications; the need to follow every lead in an investigation like this is not new to the digital age. It reminds me of the efforts of Alan Turing and others that allowed Allied forces to peer inside the Nazi Enigma traffic in World War II (https://en.wikipedia.org/wiki/Cryptanalysis_of_the_Enigma), and the related efforts of the Magic project to crack the Purple cipher (a separate Japanese machine, often compared to Enigma) used by the Japanese to hide their communications while at war with the United States and its allies (https://en.wikipedia.org/wiki/Magic_(cryptography)#PURPLE_traffic). Historians hold that the ability to decrypt Japanese messages was critical to defeating Japan in World War II. Without advance intelligence betraying the plans of the Japanese forces, we could well have lost at Midway, where the US Navy was outnumbered in ships after the bombing of Pearl Harbor severely crippled the fleet.

Apple, for their part, is equally "in the right". People's lives are essentially stored on their phones. Online banking, e-mail communication, personal notes and photos all live on these devices. In the wrong hands, the compromise of this data could lead to identity theft, extortion, and a host of other problems. Customers rightly demand that a device as vulnerable as a mobile phone, carrying such sensitive data, be protected from prying eyes should it ever be lost or stolen.

The protection afforded an iPhone comes from technology that, at its core, is really no different from the technology used by the Germans and Japanese in World War II. But times have changed. The encryption of WWII was quite weak, and teams like those that broke Enigma and Purple stood a chance of decoding messages precisely because of its primitive nature. Mind you, bright people had to be involved in cracking these codes, but it was very possible. That is no longer the case, much to the dismay of the FBI.

Encryption in Simple Terms


Let's define the basic task of encryption. Encryption is an inherently simple concept: take a message, or image, or text file, or anything else stored on a computer, and rewrite it in a way that makes it infeasible, without possessing a secret "key", to convert it back to its original, unencrypted form.

Assume we have a key, unique to each iPhone and/or user, and a message to encrypt (for example, the words "My boyfriend is out of town, let's meet tonight for a rendezvous"). To encrypt the message, the key, and the message, are given as inputs to special software (or hardware) which is designed to perform the task of encryption. The software takes the key, and the message, and combines them in a way which transforms the original message into seemingly unrelated text, such as "sdfds88kjcmnvkasdd34x". If a thief (or the FBI) were to get ahold of my phone, and look inside, they would need to decrypt this nonsensical content in order to get the original content back. But they can't do that without the key, and this key is protected by a passcode. This is what makes the iPhone essentially secure, and what makes the need to use a passcode vital.
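The idea above can be sketched in a few lines of Python. This is a deliberately toy construction (a keystream derived from SHA-256, XORed with the message), not the real AES used on an iPhone, and the key and message are made up for illustration; it only shows the shape of the process: same key in, gibberish out, same key again to recover the original.

```python
import hashlib
from itertools import count

def keystream(key: bytes):
    """Derive an endless stream of pseudo-random bytes from the key (toy construction)."""
    for i in count():
        yield from hashlib.sha256(key + i.to_bytes(8, "big")).digest()

def xor_crypt(key: bytes, data: bytes) -> bytes:
    """XOR the data with the keystream; applying it twice with the same key decrypts."""
    return bytes(b ^ k for b, k in zip(data, keystream(key)))

key = b"per-device secret key"
message = b"My boyfriend is out of town, let's meet tonight"

ciphertext = xor_crypt(key, message)          # unreadable without the key
assert xor_crypt(key, ciphertext) == message  # the same key recovers the message
```

Without `key`, the thief (or the FBI) holds only `ciphertext`, and the only ways in are to guess the key or to guess the passcode that protects it.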

But couldn't the FBI just work to determine the key, just like the Allied forces did in World War II to crack the German and Japanese codes? Assume for the moment that the FBI couldn't guess your passcode (we'll talk on that later), and only had access to the encrypted content (if they could guess your passcode, then they could just read the phone like you do; everything would be available). In order to decrypt what is on the phone, they would need to defeat the underlying encryption on the phone, which is 256-bit AES. The key size, 256 bits, means there are 2^256 possible keys, so brute-force guessing a key would take a supercomputer of today's capabilities vastly longer than the age of the universe (itself only about 13.8 billion years).

Inherently, then, the brute-force guessing of a 256-bit AES key is computationally infeasible and thus impractical; no one is going to live long enough for a successful effort to even matter (six months from now the information on that iPhone is probably not going to be useful to the FBI, let alone billions of years from now).
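The arithmetic is easy to check. Assuming (very generously) a machine that tests a billion billion keys per second, an attacker still expects to search half the key space before succeeding:

```python
keys = 2 ** 256                       # size of the 256-bit AES key space
guesses_per_second = 10 ** 18         # a generously assumed exascale attacker
seconds_per_year = 60 * 60 * 24 * 365

# Expected work: half the key space on average.
years = keys / (2 * guesses_per_second * seconds_per_year)
print(f"{years:.2e} years")           # on the order of 10^51 years
```

Even under that wildly optimistic guessing rate, the answer is a number with fifty-odd digits of years, which is why no one attacks the key directly.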

Cracking iPhone Passcodes


Of course, the passcode is just as important. If I were a thief, I'd certainly try some easy-to-guess passcodes, like 0000, 1234, or 4321, with the hope that this is enough to let me break into a phone in my possession. Sadly, people aren't all that good at choosing passwords or passcodes, and so a great many devices and accounts can be compromised with such a simple attack. But how long would it take our thief to guess a passcode that the user chose carefully, using a brute-force method like trying 0000, then 0001, then 0002, then 0003, and so on until the right passcode was guessed?

The following web article (https://theintercept.com/2016/02/18/passcodes-that-can-defeat-fbi-ios-backdoor/) suggests that it all depends on how many digits your passcode is composed of, as well as built-in features of the iPhone, related to entering your passcode, that are intended to thwart anyone trying to break into your phone. To quote the author:
  • seven-digit passcodes will take up to 9.2 days, and on average 4.6 days, to crack
  • eight-digit passcodes will take up to three months, and on average 46 days, to crack
  • nine-digit passcodes will take up to 2.5 years, and on average 1.2 years, to crack
  • 10-digit passcodes will take up to 25 years, and on average 12.6 years, to crack
  • 11-digit passcodes will take up to 253 years, and on average 127 years, to crack
  • 12-digit passcodes will take up to 2,536 years, and on average 1,268 years, to crack
  • 13-digit passcodes will take up to 25,367 years, and on average 12,683 years, to crack

The author suggests 11-digit passcodes are the best, though I think most people will not find that number of digits practical (who would want to enter 11 digits to gain access to their phone?).
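The figures quoted above follow from simple arithmetic. Per the article, the iPhone's hardware takes roughly 80 milliseconds per guess; an n-digit passcode gives 10^n possibilities, and the average attack succeeds halfway through:

```python
SECONDS_PER_GUESS = 0.08  # ~80 ms per hardware-assisted guess, per the article

def crack_time_days(digits: int, fraction: float = 1.0) -> float:
    """Days needed to try `fraction` of all `digits`-digit passcodes."""
    guesses = (10 ** digits) * fraction
    return guesses * SECONDS_PER_GUESS / 86400  # 86400 seconds in a day

print(f"7 digits, worst case: {crack_time_days(7):.1f} days")      # ~9.3 days
print(f"7 digits, average:    {crack_time_days(7, 0.5):.1f} days") # ~4.6 days
print(f"8 digits, average:    {crack_time_days(8, 0.5):.1f} days") # ~46 days
```

Each extra digit multiplies the work by ten, which is why the list above climbs from days to millennia so quickly.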

The design of the iPhone helps somewhat to make short, usable (but carefully chosen) passcodes secure. To quote the author again:

"One obstacle to testing all possible passcodes is that the iPhone intentionally slows down after you guess wrong a few times. An attacker can try four incorrect passcodes before she’s forced to wait one minute. If she continues to guess wrong, the time delay increases to five minutes, 15 minutes, and finally one hour. There’s even a setting to erase all data on the iPhone after 10 wrong guesses".
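That escalation schedule can be modeled as a small lookup. The delays are the ones quoted above; the function name and structure are mine, purely for illustration:

```python
def delay_after_failures(failures: int) -> int:
    """Minutes an iPhone waits before the next guess, per the schedule quoted above."""
    if failures < 4:
        return 0                        # first four wrong guesses: no delay
    schedule = {4: 1, 5: 5, 6: 15}      # then 1, 5, and 15 minutes...
    return schedule.get(failures, 60)   # ...and one hour from the 7th failure on

# With the optional "erase after 10 wrong guesses" setting enabled,
# a brute-force attack ends (destructively) after the 10th attempt.
```

So even ignoring the erase option, an attacker is throttled to roughly one guess per hour after a handful of failures, which pushes even a 4-digit search into years.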

As I see it, the above makes even a 4-digit passcode reasonably secure for most of us, as it is hard to imagine a run-of-the-mill thief spending that kind of time to gain access to my phone unless it is clear to the thief that my phone contains data worth the effort.

What The FBI Wants


The FBI wants, essentially, for Apple to provide software that gets around the extra obstacles Apple has put in place to make brute-forcing the passcode space practically impossible. They don't want the phone to add delays after wrong guesses. With this software, they intend to build what amounts to a machine that can be handed a phone and bang on the screen, trying passcodes one by one, until the attack succeeds. Of course, this means that with this software, anyone could build such a machine, even the bad guys.

Possible Solution: A Passcode Registry + Search Warrants


Back to the current problem facing the FBI and Apple. The FBI wants Apple and others in the industry to add some sort of backdoor that would allow it to circumvent the security built into iPhones (and other devices). Apple wants none of this, as they know consumers will balk if they think that Apple will bend over backwards for the FBI (or anyone else for that matter). And Apple is also correct that a backdoor would eventually lead to "bad guys" being able to gain access to phones, rendering the security of the phone useless. Silicon Valley is strongly united in opposing such a backdoor, as is any security researcher worth his or her salt.

In the end, I think the problem of securing a phone is not much different from that of securing your home. Any home holds at least as much, if not more, sensitive information about the people living there as their phones do. The government is free to inspect a home as long as it has probable cause and can get a search warrant. As a society, I would presume we would all be OK with the idea that our phones are subject to similar scrutiny. If a lawyer can prove to a judge that a search of my phone is vital for whatever reason, it's not much different from giving up the keys to my house.

With such a warrant, the FBI could walk up to someone, ask that person for the passcode and the phone itself, and then access its data. But what if the person is dead, as in the case of the San Bernardino shooter? How would the FBI obtain the passcode? The only way I see this happening would be for the passcodes to be available to the government. One way to do this would be for Apple to transmit each passcode, over an encrypted connection like HTTPS, whenever a user changes it, and place it in a database keyed by the unique identifier associated with the phone's hardware (each phone has such an ID). Gaining access to the passcode would then be a matter of looking it up in the database, using the phone's unique ID as the key. With a suitable search warrant, of course.

These passcodes would be stored in the database in encrypted form, so that even if the database were compromised, the risk would be small, assuming the job is done competently.
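A minimal sketch of such a registry follows. Every name here is hypothetical (this is my proposal, not any real Apple system), and the placeholder XOR "cipher" stands in for real encryption of the stored passcodes; the point is only the shape of the flow: record on change, release only with a warrant.

```python
# Hypothetical passcode registry: ciphertext keyed by each phone's hardware ID.
registry: dict[str, bytes] = {}

def encrypt(passcode: str, master_key: bytes) -> bytes:
    """Placeholder cipher; a real system would use vetted encryption."""
    return bytes(b ^ k for b, k in zip(passcode.encode(), master_key * 8))

def decrypt(blob: bytes, master_key: bytes) -> str:
    return bytes(b ^ k for b, k in zip(blob, master_key * 8)).decode()

def record_passcode(device_id: str, passcode: str, master_key: bytes) -> None:
    """Called (over HTTPS) whenever the user sets or changes a passcode."""
    registry[device_id] = encrypt(passcode, master_key)

def lookup_passcode(device_id: str, master_key: bytes, has_warrant: bool) -> str:
    """Release a passcode only when a valid search warrant is presented."""
    if not has_warrant:
        raise PermissionError("a search warrant is required")
    return decrypt(registry[device_id], master_key)
```

Nothing in this flow weakens the phone itself; the sensitive part is operating the registry and its master key competently, which is the "assuming the job is done competently" caveat above.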

Should the government want to see inside a phone, like the one owned by the San Bernardino shooter, it would go about it as follows:
  1.  come up with probable cause, and get a search warrant
  2.  obtain the phone
  3.  retrieve the passcode from the database
  4.  use the passcode to unlock the phone and gain access to its contents
Again, this would be much like the process the FBI would go through to get a search warrant and search a home, which we've all come to accept. The solution doesn't impact the overall security of the phone, but it does give the FBI a legal way to get at the contents, if a judge agrees, and eliminates the need for Apple to compromise on the overall security afforded users of the iPhone and iPad. In effect, a backdoor without actually having to implement a backdoor.
