Apple, Privacy, and Doing the Right Thing

Here’s the deal. A Federal court has ordered Apple to comply with the FBI’s request to help break into the encrypted iPhone of one of the dead shooters from the San Bernardino shooting in California back in December. Apple publicly refused in a well-written letter that defended the importance of privacy and was signed by Tim Cook.

Who’s right?

It wouldn’t take a genius to determine that I might instinctively side with privacy and Tim Cook. I’m a big believer in ethical behavior in the tech world and in the importance of firms protecting consumers from their own ignorance, and I’m proud that Tim Cook is a fellow Auburn grad.

But it isn’t that simple.

The Request

The request by the FBI, on its surface, is a reasonable one. They are trying to learn as much about the shooters and any possible co-conspirators as they can. They have the phone legally and they are trying to access it. They are unable to do so because of the security features, which include encryption and a data-wipe feature that triggers after 10 failed attempts to access the phone.

They are simply asking the makers of the phone for help breaking into it. They aren’t asking Apple to decrypt the phone or unlock it. They aren’t asking for a backdoor to be created allowing anyone with a key to access the data. They just want Apple to update the operating system (OS) on that one iPhone to allow them to try to break into it using brute-force methods without risking the loss of all the data.

It seems reasonable.

Apple Says No

Tim Cook then talks about the importance of privacy and how creating this special version of their OS would not be a one-time-use item.

The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.

This sounds great but then I looked deeper into the issue. Bloomberg had a great article outlining the facts of the request and Apple’s response. It was when I read Ben Thompson’s take on the overall impacts that my opinions started to fluctuate quite a bit. I began to think that maybe, just maybe, Apple was wrong.

Standing on a Slippery Slope

Who is correct?

Privacy is important and encryption is a key tool in protecting privacy. It is not something to be discarded lightly. Of course in this instance, the person whose privacy Apple is protecting is dead.

What if the terrorists had used their own encryption on a device and the FBI were asking Apple for its expertise in breaking that encryption? Apple, like most companies, would likely help.

The premise of the defense is that this would create a tool that would allow whoever possessed said tool to begin a brute force hack of an iPhone. Worse, it would set a precedent for repeating the process for future versions of the iPhone.

This is where I get off the privacy train though. For a phone to be attacked with this tool, a person would need:

  • A correct version of the tool: The tool in question is an OS build specific to the version of the phone. Some builds might work on multiple versions, but none would be universal. The requested iteration of this tool would not work on newer iPhones that have the Secure Enclave component.
  • The iPhone in question: This is not a remote attack. A new version of the OS would need to be directly loaded onto the phone and that is not something that an external entity can perform remotely.
  • Time: Even after the OS is updated, all that has been enabled is an automated brute-force attack to determine the passcode. Depending on the length of the passcode, this could take a while.

I understand the desire to protect privacy and for customers to trust that the devices and services they use are secure. There needs to be a balance for legitimate requests that do not threaten the privacy of anyone other than the focus of the request. It doesn’t have to be easy but if it is possible, it should happen.

With a little more effort, Apple could make even a brute-force attack time-consuming. Using a slow-hashing approach for the passcode, Apple could force the iPhone to take 250ms to 500ms (half a second) to confirm a passcode entry, as opposed to the current speed of roughly 80ms. That delay would still be tolerable to owners, but it would slow down brute-force attacks considerably, increasing the time to crack an iPhone by a factor of roughly three to six. Even at the current speed, a 6-character passcode could take years.
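A rough back-of-the-envelope calculation illustrates the point. This is just a sketch in Python: the 80ms and 500ms per-attempt figures are the ones discussed above, and the keyspace sizes are my own assumptions for illustration.

```python
# Worst-case time to exhaust a passcode keyspace at a given
# per-attempt verification delay. Keyspace sizes are illustrative
# assumptions, not Apple specifics.

def crack_time_years(keyspace: int, seconds_per_attempt: float) -> float:
    """Return the worst-case exhaustive-search time in years."""
    seconds_per_year = 60 * 60 * 24 * 365
    return keyspace * seconds_per_attempt / seconds_per_year

numeric_4 = 10 ** 4   # 4-digit PIN: 10,000 combinations
alnum_6 = 62 ** 6     # 6 chars of upper/lower/digits: ~57 billion

for label, space in [("4-digit PIN", numeric_4),
                     ("6-char alphanumeric", alnum_6)]:
    for delay in (0.08, 0.5):  # current vs. slow-hashed verification
        print(f"{label} @ {delay * 1000:.0f} ms/attempt: "
              f"{crack_time_years(space, delay):,.4f} years")
```

The asymmetry is the whole argument: a short numeric PIN falls in hours either way, while a 6-character alphanumeric passcode already takes on the order of a century at 80ms per attempt, and several times that with the slower verification.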

That is a lot of effort for a government, or anyone, to put into cracking a single phone.

The Solution?

Apple should do the following:

  • Reach an agreement with the FBI that Apple will install the requested OS onto the phone but will perform the work itself. The FBI does not get to keep the tools to apply to another phone.
  • Publicly state that in this situation they are working with the FBI to allow it to break into the phone because of the horrible nature of this crime and because the owner of the phone is dead, leaving the FBI no other options. They should stress that they are not breaking the encryption, merely making the task of guessing the passcode possible.
  • Make the solution work only on that iPhone. Use the serial number or some other unique identifier to make that version of the OS functional only on that device. This would make reverse engineering the solution from the phone alone more challenging.
  • Continue to develop improved security that is built into the hardware. They’ve made things harder with their Secure Enclave technology, but even that is not impervious to this approach.
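The device-binding idea in that list can be sketched in the abstract. To be clear, nothing below reflects Apple’s actual signing or boot chain; the HMAC scheme and every name here are hypothetical, chosen only to show how a signed OS image could be tied to a single serial number so that moving it to another phone fails verification.

```python
# Hypothetical sketch: a vendor signs (image + device ID) together,
# and the device refuses any image whose embedded ID isn't its own.
# This is NOT Apple's real mechanism; names and scheme are invented.
import hashlib
import hmac

def image_is_valid_for_device(image_bytes: bytes, embedded_id: str,
                              device_serial: str, signing_key: bytes,
                              signature: bytes) -> bool:
    """Accept an OS image only if (a) the vendor signature covers both
    the image and the device ID baked into it, so the ID can't be
    swapped out, and (b) that ID matches this physical device."""
    expected = hmac.new(signing_key,
                        image_bytes + embedded_id.encode(),
                        hashlib.sha256).digest()
    if not hmac.compare_digest(expected, signature):
        return False  # image or embedded ID was tampered with
    return embedded_id == device_serial  # wrong phone -> reject

# Illustrative usage: the signed image boots on SER123 but not SER999.
key = b"vendor-signing-key"
image = b"custom-os-image"
sig = hmac.new(key, image + b"SER123", hashlib.sha256).digest()
print(image_is_valid_for_device(image, "SER123", "SER123", key, sig))
print(image_is_valid_for_device(image, "SER123", "SER999", key, sig))
```

The design point is that the identifier must be inside the signed payload; an ID merely checked alongside an image could be edited out, while one covered by the signature cannot be changed without the vendor’s key.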

I know many of you will disagree, but I think that, if carefully handled, the slippery slope can be avoided. That slope is what we are fighting, not simply gaining access to an iPhone, something Apple has helped prosecutors with in the past.

This specific request does not make our devices less secure. It does not allow anyone to access our phones while we possess them. Even if lost, if your passcode is six or more characters they are going to have a heck of a time breaking into the phone.

If you are someone likely to be targeted by an organization with the resources to hack your phone using this approach, depending solely on Apple’s security would be a mistake. For the rest of us, the effort required to get to our information wouldn’t be worthwhile.

I admire the stand that Apple and Tim Cook are taking. They are looking down into the abyss and trying to not inch closer to the edge. For that I thank them.

In this specific case, I think they should take that small step. Carefully.

5 thoughts on “Apple, Privacy, and Doing the Right Thing”

  1. We should queue this up for our next beer. I don’t disagree so much as I worry that the federal judge(s) don’t understand what they’re ruling on and the FBI can’t be trusted, and the next judge won’t look at this as a huge precedent and… Federal court rulings start out like a very fine scalpel but often end up looking more like a machete. Kudos for tackling the issue and doing some good analysis.


  2. SteveB says:

    It may be that Apple wishes to avoid the commercial burden of all the subsequent requests that this would set a precedent for. Also, if they can do this for the US government, then presumably a case might be made that they should also do so for any other government or regime.

