
There’s a rather complex situation coming out of the US in the last couple of days, arising out of the San Bernardino shooting incident which occurred in December last year. We won’t go into overwhelming detail, but in essence it boils down to this. One of the alleged perpetrators of this incident was in possession of an iPhone 5C, provided to him by his employer (a local government authority), which is secured by a 4-digit PIN code.

The FBI seized this device during the execution of a search warrant, and for obvious enough reasons, they want to gain access to what’s on the phone. However, there’s a hurdle — not only is the device secured by a 4-digit PIN, but after 10 incorrect attempts the device will wipe itself, not to mention that incorrect PIN attempts result in enforced delays before further attempts can be made. In other words, the FBI can’t get into the phone, and it wants in.

Apple has said, fairly consistently, that from iOS 8 onwards it can’t simply get into customers’ data on their phones. Previously it could fairly easily, and has done as much, but iOS 8 and later encrypt that data and Apple doesn’t hold the key; the user’s PIN, password or other ‘secret’ is the key to that encryption. In other words, Apple can no more easily get into that data than can the FBI or anyone else.

The FBI’s position

The FBI has sought the intervention of the US federal courts to (in effect) order Apple to assist it to access the data contained within the seized iPhone. The FBI made a full application to the US courts for this assistance (which is publicly available to read in full), but in essence, it says the following:

  • Apple controls the hardware and software that runs on iPhones
  • Apple can modify its software to work differently
  • Apple could create a custom firmware (IPSW) which could be loaded on the seized phone
  • That custom firmware could contain crippled security features, such as:
    • No delay for incorrect PIN attempts
    • No wiping of data on failed PIN attempts
    • Programmatic PIN entry instead of manual, finger-based entry
  • Apple could provide this software, crippled to only work on the one seized device, to the FBI for it to use
  • Alternatively, Apple could load the software onto the device at its own facility, under the FBI’s supervision

There’s more to what the FBI has sought, but it argues that Apple should be compelled by court order to do these things, because of the evidential value of the information contained on the device. Further, the FBI says that Apple should be able to comply at little to no cost, and in any event, I believe the US Government would indemnify Apple for any costs actually incurred (though this isn’t clearly addressed in the FBI’s application).

The US courts’ position

The application was heard ex parte, which means that the matter was decided without input from Apple. While US law is different to Australian law, ex parte orders can often be reviewed or revisited on application of the affected party, in this case Apple.

However, at the time of writing, a US federal magistrate judge — Sheri Pym — has issued the order against Apple, which requires it to do the following:

Apple’s reasonable technical assistance shall accomplish the following three important functions: (1) it will bypass or disable the auto-erase function whether or not it has been enabled; (2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT DEVICE; and (3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.

Simply put, this order requires Apple to create a custom firmware file that can be loaded on to the seized iPhone 5C such that the FBI will be able to brute-force the 4-digit PIN code, and gain access to the data therein. If such a firmware were made and used, it’s likely the FBI would gain access to the device very quickly; 10,000 permutations would not take long to enter programmatically.
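To give a sense of scale, here is a minimal sketch of that brute force in Python. The try_pin() interface is hypothetical, standing in for whatever programmatic passcode-entry mechanism the custom firmware would expose, and the ~80 ms per attempt is the approximate hardware-imposed key-derivation time Apple has described for passcode attempts:

```python
# A sketch only: try_pin() is hypothetical, standing in for the
# programmatic passcode interface the court order envisages.
from itertools import product

ATTEMPT_SECONDS = 0.08  # approximate hardware key-derivation time per attempt

def brute_force_pin(try_pin):
    """Try all 10,000 four-digit PINs in order; return the one that works."""
    for digits in product("0123456789", repeat=4):
        pin = "".join(digits)
        if try_pin(pin):
            return pin
    return None

# Worst case: 10,000 attempts at 0.08 s each is 800 s, a little over 13 minutes.
```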

The order is based on a rather old law, the All Writs Act of 1789, part of the US Code which allows courts to order a person or company to do something. This law has been used previously to compel smartphone manufacturers to bypass security measures, but it hasn’t been used in this manner — to compel a smartphone manufacturer to provide the government with a tailor-made firmware to enable the security features to be defeated.

Apple’s position

Apple’s CEO Tim Cook has written a public letter setting out his response to the court order. Cook (and Apple) were shocked and dismayed at the events that unfolded last December. Cook notes that Apple has already provided assistance to the FBI in relation to this case, by providing information from data it holds (iCloud data and backups) in response to lawful orders such as subpoenas and search warrants.

Apple acknowledges the need for encryption, while acknowledging the seriousness of the offences committed in December 2015. Apple says that the FBI is asking it to make a new version of the iPhone operating system, which doesn’t exist at present, with crippled security features that would allow “the potential to unlock any iPhone in someone’s physical possession”.

Understandably, perhaps, Apple is reluctant to comply. It says, amongst other things:

Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.

In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.

The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.

We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.

Analysis of Apple’s response

There is little doubt that what the FBI is asking Apple to do is quite serious, and a significant development in law enforcement in the digital age. Equally, there is little doubt that Apple wants to do the right thing, both by its customers and by a legitimate public need for law enforcement.

From a technical standpoint, there does not appear to be any reason why Apple could not create the software it is now compelled to produce.

Apple can, and does, provide software updates that can be applied to an iPhone without wiping or compromising the data held therein. Apple could, quite easily I would suggest, create a special iOS build containing the three features sought by the FBI: programmatic PIN entry, removal of the delay between incorrect PIN attempts, and removal of the code that wipes the iPhone after ten incorrect attempts. In fact, to do so would be trivial, and would require minimal changes to the iPhone’s operating system source code.

How is this possible? Easy. Apple already does it, and Google (and Android OEMs) do it too. System updates necessarily have to be able to update virtually every aspect of the software running on a mobile handset, from the low-level bootloader to the ‘system’ partition which holds the software running on the phone. There is no technical reason why Apple couldn’t do what it is being ordered to do.

We also note that while much of the mainstream media is talking about Apple handing over this software to the FBI, and the FBI consequently being able to use that software carte blanche in future cases, that’s not actually what the order says. In fact, quite the opposite. The court order gives Apple the option to create this software and load it onto the subject iPhone within its own premises, by its own engineers, subject only to the FBI’s supervision (and the PIN entry, for evidential reasons).

The FBI need not gain control or possession of the software. Even if it were to do so, the court order permits (and indeed compels) Apple to create this software such that it would only work on this particular iPhone — presumably locking it to unique identifiers such as the phone’s serial number, IMEI or others. It would not be trivial for the FBI, even if it were to take possession of this software, to modify it to work on other seized phones — for one thing, there are a number of iPhone models on the market, with different OS versions, and for another, reverse engineering Apple’s proprietary software to modify it in the manner suggested would be unlawful, rendering any evidence obtained in such a way useless.
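As a rough illustration of how such a lock might work, here is a sketch. The identifiers below are invented, and a real check would live inside Apple’s signed boot chain (the code signing being what makes the check impractical to remove), not in Python:

```python
# Illustrative only: the serial and IMEI are invented, and a real
# implementation would sit inside Apple's signed firmware.
import hashlib

# Fingerprint of the one handset this build is permitted to run on.
SUBJECT_FINGERPRINT = hashlib.sha256(
    b"SERIAL:EXAMPLE12345|IMEI:000000000000000"
).hexdigest()

def firmware_may_run(serial: str, imei: str) -> bool:
    """Refuse to run on any handset other than the subject iPhone."""
    fingerprint = hashlib.sha256(
        f"SERIAL:{serial}|IMEI:{imei}".encode()
    ).hexdigest()
    return fingerprint == SUBJECT_FINGERPRINT
```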

Apple’s opposition isn’t, or shouldn’t be, technical — of course it can do what this order requires, and to suggest otherwise is a smokescreen.

Apple’s concern is that if it shows the FBI that it can do this for one phone, it opens itself (and other manufacturers) to future compulsion to do it in other cases too. The difficulty with this is that the horse has already bolted. The FBI isn’t staffed with bumbling idiots; in fact, its computer forensics capabilities are well known and respected.

The FBI already knows that Apple can do what it’s asking, and it wouldn’t have asked otherwise. The FBI also already knows that Apple (and other manufacturers) could be compelled to do what it wants, in this regard, because Apple has now been compelled to do it. Granted, the decision will likely be appealed, and could well be overturned, but for the time being, the status quo is indeed in the FBI’s favour.

Apple’s stance, though admirable, is at odds with the will of the US Government, and if it is unsuccessful by these means (using the courts, the All Writs Act, etc), then other means will be devised, perhaps with legislative change that could be well received by US lawmakers who gave us things like the PATRIOT Act, amongst others.

Apple may be overstating things; it is not being required to build a general backdoor to its products that could readily be used by third-party, non-state actors to break into iPhones everywhere. This is, hype and hysteria aside, a rather unusual case — and the court’s order is rather more limited in its effect. Yes, it perhaps creates a slippery slope, but it isn’t as open and shut as Apple would have us believe via its ‘customer letter’.

Google’s position

It’s perhaps unsurprising, with this background, that Google supports Apple’s position. Google’s CEO, Sundar Pichai, took to Twitter this morning with a series of tweets backing Apple, arguing that forcing companies to enable the hacking of customer devices could set a troubling precedent.

Overall Analysis

Should companies be compelled to create custom-made software like this to facilitate law enforcement? That’s a rather simple question, with a rather difficult answer. On one hand, the iPhone in question almost certainly contains information which would assist law enforcement in prosecuting a rather important criminal investigation. On the other hand, it sets something of a precedent that encryption of devices is only so good; the government can compel companies to cripple and break those security measures essentially at whim.

Could Apple (or other OEMs) take steps to prevent such a compulsion being made in future? It’s possible, but perhaps not easy. Even if the security subroutines were placed in read-only storage on future iPhone devices, those subroutines would need to be referenced somewhere from the operating system, and those references could be changed or removed as needed by a custom software update. A cat and mouse game (such as we saw with jailbreaking) could ensue, but in the long run, any technical measures implemented by Apple could likely be defeated in time.
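A toy sketch of that indirection problem, bearing no resemblance to Apple’s actual architecture: even if an enforcement routine were burned into ROM, the mutable operating system still decides whether to call it.

```python
# Toy model, not Apple's architecture: a read-only routine is only as
# strong as the mutable OS code that dispatches to it.

def rom_escalating_delay(failed_attempts: int) -> int:
    """Pretend this lives in read-only storage and cannot be altered."""
    return min(2 ** failed_attempts, 3600)  # escalating delay, capped at 1 hour

# The OS reaches the routine through a mutable reference...
security_hooks = {"pin_delay": rom_escalating_delay}

# ...so a custom software update need only repoint that reference.
security_hooks["pin_delay"] = lambda failed_attempts: 0  # delay neutralised
```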

Could a consumer protect their information from government access in this fashion? Of course; the San Bernardino suspect used only a 4-digit PIN to secure his phone, which could be readily defeated by the FBI with custom software. However, if that passcode were alphanumeric, or longer, or paired with a biometric such as a fingerprint, then even with crippled software it would likely take the FBI or any other agency significant time, if not many years, to defeat.
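The arithmetic behind that claim, assuming exhaustive search at the same ~80 ms hardware floor as above (worst-case figures, and ignoring biometrics, which aren’t brute-forced this way):

```python
# Worst-case exhaustive-search times at ~80 ms per passcode attempt.
ATTEMPT_SECONDS = 0.08

def worst_case_seconds(alphabet_size: int, length: int) -> float:
    """Seconds needed to try every possible passcode of the given length."""
    return (alphabet_size ** length) * ATTEMPT_SECONDS

print(worst_case_seconds(10, 4) / 60)     # 4-digit PIN: ~13 minutes
print(worst_case_seconds(10, 6) / 3600)   # 6-digit PIN: ~22 hours
print(worst_case_seconds(62, 8) / (3600 * 24 * 365))
# 8-character alphanumeric: ~550,000 years
```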

The issue at hand is rather complex, and mixes questions of law with public policy that are perhaps beyond Ausdroid’s remit, but we can see that Apple’s (and Google’s) stance is just. Customers are entitled to protect their information using tools provided by Apple, on the understanding that those measures aren’t readily defeated, or liable to be defeated if Apple were minded or compelled to do so.

On the other hand, the government (through the FBI) has a legitimate need to access the information contained on the iPhone in question in order to prosecute an offender for what is a very serious and horrible crime. There’s no doubt that terrorist acts are abhorrent, heinous crimes, and the public benefit of investigating and prosecuting those crimes could well override an individual’s right to privacy (and a manufacturer’s right, or desire, not to be compelled to help) in some circumstances.

Is this one of those cases? Perhaps it is. You be the judge.

Comments
Dave

Really enjoyed reading this. Thanks for not sensationalising it – unlike Mr Cook. This is not a master key the US Court is asking for, but the keys to one house.

Dave

And look, if Apple can already hand over your iCloud data based on a court order, what’s the privacy problem with helping to unlock a phone also by court order??

Craig

I don’t understand why this is such a big deal. Just obey the court order already. Why should confirmed criminals have a right to digital privacy? Seems like Apple is trying to be some kind of digital privacy white knight. I have a suggestion for Tim Cook: you’d be a better corporate citizen by paying taxes than by disobeying court orders and obstructing the course of justice.

Chris

The suspect hasn’t been convicted yet so he’s not a confirmed criminal … at least not as I understand that term.

Shahil Prasad

Is the nature of the crime relevant? What will define these limits? Or is it going to be a case where the law enforcement party simply asks a judge and hopes to get lucky? What about if the phone is encrypted by fingerprints? Would that be kosher still? What I don’t get is, if they’re compelling Apple to go through all this trouble, why not just ask for the phone to be unlocked? Perhaps the master key analogy makes more sense here. And here I was thinking the FBI and CIA have master IT people that can crack phones…

kjmci

> why not just ask for the phone to be unlocked?

Apple don’t know the device PIN and the authorities cannot compel the accused to incriminate himself (Fifth Amendment).

Interestingly, the understood position is that if there was TouchID the accused could be compelled to provide his fingerprint as that’s not something that’s “known”, but something that “is” and there is precedent for suspects being compelled to provide biometrics such as DNA, fingerprints and hair samples.

Chris

Great comment, esp on the biometric vs pin code. Thanks for the input Kieran 🙂

AdamM

That’s an intriguing one. “Providing your fingerprint” is, arguably, not the same as “using your fingerprint to unlock an electronic device”, although as we are talking about US law who knows how that argument would play out. In the latter case, surely you run into the same Fifth Amendment argument as compelling someone to enter a PIN?

Bananafish

Great article – well done, Chris and Ausdroid!