Why Apple Is Right To Reject The FBI’s Push To Brute Force iPhone Security

Apple is under pressure from the FBI to backdoor the security of an iPhone 5c. The company is taking a public, principled stance on the matter, in line with its recent pro-privacy defense of encryption. Yesterday it released a customer statement explaining that it will fight the court order, which asks for some very specific technical assistance to enable the FBI to access data on an iPhone 5c used by one of the San Bernardino shooters.

Specifically, the court order asks Apple to:

- bypass or disable the auto-erase function that wipes iPhone data after a certain number of incorrect unlock attempts;
- let the FBI submit passcode guesses electronically, via another device connected to the iPhone, rather than typing each attempt into the handset by hand; and
- remove the enforced time-delays between passcode submissions, so the FBI can attempt to brute force the passcode without waiting between guesses.
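
To see why stripping those software protections matters, here is a back-of-the-envelope sketch of the resulting brute force job. It assumes the roughly 80ms-per-guess floor that Apple’s iOS Security Guide attributes to its hardware-entangled key derivation, a delay no software change can remove; the rest is simple arithmetic.

```python
# Back-of-the-envelope: time to exhaust every numeric passcode once the
# auto-erase and escalating-delay protections are gone. The ~80ms floor
# is the per-guess key-derivation cost cited in Apple's iOS Security
# Guide; it is enforced by hardware entanglement, so it remains.
HARDWARE_DELAY_S = 0.08  # ~80 milliseconds per passcode attempt

def worst_case_hours(digits: int) -> float:
    """Hours needed to try all numeric passcodes of the given length."""
    attempts = 10 ** digits
    return attempts * HARDWARE_DELAY_S / 3600

for digits in (4, 6):
    print(f"{digits}-digit passcode: up to {worst_case_hours(digits):.1f} hours")

# Prints roughly:
#   4-digit passcode: up to 0.2 hours   (~13 minutes)
#   6-digit passcode: up to 22.2 hours
```

In other words, with the software protections stripped away, the only thing left standing between a brute force attack and the data is the entropy of the passcode itself.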

Apple couches this order as the government asking it to create a backdoor into its software. And so do plenty of others.

The government, for its part, is trying to claim it’s just about one device. Apple’s counter is that this ignores “the basics of digital security” and glosses over the significance of what the government is actually asking for.

Basically, backdoor one iPhone and you backdoor them all, inviting every government, everywhere, to demand the same…

Or as Apple puts it:

The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

Firstly, Apple taking a public stance on this matter is A Very Good Thing, because it encourages public debate on an issue where law enforcement requests have implications for the general public’s data security. It took Edward Snowden’s 2013 whistleblowing on the NSA to shine a light on state surveillance overreach and give politicians the impetus to legislate some fresh privacy red lines.

tl;dr: public debate about where the line should be drawn to protect citizens’ digital data from state-powered intrusion has become a core component of living in a functioning modern democracy.

Secondly, there has been a fair amount of discussion already about the technical feasibility of what Apple is being asked to do, with one security company, Trail of Bits, arguing that Apple could comply with the FBI’s request for access to this specific iPhone and “lock” the customized version of iOS so that it only works on that one device.
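
For illustration, here is a minimal sketch of the kind of device-binding Trail of Bits describes, assuming (as Apple’s public security documentation describes for personalized firmware signing) that the signature covers a device-unique chip identifier, the ECID, which the boot chain checks against its own before running an image. The names here are hypothetical, and a symmetric HMAC stands in for Apple’s actual asymmetric signing scheme.

```python
# Hypothetical sketch of device-bound firmware signing. Assumption: the
# signature covers a device-unique identifier (the ECID) and the boot
# chain verifies it against its own ECID. HMAC stands in for Apple's
# real asymmetric signature scheme; this is not Apple's actual design.
import hashlib
import hmac

SIGNING_KEY = b"stand-in-for-apples-signing-key"

def sign_firmware(image: bytes, device_ecid: bytes) -> bytes:
    """Sign a firmware image tied to a single device's chip ID."""
    return hmac.new(SIGNING_KEY, image + device_ecid, hashlib.sha256).digest()

def boot_accepts(image: bytes, signature: bytes, my_ecid: bytes) -> bool:
    """The boot chain verifies the signature over (image, its OWN ECID),
    so an image personalized for any other device fails the check."""
    expected = hmac.new(SIGNING_KEY, image + my_ecid, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

image = b"custom iOS build with passcode protections disabled"
sig = sign_firmware(image, device_ecid=b"ECID-OF-TARGET-PHONE")

print(boot_accepts(image, sig, b"ECID-OF-TARGET-PHONE"))     # True
print(boot_accepts(image, sig, b"ECID-OF-ANY-OTHER-PHONE"))  # False
```

On this view, a leaked image signed for one phone is useless on any other. The counter-argument, as the next paragraph lays out, is that the dangerous artifact isn’t any single personalized image but the signing capability itself, which, once built, can be demanded again and again.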

However, that viewpoint flies in the face of the security industry’s majority opinion on backdoors: that you cannot create a backdoor just for the good guys, because any vulnerability intentionally created for a specific purpose risks being found and exploited by bad actors. We see this principle in action every day, in the software bugs and the hacks and data leaks those vulnerabilities enable. Government-mandated vulnerabilities would be no different. They would merely open up more fronts for data to be stolen, with the added irony that it would be your friendly state security agencies enforcing the public’s insecurity.

The wider point here is that when you’re talking about system design there is no purely technical red line protecting security: whoever controls the signing keys and the update mechanism can, in principle, be compelled to turn them against the system. In this case the only red line against enforced backdoors perforating iOS security would appear to be Apple’s principles, plus the judiciary’s interpretation of the letter of the law.

Which brings me to the legal issue. The FBI has resorted to using a federal statute, the All Writs Act, to try to force Apple’s hand. This is not the first time the AWA has been used to try to compel technology companies to do the bidding of government agencies. Nor is it the first time Apple has been targeted with such writs, which likely explains why the company was able to publish such a balanced and coherent statement yesterday. This low-level federal court route to perforating iOS security is, apparently, an already well-trodden path for government agencies.

The AWA gives federal courts the authority to issue court orders that are “necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law”. But it does not give them the power to violate the Constitution. Nor can they impose an “unreasonable burden” via writ.

Despite the judge in the San Bernardino case granting the writ, the judiciary is not universally comfortable with the use of a general-purpose law for such a specific purpose. As the EFF has previously noted, a federal magistrate judge in New York last year questioned the government’s authority to use the AWA to try to compel Apple to unlock an iPhone in another case.

That judge’s reading of the matter is that a deliberate Congressional failure to legislate either way on enforced disabling of security/encryption is arguably being exploited to let government agencies compel tech companies to do their bidding, without politicians having to win the public argument for a specific law.

“This case falls in the murkier area in which Congress is plainly aware of the lack of statutory authority and has thus far failed either to create or reject it,” the New York judge wrote.

So the implication is that the government is filling a statutory gap Congress has either failed to consider or deliberately declined to fill. Either way, use of the AWA for this purpose is not a sustainable position. Calls for a proper legal mandate, in the form of a law passed by Congress and signed by the President, have already started.

Apple, understandably, also wants some legal clarity here. Last week its counsel, Marc J. Zwillinger, wrote to the aforementioned New York judge asking him to rule on whether Apple can be compelled to assist investigators in breaking the passcode on its iPhones, arguing that a court ruling on the matter would be more efficient than repeated debates each time the government seeks to compel it to crack the security on an individual device.

“Apple has also been advised that the government intends to continue to invoke the All Writs Act in this and other districts in an attempt to require Apple to assist in bypassing the security of other Apple devices in the government’s possession. To that end, in addition to the potential reasons this matter is not moot that the government identifies, this matter also is not moot because it is capable of repetition, yet evading review,” Zwillinger wrote. “Resolving this matter in this Court benefits efficiency and judicial economy.”

If, as Zwillinger writes, the government intends to systematically invoke the AWA to bypass iOS security in different cases, it’s rather hard to see how it can also argue that the San Bernardino case is a special national security exception. Either it’s “this one case” or it’s not. (And indeed, the AWA has already been used for a similar purpose in other such cases, so…)

The wider point here is that legal grey areas have, for a very long time, been used as a tactic to expand state surveillance powers while actively bypassing democratic debate and scrutiny of such ‘capability creep’.

Over in the U.K., for example, we’re seeing fresh government attempts to use an obfuscation tactic to try to work around encryption. Draft state surveillance legislation currently before the U.K. parliament includes a clause requiring comms service providers to remove electronic protection when served with a lawful intercept warrant. The legislation also states that companies must take “reasonable” steps to comply with warrants requiring them to hand over data in a legible form, which would appear to imply that end-to-end encryption will end up standing outside the law.

Add to that, according to FT sources, U.K. intelligence agencies have been informing U.S. tech companies that they intend to use exactly this clause to force the companies to decrypt encrypted data, and this despite repeated denials by the U.K. government that it is seeking to ban encryption. In other words, the U.K. government seeks to seize with its right hand what it claims its left hand can’t touch.
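
For context, here is a minimal sketch of why that demand collides with end-to-end encryption (using the PyNaCl library purely for illustration): the provider only ever relays ciphertext for which it holds no keys, so there is no “reasonable step” it can take to produce a legible copy.

```python
# Illustrative only: with end-to-end encryption the service provider
# stores and forwards ciphertext it cannot read, because the private
# keys never leave the users' devices. Uses the PyNaCl library.
from nacl.public import PrivateKey, Box

alice, bob = PrivateKey.generate(), PrivateKey.generate()

# Alice encrypts to Bob using her private key and his public key.
ciphertext = Box(alice, bob.public_key).encrypt(b"meet at noon")

# The provider, served with a warrant, can hand over `ciphertext`,
# but without a private key it has no way to render it legible.
# Only Bob (or Alice) can decrypt:
print(Box(bob, alice.public_key).decrypt(ciphertext))  # b'meet at noon'
```

Which is precisely why a legal requirement to hand over data “in a legible form” amounts, in practice, to a requirement not to offer end-to-end encryption at all.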

The bottom line here is that obfuscation should not be a viable political position on the legality of encryption or system security. Data security is far too fucking important a matter to fudge.

No one would try to deny that modern smartphones contain a truckload of sensitive personal data, as Apple underlines in its public statement. And the rise of the Internet of Things is only going to increase the volume of sensitive personal data at risk of theft. (Indeed, earlier this month the U.S. director of national intelligence, James Clapper, made this very point — telling a Senate committee that: “In the future, intelligence services might use the [IoT] for identification, surveillance, monitoring, location tracking, and targeting for recruitment, or to gain access to networks or user credentials.”)

So with the volume of sensitive data being pulled online continuing to increase, unimpeachable security is more, not less, important, making Apple’s public defense of its users’ security the only viable position to take here.

Because how will any technology company be able to offer trusted services to consumers if government-mandated backdoors are being forced upon them?


Oh, and one more thing: when Donald Trump disagrees with you, it’s patently obvious who stands on the right side of history.
