
Bill Barr wants Apple to break its encryption, even if it risks every American being hacked

Law enforcement thinks tech companies should build "backdoors" into every person's phone and communications apps, and then hope no hacker ever finds them.
Attorney General William Barr walks past pictures of the shooter's cellphone as he arrives at a press conference regarding the December 2019 shooting at Naval Air Station Pensacola in Florida, at the Department of Justice on Jan. 13, 2020. Andrew Caballero-Reynolds / AFP - Getty Images

The federal government is once again “asking” Apple to break into an encrypted iPhone, this one owned by a now-deceased murderer, that officials insist may contain vital information. Without Apple’s technical intervention, they say, that information will remain hidden from investigators’ view.

The request — it’s really a demand, and one the feds have made before — can be granted only if Apple makes all of its iPhone customers less safe.

To its credit, Apple has resisted this demand, as it has on similar occasions in the past.

The government's current request concerns the Saudi Arabian cadet who was training with the U.S. military in Pensacola, Florida. He killed three people and wounded eight more before he was himself shot dead, according to official accounts. In response to requests from law enforcement, Apple turned over what data of the cadet's it could retrieve, both from its iCloud service and from his online transactions.

The government — as it has in the past — wants more. In a complex ecosystem of hardware, software, storage and communications, officials want Apple (and Facebook, Google and every other technology company that provides encrypted devices and software) to build what technologists call “back doors” into those products and services.

What the government wants, in simple terms, is for technology companies to make sure their encryption methods are fundamentally flawed, in ways that they know and can share with law enforcement. Law enforcement’s goal is to have corporate and/or government access to anyone's cellphone or communications stream — such as iMessage or WhatsApp — without the permission (or, in some cases, the knowledge) of the phone's owner. Officials seemingly believe (despite ample evidence to the contrary) that if companies are secretive about how they circumvent their own encryption, no hacker or other bad actor will ever figure it out and gain the same access to people's data.

Law enforcement officials — currently led by Attorney General William Barr, who lashed out at Apple a few days ago — are saying, in effect, that making every American's mobile phone and communications apps vulnerable to any hacker in the world is the only way to keep law enforcement from “going dark.” That’s shorthand for being unable to learn what criminals are saying to one another and what information, if any, they are storing for safekeeping — a scenario officials understandably call scary, as if the mass vulnerability of Americans' devices to hackers were not.

The Trump administration isn’t the first to try to create a special way to monitor cellphones. The modern encryption wars began back in the 1990s, when the Clinton administration proposed putting a special chip (the so-called Clipper chip) in mobile phones that would, on demand, give the government access to any communications conducted on them; confronted with reality, it dropped the plan. Technologists pointed out then that such a chip would make all cellphones much less safe, because criminals and foreign governments would surely find a way to turn on the chip and listen to whatever conversations they wanted as well.

Until the Pensacola case, the most furious debate over whether technology companies should grant governments access to everyone's cellphones came after the 2015 attack in San Bernardino, California, in which 14 people were killed by two terrorists who were in turn killed by police. The Obama administration wanted Apple to break its technology to give investigators further insight into the attack, and Apple refused, even defying a court order while it challenged that order. The case became moot when the government got access by hiring hackers to exploit an existing weakness in Apple’s phone software.

Using hackers to exploit existing weaknesses has, in fact, been the way that governments — and, presumably, criminals — have gotten into individual phones and other devices in one-off cases. What they have not been able to get (and what only Apple and other technology companies could provide, but have refused to) is access to all users’ data on demand. That is largely because Apple, as it always does, repaired the weakness exploited by the government-hired hackers after the 2015 attack and strengthened its customers’ ability to keep their private data as private as possible — even from Apple itself.

This is a key point: Apple works hard to ensure that even it doesn’t have the ability to break into devices it has sold to consumers. (That is one reason I recommend iPhones to journalists dealing with whistleblowers and to others involved in potentially risky work.)

Apple's willingness to make sure that it is locked out of its customers' devices gets to the crux of a genuine security dilemma — which people like Barr continue to treat as a mere roadblock to robust law enforcement. Strong encryption is a yes-or-no matter: If it’s strong, and if there are no other weaknesses someone can exploit, it cannot be broken (at least in the near future). But, if you create the back door the government wants — purportedly to be used only by the “good guys” — you by definition create a weakness that can and will be exploited by the bad guys, just as a physical back door can be breached by a person dedicated to breaking into a building. This isn’t an opinion; it is mathematics, as encryption experts can attest.
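For the technically inclined, a deliberately simplified sketch makes the point concrete. The Python snippet below, which uses the widely available "cryptography" library, is an illustration of the principle rather than of how any actual phone works: a stashed-away "backdoor" copy of a key decrypts data for whoever holds it, because the math cannot check anyone's credentials.

```python
# A simplified illustration of why a "backdoor" key is just a key.
# Requires the third-party package: pip install cryptography
from cryptography.fernet import Fernet, InvalidToken

# A user's device encrypts data under a key that only the user holds.
user_key = Fernet.generate_key()
ciphertext = Fernet(user_key).encrypt(b"private message")

# Without the right key, decryption simply fails; the math doesn't
# care whether the person asking is a detective or a thief.
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("Wrong key: the ciphertext stays unreadable.")

# Now suppose a copy of the key is escrowed for the "good guys."
escrowed_key = user_key

# Anyone who obtains that copy -- investigator and hacker alike --
# can decrypt everything. The key cannot tell them apart.
print(Fernet(escrowed_key).decrypt(ciphertext))  # b'private message'
```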

That hasn't, however, stopped officials from ordering technologists to generate magic equations that break the basic laws of mathematics — which is what members of the Senate Judiciary Committee are threatening to do. In a hearing last month, Sen. Dianne Feinstein, D-Calif., a former prosecutor who often demonstrates fundamental contempt for civil liberties, teamed with Sen. Richard Burr, R-N.C., to warn the tech industry that if it didn’t find a way to provide the good-guys-only method of breaking into Americans' phones, Congress would order it to do so.

But the reality is that if Feinstein, Barr and the Trump administration get their way, they will ultimately compromise the safety of digital communications for almost everyone — not just the people who commit crimes.

And even if they could bludgeon U.S.-based companies into making communications and storage less secure for American consumers, their powers do not extend past our borders — but people's access to apps does. Would they then outlaw Americans' use of software written elsewhere, or of products created collaboratively via the “open source” method that weren't deliberately made weak?

Law enforcement has always operated under some restraints in the United States of America; they're built into our Bill of Rights, for the protection of all Americans. The ability to own communication devices that don't provide admin access to every law enforcement agency in the country is currently one such right — and it should remain one, for all our sakes.

Editor's note: Facebook is a donor to News Co/Lab, at which the author works.