If Apple Can Create A Backdoor To The iPhone, Could Someone Else?


On February 16, U.S. Magistrate Judge Sheri Pym issued an order compelling Apple to assist the government in bypassing the security features of an iPhone 5C used by one of the perpetrators of the December 2015 San Bernardino attack. Apple quickly responded with a lengthy statement from CEO Tim Cook calling the order a “dangerous precedent.” It “would be wrong,” asserted Cook, “for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.”

Much of the press coverage has been focused, logically enough, on what is likely to be an epic legal tug of war pitting the government against Apple—and by implied extension, a significant portion of the American technology sector. The long-running policy debate about government-accessible “backdoors” has suddenly taken on an extremely high profile, here-and-now urgency.

The legal issues raised by the order are of paramount importance. But there is another aspect of this story that deserves far more attention than it is getting: Apple’s statement did not assert that the company is unable to help the government. Instead, Apple is primarily arguing that the government order to compel it to help is improper. The choice not to highlight technological impediments to following the court order implies that, legal issues aside, Apple might in fact be capable of building a backdoor—or at least of putting enough cracks in the wall so that the government could break through it.

The obvious next question is: If Apple could help build a backdoor to the iPhone, could someone else also do it? When it comes to compromising the iPhone’s security, Apple would clearly have an enormous technological advantage over an outsider trying to do it. After all, Apple designed and built the iPhone. Much of the information needed to compromise its security presumably resides in the form of tightly held trade secrets that only Apple knows.

In short, outsiders trying to bypass the iPhone’s security measures would face some very high obstacles. But hackers can be ingenious and determined. And a group or nation-state unconcerned about the resulting legal implications might try to employ any number of methods—including hacking into Apple’s corporate systems, finding a rogue current or former Apple employee, or plain old reverse engineering—to get information that might make it possible to create a backdoor.

The court order and Apple’s response also raise an interesting and important question about whether we should rethink the very definition of backdoor. If a product is designed in a way that would make it possible to create a backdoor, even though none has yet been created, does that mean that in some sense a backdoor already exists?

And there is a further point: Opponents of government-mandated backdoors often argue, with good reason, that relying on government goodwill alone would provide far too little privacy protection. But, for some of the same reasons, isn't relying on company goodwill problematic as well?