
Don't Crack the iPhone: Apple Is Not the FBI's Tech Support

Apple is right to push back against government efforts to undermine iPhone security.

By Max Eddy
February 19, 2016

This week, the notoriously secretive Apple went up against the FBI when the agency requested that the company help break open an iPhone 5c used by the San Bernardino shooters. Instead, Tim Cook released a letter stating publicly that Apple believed creating a special tool to disable security features on iPhones would set a dangerous precedent. And he was right to do so.

Knock, Knock
To be clear: The FBI is not specifically asking Apple to provide an always-accessible backdoor into everyone's iPhone. What it wants Apple to do is construct a special version of iOS that would bypass the escalating time delays iOS imposes between failed passcode attempts and disable the setting that wipes an iPhone after 10 failed attempts. This would allow agents to brute force the phone's passcode: try lots and lots of wrong passwords until stumbling across the right one. (It's been suggested that creating a super-long passcode might slow down that process.)
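To see why passcode length matters so much once the retry delays and the 10-attempt wipe are out of the way, here is a rough back-of-the-envelope sketch. The guess rate is an assumption for illustration (roughly the ~80ms per attempt that iOS key derivation is commonly said to cost in software), not a measured figure:

```python
# Hypothetical worst-case brute-force time for numeric passcodes,
# assuming the retry delays and 10-attempt wipe are disabled and
# guesses cost ~80ms each (about 12.5 guesses per second).
# The rate is an illustrative assumption, not a benchmark.

GUESSES_PER_SECOND = 12.5  # assumed; real hardware varies

def worst_case_seconds(length: int, alphabet_size: int = 10) -> float:
    """Seconds needed to try every passcode of the given length."""
    return (alphabet_size ** length) / GUESSES_PER_SECOND

for digits in (4, 6, 10):
    days = worst_case_seconds(digits) / 86400
    print(f"{digits}-digit PIN: {days:,.2f} days worst case")
```

Under these assumptions a 4-digit PIN falls in minutes and a 6-digit PIN in under a day, while a long alphanumeric passphrase pushes the search into years, which is why a longer passcode blunts exactly the attack the FBI's requested tool would enable.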

The argument that law enforcement and the intelligence community are responsible enough to handle powerful tools like these has been around for a long time. The FBI and others have long warned of the risk of "going dark," in which communications would move to encrypted services that are inaccessible to investigators or surveillance tools.

The old adage is that a backdoor for the good guys is a backdoor for the bad guys; the safest way to keep people out is to not give them a way in. It's the argument Nico Sell, co-founder and co-chairman of the secure messaging service Wickr, made when FBI agents approached her about putting a backdoor in Wickr. But it's far from just a hypothetical argument.

Broken Locks
Take TSA-compliant luggage locks. When you buy one from the store, it's designed to accept one of several possible master keys in the hands of TSA agents. The idea is that this allows the right people—the TSA inspectors—to open your luggage without having to cut off locks and then safely lock the baggage again. Only you and the inspectors should be able to open it.

It's a nice idea, but it only works as long as access to the master keys is restricted to the right people. The good guys. Instead, photos of those master keys were posted online and turned into 3D-printable models, giving everyone access: good guys and bad guys alike.

It's this scenario that Apple cites as its primary reason for fighting the FBI's court order. If Apple created a special version of iOS and used it to allow the phone in question to be unlocked, it might not stay under the company's control for very long. If it got loose, it could undermine the hard work Apple has put in developing a smart, secure phone. If it exists at all, Apple could be compelled to use it again, and again, and again.

To be fair, Apple already spends a good deal of time and effort responding to court orders and investigators' requests. The New York Times reports that the company handed over the shooter's iPhone backup files that were stored on iCloud. When PCMag recently looked closer at encryption and how Apple stores our information, we found that as long as data sits on Apple's servers, it is potentially readable by the company. But Apple is making it clear that it is only willing to go so far, and developing custom intrusion tools for the FBI is apparently the limit.

Security for All
As our devices become more and more personal, it is no surprise that they'll be targeted by law enforcement and intelligence agencies. But that's no excuse for the FBI, or anyone else, to weaken existing security tools and treat digital privacy as the exclusive realm of those in power. Yet that is, effectively, what the agency is doing.

Back in 2014, FBI Director James B. Comey addressed the crowd at the RSA Conference. When it came to surveillance and searches, particularly of the digital kind, he said, "Our goal is to be surgical and precise in what we're looking for, and do whatever we can to protect privacy rights and competitive advantage." Building a magic key for iPhones, or preventing the widespread use of encryption, would do neither.

If the FBI wants to get into an iPhone, or any other secure device, it can develop the technology itself. Security experts often tell me that if someone wants to break into a phone and has physical access to it, they will eventually succeed. I'm confident that if the FBI rolled up its sleeves, it would get what it's looking for. If accused murderer and bath salts enthusiast John McAfee thinks he can pull off cracking an iPhone, surely the FBI can, too.

