Editor’s Note: Andrew Grotto directs the program on Geopolitics, Technology and Governance at Stanford University, and is a visiting fellow at the Hoover Institution. He served as the senior director for Cyber Policy on the National Security Council in the Obama and Trump White Houses. The opinions expressed in this commentary are his own.

Despite pressure from President Donald Trump and Attorney General William Barr, Apple continues to stand its ground and refuses to reengineer iPhones so law enforcement can unlock the devices. Apple has maintained that it has done everything required by law and that creating a “backdoor” would undermine cybersecurity and privacy for iPhone users everywhere.

Apple is right to stand firm in its position that building a “backdoor” could put user data at risk.

At its most basic, encryption is the act of converting plaintext (like a credit card number) into unintelligible ciphertext using a very large, random number called a key. Anyone with the key can convert the ciphertext back to plaintext. Those without the key cannot, meaning that even if they acquire the ciphertext, it should be computationally infeasible for them to recover the underlying plaintext.
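To make that concrete, here is a minimal illustration in Python. It uses the open-source “cryptography” package’s Fernet recipe purely as an example; Apple’s own implementation is different and far more involved. The point is simply that the key, not the software, is what stands between ciphertext and plaintext.

    from cryptography.fernet import Fernet, InvalidToken

    # A key is, in essence, a very large random number.
    key = Fernet.generate_key()

    plaintext = b"4111 1111 1111 1111"            # e.g., a credit card number
    ciphertext = Fernet(key).encrypt(plaintext)   # unintelligible without the key

    # Anyone holding the key can convert the ciphertext back to plaintext ...
    assert Fernet(key).decrypt(ciphertext) == plaintext

    # ... but anyone holding a different key cannot.
    try:
        Fernet(Fernet.generate_key()).decrypt(ciphertext)
    except InvalidToken:
        print("Without the correct key, decryption fails.")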

For iPhones, the PIN the user enters to unlock the phone is used to derive the encryption key for the phone’s data. While the phone is locked, the data is encrypted. Ten failed attempts at unlocking the phone cause the phone to wipe the key, rendering the data on it unrecoverable. Apple does not record users’ PINs, so once a user’s PIN is lost, so is the data on the phone, unless that data is backed up in iCloud or elsewhere. This means that a court-ordered search warrant is necessary, but not always sufficient, to access data on these devices; to access iPhone data, you must also have the user’s PIN.
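A simplified, hypothetical sketch shows the two ideas at work: a key derived from a short PIN, and that key being wiped after ten bad guesses. Real iPhones do this in dedicated hardware (the Secure Enclave) and entangle the PIN with a device-specific secret, so the code below illustrates the concept, not Apple’s design.

    import hashlib
    import os
    import secrets

    salt = os.urandom(16)   # stand-in for a per-device secret

    def derive_key(pin: str) -> bytes:
        # A deliberately slow derivation makes brute-forcing a short PIN costly.
        return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 200_000)

    stored_key = derive_key("1234")   # the key protecting the phone's data
    failed_attempts = 0

    def try_unlock(guess: str) -> bool:
        global stored_key, failed_attempts
        if stored_key is None:
            return False                          # key already wiped; data unrecoverable
        if secrets.compare_digest(derive_key(guess), stored_key):
            failed_attempts = 0
            return True
        failed_attempts += 1
        if failed_attempts >= 10:
            stored_key = None                     # ten failures: wipe the key
        return False

    for _ in range(10):
        try_unlock("0000")
    print(try_unlock("1234"))   # False: even the correct PIN no longer helps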

That is the crux of the dispute between Apple and the Trump administration: The administration wants Apple to reengineer its operating system so that law enforcement can access the data without knowing a device’s PIN. President Obama’s Justice Department tried to force changes on Apple in 2016 in the aftermath of a terrorist incident in San Bernardino, California, but dropped the effort after reportedly paying a company more than $1 million to hack into the shooter’s iPhone.

Law enforcement’s argument that warrant-proof communications could allow bad actors to get away scot-free is valid. The fact is, the more widely deployed strong encryption is, the more likely it is that communications of interest to law enforcement will simply be beyond its reach.

But that fact alone does not settle the matter, because it is also true that requiring Apple to reengineer its products to provide law enforcement with access carries its own set of risks. As Apple and others note, there is no way to engineer law enforcement access without introducing risks that bad actors could exploit for their own criminal purposes.

Much of what makes encryption such a hard policy problem is that it boils down to a lesser-of-two-evils choice: Do we want more criminals using encryption to get away with crimes, or more bad actors exploiting the vulnerabilities made possible by creating “backdoors” for law enforcement? We know that both choices trade one risk for the other, but we don’t have hard data on the magnitude of those risks. This makes a cost-benefit analysis of the two positions difficult.

There is one consideration, however, that tips the balance — at least for now. If Apple were required to reengineer its products to enable law enforcement access, it could spark a cat-and-mouse game between law enforcement and those who wish to thwart them.

A cottage industry of third-party encryption applications for iPhones and other mobile devices would sprout up overnight if Apple were compelled to build backdoors. The developers behind these products could be anywhere in the world, and many of them would ignore US laws demanding that they facilitate law enforcement access. What then? Do we ban such apps? How do we enforce the ban? Do we criminalize the possession of these apps and products?

We have a tradition in the United States of setting a high bar for authorizing governmental interventions, with the burden of proof falling squarely on those advocating for the intervention. That burden has not yet been met.