Facebook’s Encryption Makes it Harder to Detect Child Abuse

Opinion: The social network needs to develop better ways to help stop the spread of millions of harmful images.

In 2018, the National Center for Missing and Exploited Children received more than 18 million reports to its CyberTipline, containing 45 million images depicting child sexual abuse. Most of these children were under the age of 12, and some were as young as a few months old.

Since its inception in 1998, the CyberTipline has received a total of 55 million such reports. Those from 2018 alone account for roughly a third of all reports over the past two decades.

These staggering numbers still don't capture the full scope of the problem across online services. Most of NCMEC's reports are generated automatically by PhotoDNA, an image-hashing technology I helped develop, which extracts a distinct signature from each uploaded image and compares it against the signatures of known harmful or illegal content. Flagged content can then be instantly removed and reported.
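To make the mechanism concrete, here is a minimal sketch of that screening flow. PhotoDNA itself is proprietary, so the sketch substitutes an open perceptual hash (pHash, from the Python `imagehash` package), and the known-signature value is a made-up placeholder; only the shape of the pipeline matches the description above.

```python
# A PhotoDNA-style screening sketch using an open perceptual hash (pHash,
# via the `imagehash` package) as a stand-in for the proprietary algorithm.
from PIL import Image
import imagehash

# Signatures of known harmful images (placeholder value: a 64-bit pHash, hex).
KNOWN_SIGNATURES = [imagehash.hex_to_hash("d1c4f0f0e0c89664")]
MATCH_THRESHOLD = 5  # maximum Hamming distance still counted as a match

def is_known_image(upload_path: str) -> bool:
    signature = imagehash.phash(Image.open(upload_path))
    # Perceptual hashes tolerate resizing, recompression, and small edits,
    # so matching uses a distance threshold rather than exact equality.
    return any(signature - known <= MATCH_THRESHOLD
               for known in KNOWN_SIGNATURES)
```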

But not every online service uses PhotoDNA. And child sexual abuse material shared via the dark web, personal correspondence, and services that use end-to-end encryption generally doesn't get reported to NCMEC or anyone else. Frustratingly, Facebook, the world's largest social network, is set to grow the digital realm in which images of child sexual abuse can spread freely.

Earlier this year, Facebook CEO Mark Zuckerberg announced that his company is expanding the use of end-to-end encryption on its services, preventing Facebook or anyone else from seeing the contents of communications. Zuckerberg conceded that this comes at a cost. “Encryption is a powerful tool for privacy, but that includes the privacy of people doing bad things,” he said. “When billions of people use a service to connect, some of them are going to misuse it for truly terrible things like child exploitation, terrorism, and extortion.”

Broader adoption of end-to-end encryption would cripple the efficacy of programs like PhotoDNA, significantly increasing the risk and harm to children around the world. It would also make it much harder to counter other illegal and dangerous activities on Facebook's services. This move also doesn’t provide users with as much privacy as Zuckerberg suggests. Even without the ability to read the contents of your messages, Facebook will still know with whom you are communicating, from where you are communicating, and a trove of information about your other online activities. This is a far cry from real privacy.

Knowing that tens of millions of examples of the most heartbreaking imagery pass through its services every year, why would Facebook undermine its own ability to prevent its platforms from becoming a safe haven for child predators?

The not-so-cynical answer is that Facebook is leveraging the backlash from its recent privacy scandals to launch a strategy that provides plausible deniability against the equally loud accusations that the company is not doing enough to suppress child abuse material, terrorist propaganda, crime, and dangerous conspiracies. By encrypting the content moving through its services, Facebook gets a twofer: It can claim to be ignorant of the abuse while also telling the public that it cares about privacy. Neither claim is true.

Many in law enforcement have argued that a shift to end-to-end encryption would severely hamper criminal investigations and national security. The US attorney general, his British and Australian counterparts, and the 28 European Union member states have all urged Zuckerberg to delay the implementation of end-to-end encryption until proper safeguards can be put in place.

Facebook's move has reawakened the fraught debate over whether governments should have a way to pierce encryption. I argue that governments that operate under the rule of law should, with a warrant, be granted the same access to our electronic lives as they are to our physical lives. Government overreach or abuse can be adjudicated by the courts, and Facebook can choose not to deploy its services in countries where governments cannot be trusted.

We should continue to debate how to balance the incremental privacy afforded by end-to-end encryption against the cost to our safety. But even now, Facebook can protect our children while widening its use of encryption.

Recent advances in encryption and hashing mean that technologies like PhotoDNA can operate within a service that offers end-to-end encryption. Certain encryption schemes, known as partially or fully homomorphic, allow computations such as image hashing to be performed directly on encrypted data. This means that images in encrypted messages can be checked against known harmful material without Facebook or anyone else being able to decrypt the image. The analysis reveals nothing about an image's contents, preserving privacy, unless it is a known image of child sexual abuse.
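For illustration only, the sketch below uses the additively homomorphic Paillier scheme, via the Python `phe` (python-paillier) package, to compute the Hamming distance between an encrypted signature and a known one without decrypting the sender's bits. It is a toy: the 16-bit hash values are made up, only the matching step is shown, and a production protocol would also blind the result and perform the threshold test under encryption.

```python
# Toy homomorphic matching with the `phe` (python-paillier) package.
# Paillier is additively homomorphic: ciphertexts can be added together and
# combined with plaintext values without ever being decrypted.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

# Sender's side: encrypt each bit of the image's signature (made-up values).
image_hash = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
encrypted_bits = [public_key.encrypt(b) for b in image_hash]

# Matching side: holds a known-abuse signature in the clear and computes the
# Hamming distance homomorphically. XOR with a known plaintext bit k is
# enc(b) when k == 0 and enc(1 - b) when k == 1.
known_signature = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 1, 1, 1]
encrypted_distance = sum(c if k == 0 else 1 - c
                         for c, k in zip(encrypted_bits, known_signature))

# Only the private-key holder can recover the distance; nothing about the
# image's contents is revealed to the matcher.
distance = private_key.decrypt(encrypted_distance)
print("match" if distance <= 2 else "no match")
```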

Another option is to implement image hashing at the point of transmission, inside the Facebook apps on users' phones, rather than after upload to the company's servers. That way, the signature would be extracted before the image is encrypted and then transmitted alongside the encrypted message. This would still allow a service provider like Facebook to screen for known images of abuse without revealing the contents of the encrypted message, as sketched below. Facebook would be wise to adopt either of these options.
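Here is a minimal sketch of that client-side ordering, again using pHash from `imagehash` for the signature and, purely as a stand-in for a real end-to-end channel, Fernet symmetric encryption from the `cryptography` package; `prepare_outgoing_image` and the payload layout are hypothetical.

```python
# Client-side hashing before encryption: the provider receives a screenable
# signature alongside the ciphertext, but never the plaintext image.
# Fernet here is only a stand-in for a real end-to-end encryption protocol.
from cryptography.fernet import Fernet
from PIL import Image
import imagehash

def prepare_outgoing_image(path: str, channel_key: bytes) -> dict:
    # 1. Extract the perceptual signature on-device, before any encryption.
    signature = str(imagehash.phash(Image.open(path)))

    # 2. Encrypt the image bytes for the channel (channel_key must be a key
    #    produced by Fernet.generate_key()).
    with open(path, "rb") as f:
        ciphertext = Fernet(channel_key).encrypt(f.read())

    # 3. Ship the plaintext signature alongside the ciphertext so the
    #    provider can screen it without decrypting the message.
    return {"ciphertext": ciphertext, "signature": signature}
```

The essential design choice is the ordering: hashing happens on the device before encryption, so the provider can check the signature against known material while the message itself remains unreadable to it.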

We do not need to cripple our ability to remove some of the most harmful and heinous content in the name of an incremental amount of privacy. Zuckerberg has repeatedly expressed his desire to “get it right” this time. The technology exists to get it right. Facebook needs to now do what its leaders and everyone else know is the right thing: protect our children.



