Future Tense

Facebook’s So-Called Pivot to Privacy Is a Diversion

Zuckerberg’s new pledge to create a “privacy-focused platform” sure seems to take a self-servingly narrow view of privacy.

Mark Zuckerberg. Photo illustration by Slate. Photo by Bertrand Guay/AFP/Getty Images.

Mark Zuckerberg would like you to think that winter is coming for Facebook’s privacy-invasive past. In a 2,237-word manifesto published on Facebook on Wednesday, Zuckerberg wrote that his company would be pivoting toward a “privacy-focused messaging and social networking platform” more interested in being the “digital equivalent of the living room” than “the digital equivalent of a town square.”

The news comes about a month after it was first leaked that Facebook was in the early stages of integrating the messaging services of its disparate platforms—Facebook proper, Instagram, and WhatsApp—to create an interoperable, encrypted messaging system by the end of 2019 or early 2020. The logic behind the push, as Zuckerberg puts it in his post, is that “people should be able to use any of our apps to reach their friends, and they should be able to communicate across networks easily and securely” since “many people prefer the intimacy of communicating one-on-one or with just a few friends.” Zuckerberg says that privacy would be paramount in this digital living room, so much so that he mentioned the word privacy (and derivations thereof) 50 times in this most recent note.

On its face, some of his plans are positive. Ephemeral messaging, already available on Instagram, can give people more confidence to share. End-to-end encryption, already used on WhatsApp, helps to ensure that the only people who can decipher a message are the sender and receiver. Indeed, just giving privacy a seat at the table may seem like a good start for Facebook, whose track record of protecting our data is dubious at best.

And yet, I’m concerned. Zuckerberg’s post isn’t just a way to gin up interest in some new Facebook features. It’s a diversion, a magician’s misdirection full of red herrings. When it comes to privacy, Facebook has been getting into trouble, deflecting, apologizing, and failing to deliver on promises of meaningful privacy protections for more than a decade. And its CEO wants to distract us from that record with a few well-placed changes so we miss his dangerous inaction elsewhere. Even taking him at his word—a generosity Facebook certainly hasn’t earned—Zuckerberg’s essay shows that he fundamentally misunderstands what “privacy” means. Read more cynically, the post seems to use a narrow definition of the concept to distract us from the ways Facebook will likely continue to expand its invasion of our digital private lives for profit.

In his writing, it seems that when Zuckerberg thinks about privacy, he thinks about encryption. He talks about people “interact[ing] privately” with a “shift to private, encrypted services” where “[p]eople’s private communications should be secure.” But privacy is not the same as encryption. Encryption is about making data impossible for an outsider to read and understand. Privacy is a far broader concept, covering not only the flow of information among individuals and groups, but also personal, intellectual, and sexual autonomy, and the trust necessary for social interaction. In practice, privacy is about limiting data collection, placing restrictions on who can access and manipulate user data, and minimizing or barring data from flowing to third parties. Zuckerberg mentions none of that in his essay. When he talks about encrypting the messages users send to prevent “anyone—including [Facebook]—from seeing what people share on our services,” he neglects to mention that Facebook will still be able to collect the metadata from these messages, like whom individual users message and when. When he talks about interoperability, he glosses over whether the merger may require users to give up the anonymity they may have on WhatsApp to comply with Facebook’s real-name requirements. When he talks about a new digital living room, he conveniently leaves out the advertisers that will be invited into these spaces, too, along with all the new ways platform connections will allow our information—profile data, messaging activity, clicks and hovers, interactions, GPS location, outside browsing history, and app use—to be used to help Facebook target ads in even more invasive ways.
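To make that content-versus-metadata distinction concrete, here is a minimal, hypothetical sketch in Python. It uses the third-party cryptography package and invented sender and recipient names, and it illustrates the general principle rather than Facebook's actual systems: the message body is encrypted and unreadable to the platform, but the routing metadata, who is talking to whom and when, stays in the clear.

```python
from datetime import datetime, timezone
from cryptography.fernet import Fernet  # pip install cryptography

# Hypothetical: a key held only by the two chat participants, never by
# the platform (a stand-in for real end-to-end key exchange).
shared_key = Fernet.generate_key()
cipher = Fernet(shared_key)

# What a sender's device might hand to the platform's servers.
envelope = {
    "sender": "alice@example.com",                         # visible to the platform
    "recipient": "bob@example.com",                        # visible to the platform
    "timestamp": datetime.now(timezone.utc).isoformat(),   # visible to the platform
    "ciphertext": cipher.encrypt(b"meet at 7?"),           # opaque blob
}

# The platform cannot read the message body...
print(envelope["ciphertext"])  # e.g. b'gAAAAAB...' (unintelligible without the key)

# ...but it can still log and analyze who talked to whom, and when.
print(envelope["sender"], "->", envelope["recipient"], "at", envelope["timestamp"])

# Only a device holding the shared key recovers the content.
print(cipher.decrypt(envelope["ciphertext"]))  # b'meet at 7?'
```

In other words, even a faithfully implemented end-to-end encrypted messenger can leave a rich trail of relationship and timing data for the operator to collect, which is exactly the kind of privacy question the encryption pledge does not answer.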

What’s more, despite Facebook’s professed pivot toward privacy, there’s no indication that, beyond working to encrypt its merged messaging systems, the new Facebook would look any different from the old one when it comes to these kinds of privacy-invasive practices. We can still expect it to surveil us wherever we go, even as we browse the internet outside Facebook’s ecosystem. Zuckerberg said in an interview with Wired that the curated, democracy-killing public sharing on the news feed isn’t going anywhere. And the company will still commoditize user data to make billions selling ads on Facebook, Instagram, and, soon, WhatsApp.

That these much more substantial privacy issues were not included in Zuckerberg’s vision isn’t surprising. Based on interviews and surveys I’ve conducted with current and former tech engineers as part of my research, I’ve found that many of them seem to mistake privacy for security. They treat privacy as little more than encryption or cybersecurity—protecting against unauthorized access to user data—because that’s what they learned in school and that’s what they were hired to do. This makes a certain sense, since security issues allow for solutions uniquely suited to an engineer’s skill set. Security is something that’s codeable.

But privacy involves bigger questions, like what user data is collected in the first place and how it should or could be used. Because of this, meaningfully creating “privacy-focused” systems requires technology companies to make choices from the start of the design process. Yet because engineers alone are often tasked with coming up with privacy fixes, we end up with technocentric patches rather than wholesale change.

Zuckerberg’s narrow vision of privacy likely comes, in part, from this engineering worldview. But given Facebook’s growth-at-all-costs history, I also think Zuckerberg knows the narrow definition of privacy he’s advancing offers the company a convenient smokescreen for what is, essentially, a profit-minded business move. Facebook, after all, is hemorrhaging users. Messaging services like WhatsApp are the fastest-growing social networks—and there’s little doubt Facebook is looking to use its merged messengers to globally dominate competitors in this space, like WeChat, iMessage, Signal, Skype, and even Venmo.

Then there’s the other way publicly announcing an intention to pivot to a “privacy-focused social network” might benefit Facebook’s bottom line: It might serve as an effective way of placating scandal-fatigued users with some evidence that Zuckerberg is actually taking action on our behalf. Messaging is one of the most visible parts of most people’s everyday experience online. Most of us regularly text, share photos, and participate in group chats. I have three group chats running simultaneously right now: one with professional colleagues, one with my family, and one with my boyfriend and his friends. We appreciate and understand this technology from a phenomenological, or physically experiential, perspective. When something changes about the way we send messages—like getting new options to send disappearing messages or getting a notification that our texts and images are now encrypted—we would likely see it and take notice.

But many of the ways Facebook will likely continue—or even expand—invading our privacy will remain out of view. Most of us would not notice if Facebook changed its metadata collection, restricted third-party access to our data, or stopped letting advertisers target us based on perceived race, ethnic origin, or gender expression. We wouldn’t see if Facebook stopped using Facebook and Instagram photos to feed its facial recognition A.I. Nor would many of us be able to detect the ways in which integrating and expanding these messaging services might make it easier to piece together new kinds of data about us, and harder to quit or break up the company when the inevitable scandal arises.

This is not to say that adding end-to-end encryption is a bad idea; it is, in fact, a great start. But it runs the risk of diverting our attention from two important facts: We need a lot more action to address privacy across these platforms, and Zuckerberg’s purported pivot to privacy never mentioned any of it.

Maybe this is the beginning of a new era for Facebook. Maybe protecting privacy will come to be seen as a competitive advantage in the market and maybe Facebook will become a leader in that market by challenging the behavioral advertising model, cutting its surveillance footprint, setting defaults to private, ending the news feed, erasing old user data, and more. But Zuckerberg’s essay makes me think we are in for more of the same from Facebook for now: narrow, technocentric changes intended to distract us from a negligent and cavalier approach to protecting user privacy.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.