
Execs Grapple With the 'Responsibility' of Facebook

At the Code Conference, Sheryl Sandberg apologized again for Facebook's late response to the Cambridge Analytica debacle, but said the issue now is making sure it doesn't happen again.

May 30, 2018

Facebook COO Sheryl Sandberg and CTO Mike Schroepfer took the stage at the Code Conference in Rancho Palos Verdes last night to discuss changes the company has made in the wake of the Cambridge Analytica scandal.

Interviewed by conference moderators Kara Swisher and Peter Kafka, Sandberg said the company now understands it was too late in responding to the privacy concerns raised by Cambridge Analytica. "We definitely know we were late. We said we are sorry, but sorry isn't the point," she said.

Instead, Sandberg said, it's important to think about Facebook's responsibility in a different way. For the past 10 to 12 years, the company focused on building and enabling social experiences, sometimes neglecting to consider how the platform could be misused or abused. "Now we are understanding the responsibility we have and trying to act accordingly," she said.

There is a "fundamental tension" between tools that allow for easy, free expression and keeping people safe, Schroepfer added. Facebook wants to facilitate discussion, but also make sure the platform doesn't host hate speech or posts designed to manipulate elections.

The Cambridge Analytica Problem

The Cambridge Analytica issue dates back at least 10 years, when people were talking about wanting to "take data with them," so Facebook developed APIs to help them do so. In those days, Schroepfer said, Facebook was optimistic and focused on the fact that entrepreneurs could use its data to develop new applications. It also thought people who used those apps understood what was happening.

By 2014, Facebook had decided to restrict access to such data and started a more proactive review of applications. But by then, Cambridge Analytica had already gotten ahold of Facebook data. Why did Facebook learn about this from the press? Once the data was outside Facebook, the company could no longer observe it, Schroepfer said.

Facebook immediately disabled the app that scraped the data and tried to figure out who had accessed it. After Facebook zeroed in on Cambridge Analytica, the firm insisted it had deleted the data, but that might not be the case, Schroepfer acknowledged.


The company is now focused on identifying theoretical ways people could get ahold of user data, he said, and has made investments in security, content review, and development.

Looking back, "we wish we had more controls in place," Sandberg said. She noted that despite legal assurances from Cambridge Analytica that it had deleted the data, "we should have audited them." She said that in recent months the company has moved to do just that, though the audit is currently on hold pending a UK government review.

In the run-up to the 2016 election, people were mostly worried about spamming and phishing emails, Sandberg noted. Though the company took steps to avoid those problems, it didn't anticipate the different, "more insidious threats" that were coming. It is now very aware of these threats, and has taken aggressive action in this area, Sandberg said.

Sandberg pointed to the deletion of fake accounts and Facebook's work with governments to help prevent similar occurrences around other elections, citing efforts around votes in Alabama, Germany, and France. "We are showing that we are taking steps to make it better," she said.

Sandberg also mentioned that Facebook has "always had" tools to control how users share data with applications, and has now moved these tools to the top of the News Feed. The company is also building new tools on top of these controls.

Didn't See It Coming

Swisher asked how Facebook could possibly have failed to understand the potential for the misuse of its platform, and talked specifically about missteps with Facebook Live. Sandberg pushed back, and said that "Live is a great example" of how the company fixes things. She noted that when Live launched, there was "a lot of good, [as well as] things that were wrong." So now the company has human review of anything live within minutes. As a result, there have been posts taken down right away, and times when the company intervened and helped people.

Facebook has an open platform and knows it will never prevent all the bad things, Sandberg said, but the company can be more transparent and put more resources into making a safe community. The company has deleted 1.3 billion fake accounts; published the internal guidelines it uses to judge whether content should be taken down; and now removes 99 percent of terrorist content and 96 percent of adult photos and sexual content, but only 38 percent of hate speech, before users report it to the company.

"We won't get it all," Sandberg admitted, but Schroepfer said Facebook has made more progress on this than he thought it would be able to.

Fake News

On the problem of fake news, Sandberg said much of it comes from fake accounts, so taking those down reduces the problem. Another big source is economically motivated, so the company is moving to kick bad actors out of its ad networks. She also said the company is working on being more transparent, so users can see who is behind any political or issue-based post, which makes it easier to spot and report problems.

Regulation

Asked about regulation, Sandberg said the company is already subject to rules such as GDPR. "The question isn't if there will be more regulation, but what kind of regulation," she argued.

Sandberg said Facebook has spent a lot of money and built complex systems to comply with GDPR, and acknowledged that regulation can entrench big companies. She also worries about unintended consequences, noting that Caller ID was originally considered an invasion of privacy, so there was regulation preventing it.

Asked if Facebook is a monopoly and should be broken up, Schroepfer said there's competition in the market, citing YouTube for video sharing, Twitter for posting public comments, and Snapchat, WeChat, and iMessage for messaging. "Consumers use the products they want," he said, noting Facebook is "a very small part" of the overall advertising market.

Apple vs. Facebook

Asked about Apple CEO Tim Cook's criticisms of the company, Sandberg said, "We strongly disagree with their characterization of our products and business model," noting that as a free service, Facebook is available to people all over the world.

"We've looked at subscriptions and will continue to do so," Sandberg said, but emphasized that the heart of the product will continue to be a free service.

Hearing about the terrible things that happen on the platform has made the company focus on new priorities, Schroepfer said. "It's not fun, but it's really important work." He also said the focus on safety and security is the "biggest cultural shift" he's seen at the company.

Though Facebook is focused on the need to provide safety, security, and integrity on the platform, the company "understand[s] it will be an arms race," and there will be risks it will not anticipate, Sandberg acknowledged.

About Michael J. Miller

Former Editor in Chief

Michael J. Miller is chief information officer at Ziff Brothers Investments, a private investment firm. From 1991 to 2005, Miller was editor-in-chief of PC Magazine, responsible for the editorial direction, quality, and presentation of the world's largest computer publication. No investment advice is offered in this column. All duties are disclaimed. Miller works separately for a private investment firm which may at any time invest in companies whose products are discussed, and no disclosure of securities transactions will be made.
