Facebook’s Flawed DNA Makes It Unable to Fight Misinformation

Monday Note, Aug 20, 2018

by Frederic Filloux


The social network is built on values so shady that it can’t be trusted to address fake news. Some countries are already suffering the consequences. (This is part 2 of a series on misinformation. Part I is here.)

When it comes to fighting misinformation, Facebook doesn’t have a technical problem. It has a problem of will and resolve, which is deep-rooted in the questionable set of values the company is built upon. The good news: it can be reversed. The bad news: not with the current management of the company.

Let’s take a closer look.

1. The absolute leader syndrome

In fall 2016 at Stanford, I took a class on international politics given by Francis Fukuyama, titled Democracy, Development and the Rule of Law (which is also the name of the institute Fukuyama runs at Stanford). One of the lectures was about the inner structures of semi-democracies, authoritarian regimes, and totalitarian systems.

All of them share the same building blocks:
— Strong ideology
— Hyper-centralized and tightly held leadership
— Long-term vision
— Cult of Personality
— Will to control all aspects of society
— Little tolerance for any dissenting opinion, suppression of any internal opposition.

Listening to Fukuyama and his guest lecturers develop these ideas, I couldn’t help but think about Facebook. Looking at the social network, we find all the ingredients of an authoritarian system:

— The company professes an ideology of absolute transparency (except for its leaders), with the mantra “We connect people” deployed at the cost of people’s privacy.

— Mark Zuckerberg is the embodiment of a highly centralized power. As anecdotal as it sounds, you can feel it when walking through the company’s premises in Menlo Park, California. Almost at the geometric center of Building #20 sits the leader’s desk and, twenty feet away, his glass-walled conference room. It’s like watching a spider in its web. The feeling is awkwardly enhanced by a communications person explaining that “Mark” wanted this setting as a reminder of the necessary transparency, and, by the way, that the unfinished look of the building is deliberate: “Mark wants us to remember that the company is at 5 percent of what it will be” — scary, when you think about it.

— Facebook’s cult of Zuck is not accidental. It is actually staged. When the CEO visits “flyover country”, he has a former White House photographer trailing behind him. Entertaining the mystique of the threatened visionary (precisely an autocrat’s fixture), the company leaks the amount it allegedly needs to spend to protect its CEO (ten million bucks a year). It is difficult to see what makes Zuckerberg so different from Bill Gates at the height of his power, Apple’s Tim Cook, or Google’s founders and current CEO Sundar Pichai. All of them are global leaders who neither need nor seek such attributes of power.

— There is no doubt that Mark Zuckerberg engineered his grip over his company. Retaining 16 percent of the shares but 60 percent of the voting rights, and combining the titles of chairman and CEO, makes him the undisputed ruler of his world (which is, unfortunately, no longer a metaphor). A year ago, Zuck even tried to get more when he asked for the issuance of Class C shares that would have granted him lifetime control over the company, even if he sold 99% of his holdings. After some uproar, he backed down. (On the “pros and cons of dictatorship” in corporate governance, read this interesting analysis by Stefan Petry, a finance lecturer at Manchester University, in The Conversation.)

— As for control of every aspect of society, it is ingrained in Facebook’s fabric. The high-performing advertising machine hunts down every last glimpse of our privacy. Over the past few years, Facebook has built the ultimate surveillance apparatus, and it is far from finished; just look at Facebook’s know-how in facial recognition, its recent attempt to access our banking data, or its finally admitted tracking of non-Facebook users. We’ll never be able to say that we didn’t know.

— This hyper-centralized culture has tangible consequences in the company’s decision-making process. In Facebook’s system, country delegations are systematically deprived of any initiative, almost to a comical point: each time a sensitive issue is raised in a meeting outside Menlo Park, execs quickly shield themselves behind the need to check with headquarters. It is worth remembering that five years ago, Google had exactly the same flaw. It has since corrected it by giving local representatives more power to work independently with business partners. In doing so, Google made tremendous gains in speed and efficiency.

2. Combine the Ford Pinto Syndrome + the “R.O.W.” Factor and you get Facebook’s Misinformation Policy

For those who don’t remember, in the ’70s Ford sold a car whose fuel tank had a propensity to explode in a rear-end collision. Instead of quickly conducting a complete recall of the Ford Pinto, some bean counters in Dearborn, Michigan, figured out that it would be less expensive to deal with the lawsuits resulting from accidents than to make the necessary changes to the cars. (The best read on that matter is 1978: Ford Pinto Recall, in this abstract of Engineering Ethics: An Industrial Perspective.)

When it comes to fighting misinformation, Facebook is making exactly the same kind of calculation, augmented by the R.O.W. factor. In Silicon Valley jargon (and in most American accounting practices), the Rest of the World item encompasses countries that do not belong to the big money-making league (North America, Europe, China).

For good measure, Facebook added another sub-segmentation to the R.O.W. group: markets that are truly insignificant in terms of ARPU (Average Revenue Per User), like Myanmar. Olivia Solon, the San Francisco correspondent for The Guardian, put it in rather harsh terms last week when she recounted a recent phone briefing with Facebook about the deadly Myanmar blooper:

“This is the latest in a series of strategic mishaps as the social network blunders its way through the world like a giant, uncoordinated toddler that repeatedly soils its diaper and then wonders where the stench is coming from. It enters markets with wide-eyed innocence and a mission to “build [and monetise] communities”, but ends up tripping over democracies and landing in a pile of ethnic cleansing.”

For more on the matter, also read the compelling report by Steve Stecklow of Reuters.

OK. Most of the readers of the Monday Note (70 percent of you are in the US, folks) don’t give a rat’s ass about Myanmar and the Rohingya. But let’s think about it in a different way. Countries like Myanmar or the Philippines are the model Facebook wants to impose on the world (especially the developing one): become the de facto internet by subsidizing access for the poor.

With such grandiose ambitions might come great responsibilities, but Facebook doesn’t care.

A year ago, at a conference in Sydney, I met a woman I admire enormously. Maria Ressa is the courageous founder of Rappler, the largest independent news website in the Philippines (a nation of 106 million people, 70 percent of whom are on Facebook). Applying rigorous journalistic standards, Ressa and her team relentlessly stand up against the abuses of Filipino President Rodrigo Duterte. She explained to me how she and her newsroom were harassed by the dictator’s supporters, who called to “rape [her] to death”, using Facebook ad nauseam. Last December, the journalist Lauren Etter published this excellent piece in Bloomberg.

On many occasions, Maria met with Facebook execs (including Zuckerberg himself) and sent countless emails like this one, quoted in the Bloomberg article:

“Please take a closer look at the Philippines. While you’ve taken action in Europe, the danger is far worse for us, and Facebook is the platform they use to intimidate, harass, and attack. It is dangerous. I fear where this may lead. Best, Maria.”

In reply to her request for comment, the reporter from Bloomberg got this from Facebook:

“We are committed to helping ensure that journalists around the world feel safe on Facebook as they connect their audiences with meaningful stories. We permit open and critical discussion of people who are featured in the news or have a large public audience based on their profession or chosen activities, but will remove any threats or hate speech directed at journalists, even those who are public figures, when reported to us.”

Facebook’s DNA is based on the unchallenged power of an exceptional but morally flawed — or at least dangerously immature — leader who sees the world as a gigantic monetization playground. In Mark Zuckerberg’s world, the farther from home, the more leeway he feels to experiment with whatever comes to his prolific mind. Yielding to the Ford Pinto syndrome, he feels little incentive to correct the misuse of the tools he created. And he has managed to ensure that no one stands against him.
• • •
In a future Monday Note, we’ll look at an objective constraint Facebook faces when it comes to fighting misinformation: whatever measures it takes will collide with the company’s business model. Tough choices will need to be made.

frederic.filloux@mondaynote.com
