
Facebook's version of political neutrality isn't neutral

Its stance isn't working.

Yesterday, Facebook outlined how it will prevent the 2020 elections from being hacked and influenced in the same way they were in 2016. That includes crackdowns on "coordinated, inauthentic behavior" and securing the accounts of individual politicians and campaigns. But one thing that Facebook still won't do is make sure that the content of political ads on its site is truthful.

Under the new rules, Facebook pages will have to show their confirmed owner, with state-controlled media more clearly labeled. The amount of cash spent by each campaign will be easier to track on a dedicated "candidate spend tracker." Facebook will also ban any advertisements designed to discourage people from voting, or to suggest that voting, in general, is pointless.

But these things do not necessarily mean Facebook is now in lockstep with broadcast media, which operates under far tighter rules about what it can and can't do. The social network is not governed by the same laws, which is perhaps what enabled it to receive so much cash in 2016. According to TechCrunch, both campaigns spent around $81 million on Facebook alone.

In September, Facebook's head of policy said that the site was available for anyone to use, without moderation. "To use tennis as an analogy," said Nick Clegg, "our job is to make sure the court is ready." "But," added the former British Deputy Prime Minister, "we don't pick up a racket and start playing. How the players play the game is up to them, not us."

And, at the risk of writing a defense of both Facebook and Clegg, it's a perfectly valid stance. As a private company, not currently under much regulation, it decides what speech is acceptable on its platform. And, beyond an acceptable use policy, if it chooses to allow any other form of speech, then it is well within its rights to do so.

If Facebook started regulating political ads, it would leave itself open to a number of questions about how, and why, it uses its power. Not to mention that those critical of Facebook are asking it to somehow solve a problem that has dogged politics for centuries. How do you define a lie in politics, and is it possible to do so at the time the contentious statement is made?

In 2009, the UK's Conservative Party campaigned against something it called a "Jobs Tax." It was a slogan against a planned one percent rise in national insurance contributions, which fund the country's healthcare and pension provisions. The Conservatives said that the "Jobs Tax" would destroy the economy, kill off growth and cause mass unemployment.

It was surprising, then, when the Conservatives, by that point in power, implemented much the same policy a few years later. Was that a lie that the party used to get into power, and if so, should it be punished for doing so? How would you police such a rule? These are the questions we're expecting Facebook to answer right now.

Beyond simply regulating the content of political speech, Facebook would also be tasked with defining its limits. What's the line between you complaining about your local transit company or healthcare provider and making a political statement? And do you want Mark Zuckerberg and his employees making those decisions?

Facebook has also faced criticism (when doesn't it?) for apparently playing favorites with media outlets in other fields. A Financial Times report claims that a new news tool it's currently building will pay Bloomberg and Dow Jones for content, but not Reuters or the Associated Press. From outside Facebook, the decisions seem arbitrary, picking winners and losers without any discernible logic.

Facebook cannot be an entirely neutral party. It already makes plenty of decisions about permitted speech, as its lengthy use policies make clear. It already has definitions of what people can, and cannot, say about others, and those policies have adapted over time. It has an army of fact checkers and moderators at its disposal and could, very easily, apply them to the sort of campaign ads it has chosen to exempt.

And other media outlets, like CNN, have already refused to run ads that contain "demonstrably false" assertions. But behaving like CNN would leave Facebook open to the risk of regulation, something it seems intent on ducking. Cynically speaking, the company would also be leaving a lot of money on the table if it turned down these ads.

It's not outside of Facebook's power to verify the content posted by campaigns right now. And it would be easier still if Facebook limited ad-buying to a small group of verified bodies, such as the national parties and the candidates themselves. Not only would that make moderation easier, it would also prevent malign third parties from getting involved.

But that would undermine Clegg's "tennis court" metaphor, in which Facebook presents itself as a neutral party. And if one party is prepared to cheat while another is not, then refusing to act will look like Facebook is taking a side.
