YouTube Will Police Political Videos More Closely

The social media site says it will remove manipulated videos and content that promotes conspiracy theories, marking a contrast with Facebook.
InfoWars' Alex Jones has pushed conspiracy theories about the Newtown school shooting, among others. Photograph: Andrew Harrer/Bloomberg/Getty Images

As the 2020 election begins in earnest with Monday’s Iowa caucuses, YouTube is cracking down on election-related disinformation in an attempt to protect users. Google’s video-sharing site has long had policies barring deceptive practices and engagement manipulation, but in a blog post on Monday, the company specified that the rules would also apply to a number of popular political conspiracy theories that have plagued the platform for much of the past decade.

So-called “birther” videos that argue “a candidate is not eligible to hold office based on false information about citizenship status requirements to hold office in that country” will no longer be permitted, YouTube noted, in a clear nod to the outsized role the platform played in advancing conspiracy theories about former president Barack Obama’s citizenship status in 2008 and 2012. (Prominent proponents included Donald Trump, who made headlines in 2012 with a YouTube video calling for Obama to release additional records to verify his citizenship.)

YouTube said it will also bar videos that have been manipulated or doctored to deceive users, including content that “has been technically manipulated to make it appear that a government official is dead.” That rule would have removed videos supporting conspiracy theories such as one that took root last year suggesting that Supreme Court Justice Ruth Bader Ginsburg was secretly dead.

It’s a sign that YouTube is doubling down on its comparatively aggressive stance on political misinformation at a time when its social media bedfellows have attracted criticism for their hesitance to do the same. Last May, when a misleading, doctored video of House Speaker Nancy Pelosi began making the rounds on social media, YouTube, notably, was the first major platform to take it down, while Facebook, contentiously, chose to leave it up.

The two sites also diverge on their approach to lies by political figures. Facebook has said it will not remove ads by politicians, even when they contain untruths. YouTube doesn’t permit politicians to blatantly lie in ads, though it does allow for the occasional “political hyperbole” so long as it doesn’t significantly undermine trust in democracy, the company noted in December. Both companies say they will prohibit content that spreads misinformation about the upcoming census or voting processes.

YouTube took a page from Twitter—which banned political ads outright last October—when it announced in December that it would no longer allow campaigns to micro-target voters with political ads. Despite calls from critics, Facebook has yet to do the same.
