
Mark Zuckerberg really doesn't want other governments to copy Germany's super strict 'Facebook law'

Facebook CEO Mark Zuckerberg.
Mark Schiefelbein/AP

  • Facebook chief executive Mark Zuckerberg has said he is open to governments regulating tech companies during an interview with Wired.
  • But he says guidelines are better than micromanagement and criticised Germany's newly introduced "Facebook law" which dictates how fast tech firms need to remove hate speech and other offensive content.
  • He said developments in artificial intelligence meant everyone now expects tech firms to be quicker on the draw in tackling hate content — something that wasn't true when Facebook first started out.
  • Zuckerberg made the comments during an apology tour over the massive Cambridge Analytica data scandal.


Facebook chief executive Mark Zuckerberg has indicated for the first time that he's open to governments around the world regulating tech — just as long as they don't copy the strict German model.

In an interview with Wired, Zuckerberg said: "[The] question isn't 'Should there be regulation or shouldn't there be?' It's 'How do you do it?'"

But he warned against governments micromanaging tech companies and dictating exactly how they handle privacy breaches, hate speech, and offensive content. He criticised Germany's Network Enforcement Act, introduced in January and nicknamed the "Facebook law," which requires tech firms to investigate hate speech complaints immediately and delete hateful content within 24 hours, or face fines of up to €50 million (£45 million).

Here's what Zuckerberg told Wired:

"I think when you start getting into micromanagement, of 'Oh, you need to have this specific queue or this,' which ... is the German model — you have to handle hate speech in this way — in some ways that's actually backfired. Because now we are handling hate speech in Germany in a specific way, for Germany, and our processes for the rest of the world have far surpassed our ability to handle, to do that. But we're still doing it in Germany the way that it's mandated that we do it there. So I think guidelines are probably going to be a lot better. But this, I think, is going to be an interesting conversation to have over the coming years, maybe, more than today. But it's going to be an interesting question."

In short, Facebook is open to regulation as long as governments keep the guidelines fairly flexible and don't nitpick at every piece of offensive content the firm doesn't delete. "I think guidelines are much better than dictating specific processes," Zuckerberg said.

He even made a bizarre comparison with how the US regulates chicken:

"[My] understanding with food safety is there's a certain amount of dust that can get into the chicken as it's going through the processing, and it's not a large amount—it needs to be a very small amount—and I think there's some understanding that you're not going to be able to fully solve every single issue if you're trying to feed hundreds of millions of people—or, in our case, build a community of 2 billion people—but that it should be a very high standard, and people should expect that we're going to do a good job getting the hate speech out."

Even considering regulation this strict is only possible because of advances Facebook and other tech firms have made in using artificial intelligence to identify content that violates their rules, such as nudity or terrorist propaganda.

Zuckerberg noted that when Facebook first started out, it already had a significant user base, but no one expected the company to catch "something bad" every time it was posted. Now the picture is different: Facebook is a public company that made more than $40 billion (£28.2 billion) in revenue last year, and it has built AI tools that can detect large amounts of offensive content.

He said: 

"Now that companies increasingly over the next five to 10 years, as AI tools get better and better, will be able to proactively determine what might be offensive content or violate some rules, what therefore is the responsibility and legal responsibility of companies to do that? That, I think, is probably one of the most interesting intellectual and social debates around how you regulate this."

Zuckerberg made the comments during his apology tour on Wednesday, during which he publicly said sorry for the ongoing Cambridge Analytica data scandal.

You can read the full interview with Wired here.

