In a flabbergasting interview with Recode, a news website, Facebook CEO Mark Zuckerberg told Kara Swisher, Recode’s co-founder, that Holocaust denial would not be removed on Facebook. In the lengthy conversation, Zuckerberg attempted to explain why he feels it’s wrong to remove the anti-Semitic revisionism from the social media platform, even though he personally finds it “deeply offensive”:
“MZ: …at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong, but I think-
KS: In the case of the Holocaust deniers, they might be, but go ahead.
MZ: It’s hard to impugn intent and to understand the intent. I just think, as abhorrent as some of those examples are, I think the reality is also that I get things wrong when I speak publicly. I’m sure you do. I’m sure a lot of leaders and public figures we respect do too, and I just don’t think that it is the right thing to say, ‘We’re going to take someone off the platform if they get things wrong, even multiple times.’”
This part of the conversation stemmed from headlines Facebook had made about freedom of speech just days earlier, after it posted several replies on Twitter to users concerned about fake news on the platform. Facebook said that it would not ban pages for sharing conspiracy theories or false news.
“We see Pages on both the left and the right pumping out what they consider opinion or analysis – but others call fake news. We believe banning these Pages would be contrary to the basic principles of free speech,” one tweet explained. “We just don’t think banning Pages for sharing conspiracy theories or false news is the right way to go.”
This policy, in itself, is enough to create near-universal outrage (especially in light of the role misinformation likely played in the 2016 US presidential election). Not banning pages for sharing what we’ve come to call fake news allows misinformation to spread far more easily on Facebook. But what’s more concerning than the spread of misinformation itself is the dangerous precedent this policy sets when it comes to policing content on social media.
In his conversation with Recode, Zuckerberg also referred to an internal Facebook practice of “moving things down” rather than removing harmful content. This means that, when you sign on to Facebook, hate speech and misinformation may not be the first thing you see, but instead the fourth or fifth thing you scroll past in your news feed. Rather than operating a global policy of removing offensive content, Facebook ensures that outright lies and conspiracies (e.g. the claim that the Sandy Hook school shooting never happened and was a plot by the Obama administration) won’t show up immediately, or that you’ll see them less often.
What’s obviously dangerous here is that these choices are made entirely at Facebook’s discretion. Instead of maintaining a clear, straightforward system in which content either exists as normal or is removed, Facebook is controlling what you see, when you see it, and how you see it. Control over how millions of people get their news on a daily basis is now in the hands of an opaque, bespoke system run by a handful of people at Facebook.
But even setting aside that lack of transparency, the basic acceptance that Holocaust denial should have any place on Facebook, however less visible, is particularly worrying, because few, if any, forms of misinformation are so clearly dangerous. Holocaust denial is, and has been for decades, disseminated by anti-Semites; it has led to physical harm against Jewish people, and could do so again.
LBC radio presenter James O’Brien spoke to the New Statesman in February 2017 about the false equivalencies broadcasters pander to in the name of balance. “I worry that the next thing we’re going to look at through the lens of false equivalence might be Holocaust denial,” he said. “If that moves front and centre, then I think we’re all fucked.”
It seems that O’Brien’s hypothesis is being proved correct. For now, it seems, anything goes on Facebook, and the implications of that are unlikely to be pretty.