
5 March 2018

After “abhorrent” posts on the Florida shooting, social media firms must be made liable

We’re letting the likes of Facebook get away with hosting hate and lies.

By John Mann

Mark Zuckerberg’s personal challenge for 2018 was to “fix” Facebook: specifically, to address fake news and hate speech. A week after the Florida school shooting, the company was again being put to the test. In what Facebook rightly branded “abhorrent” posts, claims were made that the shooting survivors were paid actors, or part of some sort of conspiracy. The New York Times reported on the efforts of Facebook, beset by trolls, to remove or limit the visibility of such posts and to pull a virtual reality shooting game from demonstrations of the company’s new products.

This effort to regulate some of the worst material on the platform is a welcome change in outlook, and one that has developed as the company has matured. Yet, at a parliamentary panel event we recently helped organise, a Facebook representative argued that whilst there is disgusting and horrible content on the platform, the company had to be dispassionate when setting rules for its global user base. In some cases, the audience was told, the company goes beyond the law; but when it came to Holocaust denial, a form of antisemitism, the fact that Britain (and a number of other countries) doesn’t explicitly outlaw this speech was held to justify its continued presence on the platform. Facebook’s line was that free speech is paramount and it shouldn’t be for private companies to restrict it.

There are other examples: according to Amnesty International researcher Laura Haigh, if you’re a Rohingya activist you might well find your page shut down, while anti-Rohingya government propaganda continues to flood the site. Herein lies the contradiction: Facebook does take action on nudity, copyright and now some forms of conspiracy, and it does so because it can. This does not make it the arbiter of global free speech, but rather the master of the content on its platform. The company is changing the format of news feeds this year, ensuring friends’ messages, rather than posts from businesses, media and brands, are prominent. Companies are curating content, and doing so secure in the knowledge that they won’t pay the legal price for mistakes.

Social media companies have for some 20 years enjoyed the commercial advantage of immunity from prosecution. America’s well-intentioned Communications Decency Act of 1996 conferred immunity on the operators of internet services by deeming them not to be publishers of, and therefore not legally liable for, the words of third parties who use their services.

This approach was adopted across the Atlantic by the EU and, as the Lords Communications Committee explained, when the E-Commerce Directive was transposed into UK law in 2002, it gave “immunity to websites from damages or criminal sanctions where they act merely as a conduit, cache or host, so long as they operate an expeditious ‘take down on notice’ service”. The Lords Committee took the view that “Parliament has thus accepted…that the liability of website operators should be limited in respect of content they host but which they have not originated”. That view was furthered by the passage of the Defamation Act 2013.

Recently, I tabled a bill in parliament recommending that social media companies be “liable for online publications in respect of civil proceedings in specified circumstances”. I also proposed rules on disclosure of information and an overseeing commissioner.

Mine is not the only effort in this area. A German law already requires social media companies to assign a complaints representative, responsible for taking down or blocking evidently criminal content within 24 hours of receiving an initial report (or within seven days in complicated cases). Companies in Germany can face fines of up to €50m for failing to act. Following the German lead, the European Commission, alongside the major social media companies (Facebook, Twitter, YouTube and Microsoft), announced and committed to a voluntary code of conduct in an effort to combat criminal hate speech online. In January, the European Commission released its third evaluation of the code since December 2016. The evaluation found that, despite a number of improvements, around 30 per cent of illegal hate material reported by NGOs and others remains online.


Back in Britain, the Home Affairs Select Committee recommended sanctions for companies failing to remove illegal content on request, and last year’s Digital Economy Act mandates the creation of a code of practice for online social media platform providers. Then, following further recommendations by the Committee on Standards in Public Life, the Lords debated publisher liability. Leading the debate, Baroness Kidron rightly pointed out that social media companies are commissioning, editing and curating content for broadcast or publishing. On that basis, she argued, they should be liable. My bill and hers, taken together, demonstrate that both Houses of Parliament are in agreement.

We should be bringing social media in line with our approach to other media. Media affects us all, significantly shaping our societal values, and so it is regulated. That is why, if Peppa Pig cartoons on television were spliced with horror and inappropriate content, action would be taken (far quicker than it was by YouTube) rather than the task being outsourced to children and their parents to report. Misinformation online could lead to an outbreak of measles, as we’ve seen with the fake news about inoculations spread through social media and elsewhere. Failing to remove hateful and illegal content inspires onward acts and discriminatory discourse, spreading racism, antisemitism and other forms of hate.

The lack of consistency extends beyond hate material. Why is it that Ofcom regulates radio more than podcasts? Why is it that swearing cannot occur on TV before 9pm, but plug in an Amazon Fire TV Stick and the same rules don’t apply? Why should the gambling and alcohol industries be compelled or relied upon to support industry-wide awareness initiatives, but social media be exempt? Why can Ofcom fine telecoms giant BT £70,000 for failing to provide information as part of a market review, while social media companies can frustrate select committees seeking data on fake news?

By introducing liability and having a better regulatory approach, we can democratise and civilise the way our online world works and better safeguard the future, preparing our children for the digital experiment we are all a part of.

John Mann MP is the Chair of the All-Party Parliamentary Group Against Antisemitism

