Since the Cambridge Analytica scandal in 2018, Big Tech companies have faced increasing pressure from governments to take action on fake news, disinformation and hate speech.
The UK government is the latest to bring forward legislation seeking to curb the spread of harmful online content. Reintroduced to parliament in March 2022, the Online Safety Bill, according to the government, “marks a milestone in the fight for a new digital age which is safer for users and holds tech giants to account”. The bill proposes that platforms should be liable for the content users post on their sites – with newly established powers for the regulator Ofcom.
To analyse the effects of the bill, the New Statesman convened leaders in politics, industry and civil society at the first of a new series of events – Spotlight Debates – hosted in central London on 30 March. Speaking in favour of the bill were Damian Collins MP, chair of the parliamentary joint committee on the draft online safety bill, and Alex Towers, director of policy and public affairs for BT Group, the sponsor of the debate. Speaking against the bill were Ruth Smeeth, chief executive of Index on Censorship, and Victoria Hewson, head of regulatory affairs at the Institute of Economic Affairs (IEA).
In his opening remarks, Collins highlighted how the Online Safety Bill would hold Big Tech companies to account for the content on their platforms. The bill is “really clear that the companies have liabilities for the way they use platforms, particularly when they’re making these decisions to make money”, he said.
The “engagement-based business models” used by Big Tech, which continuously feed users with content to keep them “on the platform for as long as possible, and get them to return as often as possible”, can have pernicious effects, Collins argued.
This incorporation of algorithms into platform design, where users have content specifically recommended to them, can harm the most vulnerable users. “If you’re a vulnerable teenage girl – maybe you’ve had friends that have self-harmed; maybe you’ve engaged with content around self-harm yourself – you’re more likely to see that content,” Collins warned. “The more vulnerable you are, the more likely you are to be shown content that will make your vulnerabilities worse.”
Responding to Collins in her opening statement, Smeeth argued that the Online Safety Bill has changed tack since it was reintroduced to parliament, with its “new objective [being] to try and make the internet nicer, not safer”. “It started with the best of intentions,” she said, “but we’re now seven years into this debate, and it’s a mammoth piece of legislation that has moved so far away from its original intention to protect children”.
Now, she added, “we have a range of unintended consequences” as a result of the perceived change in objectives.
Key among Smeeth’s concerns is the bill’s “legal but harmful” provision, which requires platforms to moderate content that is legal but could potentially be harmful to people – especially children. “There is a new definition of speech in this legislation that will introduce speech codes,” Smeeth said, which will mean that conversations on issues “we could discuss in the pub… would not be allowed to be on my Facebook page.”
Tech companies may find it difficult to develop algorithms that understand nuance and context, which could lead to those speaking out about difficult personal experiences being silenced, Smeeth warned. The IEA’s Hewson later echoed Smeeth’s concern, noting people could find themselves censored for no good reason. “In effect, words that trigger algorithms… will simply be removed, like pre-crime, in the interests of a platform avoiding potential liability.”
Responding to Smeeth, BT’s Alex Towers acknowledged concerns over the regulation of speech online. “These judgments about … what is legal but harmful, I agree, is complicated – and I think it’s probably the place where most of the parliamentary debate is going to be focused.” Towers added that he would like to see “more, not less, detail” about what would be covered by the provision; “obvious categories”, including bullying, misogynistic conduct and disinformation, would be key to clamp down on, he said. The government recently announced that it would do more to tackle disinformation specifically: knowingly spreading false information to cause harm will be punishable by up to 51 weeks in jail.
Because it extends governance and regulation of the internet, the bill has drawn criticism from some civil society groups, which accuse the proposals of being a vehicle for state creep. “The Online Safety Bill… is stretching the concept of safety beyond all recognition, to justify a massive extension in the powers of the state,” Hewson said. “I’m not sure that any of this is going to make us any safer.”
Smeeth noted that the government decided against introducing “digital evidence lockers” in the current version of the bill, despite the measure being recommended by Collins and his committee.
Such a system would flag all the content and abuse that is deemed illegal and store it in a digital “locker” that victims, police and academics could access if needed. “I think it’s not unreasonable in a democratic society to see what people are deleting,” Smeeth said.
“I totally agree with the point about digital evidence lockers… I think it’s a really important point,” Towers said later in the debate.
On the bill more generally, Towers raised two key questions relating to BT Group: “is this going to protect our customers?” and “is this going to be able to adapt as technology evolves?” According to Towers, there is consumer demand for a more tightly regulated internet. “One of the reasons we’ve been so supportive throughout this process of the contents behind the bill is that the people who pay for our services have a high level of anxiety and unease about the online world.”
Towers noted that in focus groups last year, BT found that more than 80 per cent of people felt that online platforms, the government and regulators should be responsible for the content that is posted online.
“From that point of view, this bill seems to take the right kind of approach,” he said. “It places a lot of responsibility on businesses to set their own standards and live by them – but within a framework set out by the government and enforced by a regulator.”
The second reading of the bill will take place on the first day back from the Easter recess, on 19 April. “This feels like a piece of legislation we’ve been discussing in different forms for a very long time,” said Collins. “Now feels like a really great time to have this debate.”
[See also: The Online Safety Bill has created a free speech culture war]