The public deserves protection from social media giants’ greed

No single law will fix all of the internet's problems, but the Online Safety Bill will truly hold Big Tech to account, writes the Tech Minister.

By Chris Philp

Social media websites and search engines have come to be dominant forces in how we communicate. But the tech companies running these services currently operate without any rules to follow beyond their own. In developing our new online safety laws, which will soon begin their passage through parliament, we’ve heard from countless young victims of a digital world that often prioritises profit over people’s safety. Tech companies have had ample chances to regulate themselves and have failed to do so. It is time to use the force of the law to make them behave responsibly.

The substantially updated Online Safety Bill we will shortly present to parliament is carefully designed and flexible, so it can deal with the rapid pace of technological change. It balances the need for a free and open internet with protections for children and vulnerable people.

The UK’s tech industries are blazing a trail in investment and innovation, so we have also designed a pro-competition, proportionate system to encourage the finest technological minds from around the world to build successful tech businesses in the UK. We will place duties on online platforms that host user-generated content, and the most powerful companies will be subject to the strictest rules: protecting young people, preventing illegality and making sure harmful content – such as material promoting suicide – is properly handled.


The way to enforce these laws is to arm the regulator – in this case Ofcom – with tough new powers. It will have unprecedented sanctions at its disposal, including the ability to hand out multi-billion-pound fines if Big Tech fails to act, and the power to hold senior managers personally liable for certain breaches. Ofcom will also be able to lift the bonnet and inspect the engines of these companies, including how their algorithms work – something they have so far kept hidden.

There is nothing unreasonable in requiring major social media platforms to remove, and limit the spread of, content or activity that is against our laws. They must tackle child sexual abuse, hate crime and terrorist material, and do far more to protect children from being exposed to inappropriate material. All sites that host or display pornography will have to put effective measures in place to prevent children from accessing it – for example, by using age verification technology.

Importantly, the bill does not censor free speech. In fact, it will bolster public discourse online, which is currently subject to the whims of powerful private companies that can remove content arbitrarily and wield significant influence over what people see.

Robust new legal duties in the bill will mean in-scope companies have to take steps to protect people’s free speech, and the regulation will not require companies to remove specific pieces of legal content. Ofcom will have duties to safeguard freedom of expression while carrying out its work.

We believe people should be able to speak as freely on the internet as at Speakers’ Corner in Hyde Park. But there is a growing list of toxic content and behaviour on social media that falls below the threshold of a criminal offence but still causes significant harm.


This includes racist abuse, the promotion of self-harm, and dangerous disinformation designed to deter people from taking the Covid vaccine. Such material is already expressly forbidden on the biggest and most popular social networks. But time and again it is allowed to stay up or, worse, is actively promoted to reach huge audiences. The firms should not be allowed to host or amplify harmful content they claim not to permit without facing consequences.

Thanks to our work, companies will have to proactively assess the risks to their users from legal but harmful content. They will need to set out clearly in their terms and conditions which forms of legal but harmful content are acceptable for adults to see on their sites, as well as the systems and processes they have in place to deal with such content if they decide to prohibit it. If content breaks their terms and conditions, they will have to take it down. Ofcom will judge whether their decisions on removing content are adequate, and users will be able to appeal if they feel their content has been taken down without good reason.

Therefore, the bill does not force online platforms to remove legal free speech. Freedom of expression means nothing if not also the right to offend, and under our new laws adults will be free to read and post things online others may disapprove of. But the overall effect of these measures will be to close the gap between what these companies say is allowed on their sites and what happens in practice.

No single piece of legislation is going to fix all the problems the evolution of the internet has thrown up. But we want to make digital companies fix their systems and be more transparent, without stifling the hotbed of innovation and investment that is the UK’s tech industry. The Online Safety Bill has been designed with suitable and transparent checks and balances, so that Ofcom’s implementation of it delivers on the objectives parliament has set and remains subject to scrutiny by our democratically elected parliament.

Of course, the government doesn’t have a monopoly on wisdom, which is why we gave the draft bill to a cross-party group of parliamentarians for scrutiny. In December, this joint committee submitted its report, which included recommendations on how the bill could go further. This scrutiny is a vital part of ensuring it delivers what is needed.


I am confident the bill strikes a careful balance and sets a global gold standard for internet safety. But we are considering the joint committee’s recommendations, those of the Commons select committee and the thoughts of other leading parliamentarians as we substantially update the bill prior to introduction. We have already toughened it with new criminal offences and extra measures to force social media companies to stamp out the most harmful illegal content and criminal activity on their sites more quickly, including revenge porn, hate crime, fraud and the sale of illegal drugs or weapons.

The public rightly demands and deserves protection from the harms created by the unfettered avarice of some large social media firms. The government, working with MPs across parliament and other parties, will deliver.
