
Comment
28 August 2024

How Telegram fosters online extremism

The under-regulated platform lacks transparency. The Southport riots remind us why this matters.

By Jacob Davey and Julia Ebner

“The fight will be long brothers, but we will succeed. Never capitulate,” read a message shared on the encrypted messaging app Telegram in the “Southport Wake Up” channel following a tragic attack on a Taylor Swift-themed dance class that left three young girls dead. In the days that followed, riots targeting Muslims and asylum seekers flared up across the UK.

This was the same channel that shared manuals for carrying out arson attacks and lists featuring dozens of names and addresses of immigration lawyers. In Dublin last November, after a knife attack on schoolchildren in Parnell Square, far-right agitators wasted little time in issuing a call to arms on Telegram. “Everyone bally [balaclava] up, tool up,” a man can be heard saying in a voice note. “Let’s show the f***ing media that we’re not a pushover. That no more foreigners are allowed into this poxy country.”

We have long known that the online information ecosystem – X, TikTok, Facebook, et al – harbours these kinds of overtures. But closed and encrypted messaging apps such as Telegram and Signal are particularly potent: both as communication hubs and as safe havens for extremism. These campaigns tend to take a dual-pronged approach: coordination via hidden channels alongside public rallying calls on outward-facing platforms. With the recent reactivation of X accounts run by far-right agitators – such as Tommy Robinson – these posts can reach millions of views, significantly scaling up their capacity to influence activity on the ground.

The riots united a previously splintered British far right: sympathisers of the white nationalist group Patriotic Alternative, football hooligans, anti-Muslim activists following Robinson, acolytes of Andrew Tate, and members of conspiracy-theorist networks were joined by people who thought of themselves as concerned citizens, and those with no apparent ideological motive (compelled, perhaps, by the chance to riot).

Various movements opportunistically put aside differences in ideology or tactics to coordinate based on their shared desire to see migrants, Muslims and other minority groups kept out of British life.

These dynamics are not new, nor unique to the UK. They resemble what happened in the eastern German city of Chemnitz in 2018, when thousands of far-right activists took to the streets chanting “foreigners out” following the circulation of disinformation via Telegram channels about the murder of a 35-year-old man. They are also reminiscent of the run-up to the US Capitol riot, in which extremists, conspiracy theorists, radical Donald Trump supporters and other individuals connected to loose online networks on Telegram, Twitter, Facebook, and more fringe platforms such as Gab, 8chan and TheDonald.win shared election-fraud narratives and organised themselves.

The infrastructure of closed-chat applications accelerates dangerous group dynamics. Trigger points – events considered transformative or emotional, such as a tragic attack or an election defeat – help foment something called “identity fusion”. Members of private channels merge their personal identities with a larger group identity. They start to view other members as kin, often referring to each other as “brothers” or “sisters” (evidenced perfectly by the Southport message: “The fight will be long brothers”).


Social media platforms lower the barrier to entry to extremist movements for the ordinary citizen, while their infrastructure accelerates the radicalisation process. All of this is abetted by the platforms’ business model – following Southport, for example, X’s “Trending in the UK” feature promoted a false name attributed to the attacker, as well as content from the actor-turned-influencer Laurence Fox suggesting the need to “permanently remove Islam from Great Britain”.

Conversations around platform accountability aren’t new – earlier this year the CEOs of TikTok, X and Meta were hauled into a US Senate hearing. But in the wake of Southport those conversations are being vigorously revisited. There are emerging regulatory mechanisms to hold platforms to account in the UK, too. The Online Safety Act, passed in October 2023 after half a decade of drafting, requires platforms to take proactive action to protect users from illegal and harmful content such as hate speech or incitement to violence. Platforms will also have to assess and mitigate other risks, including how their content-recommendation systems can amplify illegal content.

However, much of the viral disinformation spread in the aftermath of the Southport attacks would not meet the threshold for illegality.

Regulating the wild landscape at the extreme ends of the internet poses further problems for the authorities. Platforms are making it increasingly difficult to access their data – since Elon Musk’s takeover of X, for example, it has become prohibitively expensive for researchers to acquire necessary information, such as data on what fuelled the disorder after Southport. Meanwhile, Meta – the parent company of Facebook – has reduced the functionality and accessibility of the tools intended to give the public insight into behaviour on its platforms.

Erecting these access barriers makes it impossible for third-party independent researchers to build a comprehensive picture of the online landscape, hold companies to account, and help enforce regulation. This allows extremism to spread even further than it already has. The UK’s act fails to address this, and is starkly at odds with comparable regulation such as the EU’s Digital Services Act, which mandates meaningful data access for third-party researchers.

Under-moderated platforms such as Telegram have a track record of incubating and exacerbating terrorism and extremist violence. At moments of crisis like this, regulation of businesses that profit from harm seems more necessary than ever; enforcing greater transparency is a good place to start.



This article appears in the 28 August 2024 issue of the New Statesman, “Trump in turmoil”.