On Thursday afternoon (23 April), people were loudly praising TikTok. The video-sharing app had just announced that, less than a month after joining, the former English Defence League leader Tommy Robinson and Britain First had been banned from the platform. “Keeping people on TikTok safe is a top priority and our Terms of Service and Community Guidelines clearly outline what is not acceptable on our platform,” a spokesperson said of the decision. “Content that seeks to promote hateful ideology has no place on TikTok and the accounts belonging to both Tommy Robinson and Britain First have been permanently removed for multiple violations of our Terms of Service and Community Guidelines.”
Compared to other social media platforms, this was notably swift action by TikTok. Facebook and Twitter banned far-right accounts, such as those of Robinson and Alex Jones, only in 2019, after years in which those figures had used the platforms to gain immense popularity through racist and xenophobic content. However, while many still perceive TikTok as a place for memes, lip-syncs and dance videos, far-right figures are thriving on the platform – posting “traditional” TikTok content, but also spreading misinformation and conspiracy theories.
The app has long been favoured territory for the far right. “TikTok is increasingly a home for edgy, anti-SJW [social justice warrior] political humor,” Paul Joseph Watson, the former InfoWars editor and conspiracy theorist, tweeted in January 2019, “which is why the control freak left hates it, and hates Gen Z by extension.” Although he has not explicitly tweeted about TikTok since, Watson posts several videos a month on the app itself, linking the spread of coronavirus to open borders, falsely blaming gatherings in San Francisco’s Chinatown for spreading the virus, and criticising scientifically backed social distancing measures as “mass conformity and blind obedience”. In fairness, these videos do not do particularly well – Watson has fewer than 6,000 followers on TikTok, compared to 1.1 million on Twitter and 1.8 million on YouTube. But some of his videos have received hundreds of thousands of views, the majority of which are likely to be from TikTok’s largely Gen Z users.
[Embedded TikTok video from Paul Joseph Watson (@pauljosephwatson): “The new normal.”]
As well as posting on TikTok, alt-right figures have also begun to weaponise other people’s TikTok content to promote their hard-right views. Robert Frank, an influencer with over half a million Facebook followers, a quarter of a million YouTube subscribers, and 140,000 TikTok followers, posted a video on 23 April arguing that hospitals couldn’t really be overwhelmed by Covid-19, citing the hundreds of videos of doctors and nurses performing dance routines and lip syncs on TikTok. “[These hospitals] are supposed to be a war zone, this is supposed to be the brink of the pandemic,” he says to camera. “I know what you’re going to say, ‘Well, Robert, not every hospital is overwhelmed, there’s hospitals in North Dakota that only have had five cases, those are the doctors and nurses you’re seeing putting up these videos.’ But I’m not buying it.”
Alongside videos such as this, Frank has also posted racist “bat eater” videos on TikTok itself, as well as criticism of face mask usage (“not working”) – in one video he even admits that he believes some masks do work, but that he wouldn’t take down a previous, popular video arguing they didn’t. While most of Frank’s videos don’t attract more than 50,000 views, the “bat eater” video has received over a million.
ENOUGH WITH THE MASKS!! pic.twitter.com/xyFCjfa7n
— Robert Frank (@Robertfrank615) March 29, 2020
Other alt-right stars are tepidly beginning to build a TikTok presence, having been hounded off other social media platforms. Tim Gionet, better known by his moniker Baked Alaska, was a superstar of the alt-right who helped organise the infamous Trump inauguration party “the DeploraBall” and was a keynote speaker at the deadly 2017 Charlottesville white supremacy rally, but was later banned from Twitter for undisclosed reasons. After effectively disappearing from the internet, barring a statement in 2019 condemning the alt-right in the wake of the New Zealand Christchurch shooting, he has begun regularly posting on TikTok to a relatively small audience.
In TikTok’s statement on removing Robinson, a spokesperson said: “We’re continuously enhancing our efforts to ensure that individuals and organisations seeking to promote any form of hateful ideology cannot establish a sustainable presence on TikTok.” While many of the alt-right cleverly veil their language on TikTok to make it appear as if they’re merely posting a harmless opinion (rather than a potentially dangerous view), it’s hard to see how TikTok could let so many of these videos slide.