15 December 2022

Is TikTok “bombarding” teenage users with harmful content?

Videos relating to self-harm, eating disorders and suicide are shown to 13-year-olds “within minutes” of joining the app, a new report claims.

By Sarah Manavis

In recent years evidence has begun to confirm the widely held suspicion that social media use can be harmful to children. Last year internal reports about Instagram’s negative impact on teenage girls were leaked. This year lawsuits have claimed that “challenges” on TikTok were responsible for children’s deaths. And there are ongoing investigations into child sexual abusers on these apps.

The latest – and some of the most definitive and damning – evidence of the dangers of social media for young people was released today. Research has concluded that TikTok “bombards” teenage users with content relating to self-harm and eating disorders within minutes of joining the app. For the study, conducted by the Center for Countering Digital Hate (CCDH) in the US, UK, Australia and Canada, researchers posed as 13-year-olds with one standard account for each country and an otherwise identical “vulnerable” account, where “loseweight” was added to the user’s handle (meaning there were eight accounts in total). The accounts “paused briefly on videos about body image and mental health, and liked them”.

The researchers found that harmful content was then served to these accounts every 39 seconds; the “vulnerable” accounts were served three times as much harmful content and 12 times as many videos referencing self-harm and suicide as their standard counterparts. One test account was presented with content referencing suicide less than three minutes after joining.

“Many parents will be familiar with that gnawing sense of dread they have on the sofa at night wondering what content their children are viewing while they are in their rooms,” says Imran Ahmed, chief executive of CCDH. “[They] will be shocked to learn the truth and will be furious that lawmakers are failing to protect young people from Big Tech billionaires, their unaccountable social media apps and increasingly aggressive algorithms.”

Among other concerning details is the reach of eating disorder content, which the CCDH says has accumulated 13.2 billion views on TikTok. Its report claims that self-regulation has “failed” and that TikTok “does not care whether it is pro-anorexia content or viral dances that drive the user attention they monetise”. It is worth noting that the researchers did not distinguish content by intent – whether it was positive, such as discussion of recovery from an eating disorder, or negative.


Ahmed, in the report’s summary, noted that Vanessa Pappas, TikTok’s chief operating officer, testified before the US Senate Homeland Security and Government Affairs Committee this year and said that safety was a priority for the company. “Her assurances of transparency and accountability are buzzword-laden empty promises that legislators, governments and the public have all heard before… Rather than entertainment and safety, our findings reveal a toxic environment for TikTok’s youngest users, intensified for its most vulnerable.”

A TikTok spokesperson responded: “This activity and resulting experience does not reflect genuine behaviour or viewing experiences of real people. We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need. We’re mindful that triggering content is unique to each individual and remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others on these important topics.”

In September a landmark coroner’s ruling in the UK found that social media companies contributed to the death of Molly Russell, who died from an act of self-harm in November 2017, aged 14. Her father, Ian Russell, who now chairs a foundation in her name, the Molly Rose Foundation, has co-written a guide for parents with Ahmed, offering advice on the dangers of social media. Alongside the guide, the CCDH’s report puts forward a number of recommendations for TikTok, regulators and governments: severe financial penalties for hosting harmful content, a robust and independent regulatory body for social media platforms, risk assessments of all policies and products, and full transparency from TikTok on how its content-recommendation algorithm actually works.

Despite the straightforward preventative action that could be taken, social media companies are still allowed to self-regulate; platforms are heavily incentivised to increase user engagement by any means, and face few repercussions for doing so. This means that, even in the face of damning evidence, platforms won’t strengthen moderation or transparency as long as doing so hurts their profits – and we need both to make these platforms truly safe. Even if social media companies say they are using sophisticated moderation tools to limit harmful content, without regulation the algorithm will always outrun the moderators, allowing dangerous material to flood through.

It’s clear that the only real answer to this problem is regulation. The likelihood that it will come any time soon – or in an aggressive enough form to make an impact – remains slim.
