In 2021, the Internet Watch Foundation (IWF) reported the worst year on record for child sexual abuse material online. The charity found more than 250,000 URLs containing images or videos, an increase of nearly 100,000 on the year before.
What’s more, there is a clear gender divide in terms of victims. IWF’s recent research into self-generated child sexual abuse material – where people under 18 are pressured or coerced into taking nude or sexual images or videos of themselves and sharing them – shows that girls make up the majority of those targeted.
The research found 77,621 instances of self-generated abuse in the first half of 2022, of which nearly 97 per cent included images or videos of girls, and a quarter were of children aged seven to ten. Girls aged 13 and under made up 95 per cent of all the material found, with the most common cohort being ages 11 to 13.
Laura Nott, schools project manager at the child sexual abuse charity the Lucy Faithfull Foundation, says that all children who experience abuse online should be protected, and that we “should not overlook the victimisation of boys” too.
However, girls are more often coerced. This is down to various social and cultural factors, including boys gaining “social capital” through sharing images of girls, often through social media groups and forums, and girls feeling pressured to take images to appease boys or secure a romantic relationship.
Jessica Ringrose, a professor of sociology of gender and education at University College London (UCL), has conducted research into image-based sexual abuse. Her study found that girls felt under more pressure to send nudes than boys, and that this was often initiated by boys or men sending unsolicited nudes themselves, which Ringrose refers to as “transactional dick pics”.
She tells Spotlight that this is an indication of the “sexual double standards” that exist in society, where boys are generally “rewarded” for this behaviour, while girls tend to be “victim-blamed” and “shamed”.
“If you look around in popular culture, there’s much more imagery of women’s nudity,” she says. “Women’s bodies are valued differently to men so it’s not surprising that these norms replicate themselves in young people’s digital sexual cultures.”
Social media facilitates the issue, she says, and should be better regulated. Young people are encouraged to have open profiles on certain platforms to gain more followers, which leads to more strangers contacting them and sending them unsolicited requests for nudes. “I think that the platforms themselves need to do better to protect young people,” she says. “They need to try harder to help young people with their privacy issues.”
The Online Safety Bill, which has been delayed but is expected to be introduced this year, will look to tackle issues such as child sexual abuse by placing new regulations on social media platforms, with the threat of large fines, website shutdowns and criminal prosecutions if they do not comply.
However, campaigners have said that violence against women and girls (VAWG) is falling through the cracks of the bill. A new law to tackle cyber flashing – sending unsolicited “dick pics” – places the emphasis of the crime on intent, rather than consent, meaning victims must prove that the image was sent with the intention of causing harm, distress or humiliation. This mirrors existing revenge porn laws, under which the crime is punishable by up to two years in prison.
Ringrose is one of many who believe that if misogyny were made a hate crime it would help tackle the problem. The government has confirmed, however, that it will not do this after legal experts found it could be “more harmful than helpful”.
Nott adds that keeping children safe online should be a priority of the new bill, and that there should be stricter regulations on pornography websites to tackle underage access and issues such as revenge porn.
“Age verification alone is not enough, and more statutory regulation is needed so that breaches can be properly enforced and consistently protect girls from the commercial exploitation of their images,” she says.
When it comes to self-generated imagery, education is the most crucial thing that can help children stay safer online, says Ringrose. Better digital literacy in school and open conversations at home can act as preventative measures, helping young people understand issues such as consent, legality and the speed at which images can be shared.
“I think some of these behaviours around masculinity and entitlement need to really come under the microscope,” she says. “There are a lot of problems around young people’s lack of understanding of consent in online environments. Having conversations about what is legal and illegal can go a long way in raising their awareness.”
Parents and carers who are concerned can contact the anonymous Stop It Now! helpline for support around child sexual abuse, and young people can report sexual images they see online via Childline’s “Report Remove” service, which can help to get them taken down.