Sexual violence is pervasive online. It manifests in many ways, including revenge pornography, cyber flashing and upskirting, with victims predominantly being women and girls. Research from University College London in 2020 found that 76 per cent of the girls surveyed aged 12 to 18 had been sent unsolicited nudes or “dick pics”, while 70 per cent had been asked to send nudes themselves.
Last week, the women and equalities committee questioned ministers and policy officials about how the Online Safety Bill will help to protect women and girls from image-based sexual violence. The session will feed into the committee’s wider inquiry into violence against women and girls.
What is the Online Safety Bill?
The Online Safety Bill is long-awaited legislation that aims to make the internet safer, especially for children and other vulnerable groups. It will tackle everything from illegal content, such as revenge porn and terrorism, to “legal but harmful” content, such as misinformation and misogyny. The bill is currently at committee stage in parliament, meaning it is being scrutinised line by line and proposed amendments are debated. It is expected to become law in 2022.
How will it improve women’s safety online?
“Legal but harmful” is an ambiguous phrase that has proved contentious with both safety campaigners and freedom of speech advocates. A recent revision of the bill attempted to bring some of the “legal” abuse that women face within the scope of the criminal law.
Based on recommendations from the Law Commission, a new criminal offence will be introduced for cyber flashing – anyone who sends a sexually indecent photo or film for the purpose of their own sexual gratification or to cause the victim “humiliation, alarm or distress” could face up to two years in prison. Other new offences include threats to rape, kill or inflict violence (punishable by up to five years in jail), sending abusive messages, and “pile-on” harassment, where many people target an individual.
The Law Commission’s Penney Lewis, law commissioner for criminal law, has previously said that the recommendations simultaneously protect victims and free speech: “The criminal law should target those who specifically intend to cause harm, while allowing people to share contested and controversial ideas in good faith.”
Do the new laws go far enough?
Not according to the women and equalities committee. Jackie Doyle-Price, the MP for Thurrock, said that “surely consent, not intent” should be the criterion for prosecuting cyber flashers, while the committee chair, Caroline Nokes MP, argued that a “failure to include consent disregarded the impact on the victim”.
Chris Philp, minister for tech and the digital economy at the Department for Digital, Culture, Media and Sport, said the government had followed the advice of legal experts and that the new legislation mirrors laws around real-life indecent exposure. Rachel Maclean, safeguarding minister at the Home Office, added that “over-criminalising [activities and people] is a genuine concern”. Examples include the sending of images between two consenting adults, someone sending a photo of themselves on a beach with a nude sunbather in the background, or children under 16 who might lack the maturity to make sound judgements.
What else was discussed in the session?
The session also covered underage access to pornography, revenge porn, and how the regulator, Ofcom, will spot and remove illegal content effectively.
Under the revised Online Safety Bill, porn websites will be required to have robust age checks in place to ensure users are at least 18 years old. This could be age verification, which requires users to upload proof of age such as a driving licence, or alternative measures such as facial recognition or pooling data from other websites. It will be up to websites to decide which system they use, but they will need to prove to Ofcom that it works and complies with its minimum standards.
Revenge porn has been criminalised since 2015. Philp said that the current mechanisms for flagging it are “patchy at best”, and that the Online Safety Bill will speed this up through tougher regulation. Websites will have to respond in a “reasonable and timely manner” to user requests to remove images – “within hours”, he suggested, with anything longer being “abhorrent” – and they will need to have a simple complaints process that is prominent on their website and easy to find.
The internet is vast and never-ending – how will this be policed?
A list of “priority illegal content” has been established in the revised Online Safety Bill, which includes revenge porn, rape porn and child sexual abuse. This places a duty on websites to be proactive and prevent people from being exposed to these types of content in the first place, rather than simply taking it down once it has been reported. Otherwise, tech giants could face fines of up to 10 per cent of global turnover or have their websites taken down. A named individual at the company could also go to jail if they refuse to comply with Ofcom’s requests.
Users will also be able to report illegal content they see to Ofcom, and a new “super complaints” process will mean that recognised bodies (such as child protection charity the NSPCC) can escalate an issue to the regulator, and Ofcom will be legally obliged to investigate.
Questions were raised around whether Ofcom would have the resources to proactively hunt content. Philp said that the regulator should combine regular “horizon-scanning” of the highest risk and biggest websites with responding to user complaints and super complaints.
Where can I go for support or to learn more?
If you are a victim of revenge porn, you can contact the Revenge Porn Helpline, a service co-developed by the Home Office in 2015.
Read the summary of the Law Commission’s recommendations to the bill.