Reviewing politics
and culture since 1913

31 July 2025

We must fight the deepfake future

Penny Mordaunt’s suffering should serve as a national call to arms.

By Zoë Huxford

Penny Mordaunt broadsworded her way into Britain’s collective imagination when she became the unexpected breakout star of King Charles III’s May 2023 coronation. We had lost one stoic queen; here was another. Mordaunt bore the Sword of State, the heaviest in the royal collection, for 50 minutes. With the world watching, Mordaunt kept her face composed: the image of ceremonial gravitas, strength, tradition and honour.

But imagine that face smeared across violent pornography. Speaking to BBC Newsnight recently, the former Conservative MP and cabinet minister revealed that she had been a victim of deepfake pornography while serving in parliament. Her face, along with those of other female MPs including Priti Patel and Angela Rayner, had been digitally placed onto explicit videos. “It was deliberately humiliating and violent,” she said.

Deepfakes are the latest grotesque frontier in the battle for digital dignity, where artificial intelligence is weaponised to humiliate, disempower and violate women’s bodies. And the harm inflicted is not virtual – it can be as real as any other form of sexual violence. A 2013 headline in the Telegraph asserted otherwise: “No harm in simulated rape videos (as long as they are well made), say ministers”. Though this predates the inception of deepfakes by a few years, it is grim that, even today, some still think this basic principle of female autonomy is up for debate. Digital violence is violence, as Mordaunt understands. “The people behind this,” she said, “don’t realise the consequences in the real world when they do something like that.”

Since the first deepfake was created in 2017, AI-generated, sexually explicit videos have proliferated across the internet. One study estimated that half a million deepfakes were shared in 2023; this year’s total is expected to reach eight million. Of all deepfakes, 98 per cent are sexually explicit, and 99 per cent of those depict women. This technology is both misogynistic and, as it stands, unregulated. Worse, it is now so sophisticated that viewers no longer realise they are consuming fakes. We stand on the precipice, looking at potentially an entire generation of young men whose sexual understanding of consent is being warped by digital hallucinations.


Keir Starmer’s government has shown some willingness to take on issues related to deepfakes. Amendments to the Online Safety Act, which require pornography websites to implement age-verification measures, came into force on 25 July. The aim is to prevent children from accessing explicit material, and thereby protect them. But we might note the unnecessary delays before the legislation was introduced. (While it is illegal to distribute deepfakes, it is legal to create one. Rishi Sunak pledged to legislate against the production of deepfakes in April 2024, though said legislation never materialised; Keir Starmer pledged the same in January 2025, yet production remains legal.) We may also note that much pornography lives outside traditional porn sites, instead circulating in the murky backwaters of Telegram groups, Reddit threads and 4chan.


Whatever the measures, we need more of them. AI-driven deepfake porn is a disturbing new theatre of abuse advancing, like all AI developments, at an alarming pace. But technology is made by humans. The scaffolding of our digital lives is designed, curated and upheld by other people. The sword Mordaunt held at King Charles III’s coronation was historic and symbolic. Today, her sword is rhetorical: a call to action against the degradation of female autonomy, identity and safety in a world that increasingly treats women’s faces and bodies as public property.

Mordaunt has exposed a frightening fault line in British society. Children are given unfettered access to pornography. Women are transformed into digitally altered chimeras without consent and without recourse. Allowing this to continue is not just a regulatory failure but a cultural one. Technological change is relentless; violence against women is perennial. The internet is hard to contain and full of malicious actors. But we must summon the will to protect basic privacies and dignities. However heavy, we should pick up and carry that sword.

[See also: Schools need more sex education, not less]

