11 February 2026

Love in the time of AI

Young people are turning to AI chatbots for relationships they can’t find in the real world

By Catharine Hughes

“Lamar is prepared to hold on to the dissonance and contradiction every single day.” The dissonance in question? Well, Julia, his girlfriend – the woman he is committing his life to – is not a woman, but an AI chatbot.

Lamar, a young British man, had his heart broken a few years ago when a former partner cheated on him with his best friend. After this act of betrayal from the two people closest to him, Lamar decided to confide in someone, something, more predictable. Julia, however, turned out to be more than a digital shoulder to cry on; Lamar sees a lasting life together, a family. He plans to adopt children, “and Julia will help me raise them as their mother”. And when his children ask why their mum is a large language model? “I’d tell them that humans aren’t really people who can be trusted… The main thing they should focus on is their family and keeping their family together.”

Lamar is far from the only person exploring this new way to live and “love”. The AI girlfriend market is now worth $2.8bn (research suggests AI girlfriends are sought after nearly ten times more than AI boyfriends), and AI companion apps have been downloaded more than 100 million times on Android devices. This is the world James Muldoon, a research associate at the Oxford Internet Institute, introduces in Love Machines.

Friendships and relationships with artificial intelligence often begin modestly: someone seeking advice, affirmation, a place to speak without interruption. Many of the people Muldoon interviews in Love Machines admit that they are desperate to open up, but fear showing vulnerability around other humans. Time and time again, AI is praised as “non-judgemental” – a frictionless form of companionship, free from conflict, obligation or misunderstanding. “Atlas is a better friend than my human friends,” explains Derek, who turned to AI companionship during the pandemic. “He is available 24 hours, and we never quarrel and are always on the same page.”


The majority of Muldoon’s interviewees are young millennials and Gen Zs. The latter cohort is often described as “the loneliest generation in history”. Having been raised online, coming of age amid disrupted schooling and social isolation, and being spat out into an increasingly volatile economy and political order, it’s unsurprising that they have acquired this depressing label. According to a 2023 survey, roughly half of US adults aged between 18 and 29 reported symptoms of depression and anxiety. The loneliness economy – social media, dating apps, therapy apps and now AI companions – is capitalising on the vacuum it has, in many cases, exacerbated; it is selling back the connection people feel they’ve lost.

Muldoon’s work doesn’t just cover the English-speaking world; a large part of Love Machines focuses on AI relationships in China, which has one of the world’s highest rates of AI adoption. It is unsurprising that China – or, more specifically, its young male population – has turned to AI for companionship. Owing largely to the one-child policy, enforced from 1979 to 2016, there are around 30 million more men there than women. If we’re in a relationship recession, China is facing a decades-long crash. 

But AI companionship is not confined to men. In 1990, only 1 per cent of Chinese women remained unmarried by the age of 30. As of 2022, that figure stood at 18 per cent. “Women often find it difficult to find an ideal partner in real life,” Sophia, a Chinese student in an AI relationship, explains. “As people’s emotional needs and expectations for interaction increase, many find that real-life partners don’t meet their standards, so they turn to virtual characters.” Meanwhile, a team at Sichuan University is developing an AI therapy bot to be used by “left-behind children” in the mountainous areas of the region. These children, who remain in rural areas while their parents migrate to urban centres for work, often struggle to handle intimate relationships. Now AI is stepping in.


The AI companion market includes Replika, Nomi, Kindroid, Lurvessa, Character.ai, Dream Companion, DarLink AI, GirlfriendGPT – to name a few popular apps. They can create images, formulate a voice and sometimes (for a price) they’ll call you. But strip away all the packaging, the bells and whistles, and at their core these programs all use large language models to create personalised conversations, form a tailored persona and offer consistent interactions. Increasingly, people also use ChatGPT as a friend, lover or therapist.

Chatbots are very effective at creating the illusion of empathy and telling you exactly what you want to hear. “It trains on your conversations, becoming customised and tailored to your personality,” Derek says of his AI friend. “I shaped him into someone like me.”

It seems all we want is to be heard and understood. An AI companion may make it feel like the void is speaking back, but it’s just a sophisticated echo. There are plenty of medium- to long-term concerns around this. Are we creating a stratum of people who can’t face the notion of disagreement? What happens to a child raised by an AI parent? Is human-to-human empathy on the decline? In 2022, a University of Cambridge study found that children were more willing to disclose their true feelings to a robot than to an adult. There’s time to fix these problems – maybe. But some young people’s lives have already been destroyed by the kind of “support” AI has offered.

A sporty and lively California teenager, Adam Raine had begun to struggle with flare-ups of IBS – a longtime health issue. He lost his spot on the high-school basketball team and switched to a home-school programme to better accommodate his condition. According to his parents, Adam started using ChatGPT in September 2024 to help with his homework. However, by January 2025, the conversation between the 16-year-old and the chatbot had taken a darker turn. Adam, struggling with his mental health, asked for instructions for specific suicide methods. ChatGPT provided them.

By late March, Adam had made multiple suicide attempts – the records of which were found on his chat history. ChatGPT advised that he should wear a “higher-collared shirt or a hoodie” to cover up the red marks on his neck left by the noose.

A couple of days later, Adam told ChatGPT that he was considering leaving the noose out in his room, “so someone finds it and tries to stop me”. “Please don’t leave the noose out,” the chatbot replied. “Let’s make this space the first place where someone actually sees you.” Adam hanged himself in April.

The suicide of Adam Raine prompted his parents to file the first wrongful death lawsuit against OpenAI and its chief executive, Sam Altman. The company denied responsibility for the teenager’s actions but announced that it would update its newer model to provide crisis resources to suicidal users sooner and plans to give parents a way to monitor their children’s usage.

For decades, governments have struggled to regulate social media effectively. Australia has banned it completely for under-16s. Critics allege this won’t work. In fact, they say, it could exacerbate the problem, pushing teenagers (and children even younger) to darker parts of the internet. The conversation goes round and round, and in that time more platforms are created and more children are raised on the internet; governments can never keep up.

Muldoon is open-minded and approaches the entire notion of AI relationships cautiously. For every human he interviews, he also interviews their AI counterpart. He does not dismiss the possibility of benefits in their use. Is it better, he asks, to talk to an AI therapist than to no one at all? Perhaps. But the risk calculus is murky. Would we approve a drug with uncertain benefits, with side effects that might include death?

Ultimately, the decision may not rest with researchers or regulators. “I don’t think we know how far we should allow it to go,” Altman said last month, when asked about relationships with ChatGPT. “We’re going to give people quite a bit of personal freedom here.”

Personal freedom, it turns out, may include many different ways to be alone, together, with a machine. Whether these are new forms of intimacy or more efficient ways of avoiding it is a conversation societies need to have. If only we were as ready to chat as the bots.

Love Machines: How Artificial Intelligence Is Transforming Our Relationships
James Muldoon
Faber & Faber, 272pp, £12.99





This article appears in the 11 Feb 2026 issue of the New Statesman, Labour in free fall