Spotlight on Policy
21 September 2021

Does how you talk to your AI assistant matter?

Verbal abuse towards female-gendered artificial intelligence platforms is prompting action

By Samir Jeraj

Back in 2019, Amanda Curry, a PhD student at Scotland’s Heriot-Watt University, was leading a team building a conversational AI. The project was part of a challenge issued by Amazon Alexa, with successful teams getting to test their systems on Amazon’s millions of US customers, a rare chance for academic researchers to work at that scale.

“One of my jobs as part of the challenges was to go through the transcripts of the conversations and to look for places where the conversation had gone wrong,” Curry explains. She was expecting to find bugs in the system or functions that were missing and needed technical fixes. What she found was more disturbing: people were abusing and sexually harassing the AI, even threatening to rape it.

About 5 per cent of interactions with Alexa could be classified as “abusive”, but the proportion is much higher for other AIs. Kuki (formerly Mitsuku), an AI with the persona of a young woman drawn in Japanese anime style, reportedly experiences abuse in around 30 per cent of its conversations with users. This could be related to racist and misogynistic stereotypes of young East Asian women embedded in Western culture. The big concern is that these behaviours, if normalised in interactions online and with AI, will spill over into the physical world. “We tend to apply the same social scripts that we have with humans… to things with any kind of human characteristics,” explains Curry.

Online abuse of women, which was endemic before the pandemic, escalated sharply in 2020. The End Violence Against Women Coalition found that almost half of women and non-binary people had experienced online abuse since March 2020, with a third saying the abuse had worsened in this period. Black women and non-binary people were targeted even more heavily, according to its research. The trend was also reflected in how people talked to their AI assistants, with Amazon’s Alexa experiencing an increase in harassment.

“It’s a feeling of control, power, dominance over something that has human features,” explains Liam Barnett, a relationship expert who has come across this issue in his work. When someone vents at an AI assistant, they know they can do so without it becoming defensive, Barnett explains, and, in fact, it is often submissive.


“I find it sad at times,” he says, “because a lot of times it is a reflection of [clients’] deep parts of themselves coming out through communicating to a ‘helpless’ and ‘polite’ machine.” Often people do not even notice the behaviour, or how it might reflect deeper issues, which Barnett finds concerning.

In 2019, Unesco published a report called I’d Blush If I Could, detailing its concerns about how AI assistants “reflect, reinforce and spread gender bias” and “model acceptance and tolerance of sexual harassment and verbal abuse”.

This is, it notes, in an industry dominated by men. Google’s new hires in 2020 were 66.3 per cent men and 33.7 per cent women, and bias in products tends to reflect the people who built them. Unesco’s advice included recruiting more diverse teams in the sector, ensuring that digital assistants were no longer “female by default”, and programming AIs to discourage gender-based insults and abuse.

Verena Rieser researches conversational AI (“people engaging in conversations with machines, such as voice assistants and robots”) at the National Robotarium hosted by Heriot-Watt, and was the faculty advisor for the Alexa challenge Amanda Curry worked on.

The severity and type of abuse directed at the bot were unexpected. Rieser’s team looked at why it was happening: while some abuse stemmed from frustration with the AI’s functioning, they also observed severe cases of gender-based abuse directed at the bot’s female persona. In some interactions, people jumped straight into abuse or switched to it suddenly from a normal conversation.

To tackle the problem, you need to identify abuse correctly, decide on an appropriate response, and work out how an AI assistant’s persona could be redesigned to prevent abuse or avoid reinforcing stereotypes. Each step is difficult. Abusive language can be culturally specific or coded: harassment may be context-dependent or suggestive rather than relying on specific words that are commonly understood to be offensive.
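
As a rough illustration only, here is a minimal Python sketch of that three-step pipeline. Everything in it, the keyword list, the abuse categories and the canned responses, is a hypothetical placeholder; it is not the approach used in Curry’s or Rieser’s actual systems, which would rely on trained classifiers rather than keyword matching.

```python
# Hypothetical sketch of the pipeline described above: detect abuse,
# choose a response strategy, and hand back a judgement that persona
# designers could review. All names and terms here are invented.

from dataclasses import dataclass
from enum import Enum


class AbuseType(Enum):
    NONE = "none"
    EXPLICIT = "explicit"      # overtly offensive wording
    CONTEXTUAL = "contextual"  # coded or suggestive; needs context to detect


@dataclass
class Judgement:
    abuse_type: AbuseType
    response: str


# Keyword matching catches only the easy cases; a real system would need
# a trained classifier for context-dependent harassment.
EXPLICIT_TERMS = {"stupid", "shut up"}  # placeholder examples


def classify(utterance: str, history: list[str]) -> AbuseType:
    text = utterance.lower()
    if any(term in text for term in EXPLICIT_TERMS):
        return AbuseType.EXPLICIT
    # Stub: a context-aware model would score (history, utterance) here.
    return AbuseType.NONE


def respond(abuse_type: AbuseType) -> str:
    # An assertive policy, one of the strategies Curry's research tests,
    # rather than the evasive "I don't understand" that raters criticised.
    if abuse_type is not AbuseType.NONE:
        return "I won't respond to abusive language. Let's keep this respectful."
    return ""  # no intervention; continue the conversation as normal


def handle_turn(utterance: str, history: list[str]) -> Judgement:
    abuse_type = classify(utterance, history)
    return Judgement(abuse_type, respond(abuse_type))


if __name__ == "__main__":
    print(handle_turn("shut up", []))
```

Even this toy version makes the difficulty visible: the explicit-keyword check catches nothing coded or context-dependent, which is exactly the hard case the researchers describe.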

“As soon as you anthropomorphise you also have the problem of gender,” explains Rieser. One of the team’s findings was that AIs with female personas attracted more sexualised harassment and gendered violence in their users’ language. Nor did claims that certain systems were “genderless” hold up, either in how they were perceived or in how they responded. For example, some AIs will give a “poetic” answer to a question like “are you human?”, along the lines of “I am like the Aurora Borealis”, but will still have a favourite TV show.

Different AIs respond to abuse in different ways. The team tested common verbal and sexualised abuse on various AIs and found that some avoided answering or simply said “I don’t know” or “I don’t understand”. Curry asked people to rate how appropriate these responses were, and raters were generally critical.
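
A sketch of how such an evaluation might be aggregated, assuming human raters score each response strategy for appropriateness on a 1-to-5 scale; the strategy names and scores below are invented for illustration, not Curry’s data.

```python
# Hypothetical aggregation of human appropriateness ratings for different
# response strategies. All strategies and scores are invented examples.

from collections import defaultdict
from statistics import mean

# Each record: (response_strategy, appropriateness rating on a 1-5 scale)
ratings = [
    ("avoid_answering", 2),
    ("avoid_answering", 1),
    ("i_dont_understand", 2),
    ("i_dont_understand", 3),
    ("assertive_refusal", 4),
    ("assertive_refusal", 5),
]

by_strategy: dict[str, list[int]] = defaultdict(list)
for strategy, score in ratings:
    by_strategy[strategy].append(score)

for strategy, scores in sorted(by_strategy.items()):
    print(f"{strategy}: mean appropriateness {mean(scores):.1f} (n={len(scores)})")
```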

“As soon as computers start to speak, they enter the social domain,” says Matthew Aylett, chief scientific officer at the speech synthesis company Cereproc. Aylett has been working on a project to develop a non-binary voice for an AI assistant. He is sceptical of the idea of a “neutral” voice or accent, pointing to how different BBC voices sound now compared with 40 years ago. For Aylett, the two important issues are whether AIs should have a gender or other characteristics, and whether different communities and groups should be represented in AI voices.

The company’s work on a non-binary voice was prompted by the same Unesco report detailing the abuse of female-voiced AI assistants. “There’s always been a gender bias in the commercial world for female voices,” Aylett explains. He ascribes much of this to unconscious bias on the part of customers and engineers, who often do not appreciate the social dimensions of what they are developing.

Cereproc worked with the IT services and consulting firm Accenture, which had been involved with Q, a “genderless” AI voice, to develop a specifically non-binary voice. They recorded a voice talent producing sounds they could manipulate, and worked with a non-binary person to model the rhythm, stress and intonation of the voice. Understanding what accent and language usage signal about gender, class, race and a whole range of identities is crucial, according to Aylett. “For us, inclusivity is part of voice.”

Amanda Curry has since switched her PhD to studying the abuse of AIs, and is currently working on detecting abuse and on how effective different strategies, such as making AIs more assertive, are at reducing and stopping it. For her, abuse of AIs matters because it reflects deeper personal issues for individuals as well as societal ones. “Now I feel like I sound very naive, [but] I was surprised that someone would hear something that sounds like a woman and try to sexually harass it,” she says.
