7 June 2019 (updated 8 September 2021)

How AI risks replicating the prejudices of the past

Artificial intelligence has a responsibility to modernise alongside the society it serves.    

By Kriti Sharma

The biggest issue that artificial intelligence faces at the moment is not a problem of technical advancement – there are leaps being made all the time. It is about designing systems and products that humans can trust. And trust comes from transparency, responsibility, and ethical design. If an algorithm tells you to do something, you won’t do it if you don’t have confidence in its motives.

Perhaps the most important hurdle that AI needs to get over is the issue of bias. The algorithms in AI systems are trained using large datasets, and if those underlying datasets are biased, the output is likely to be biased as well.

This creates problems if you’re using AI to make decisions such as who gets a mortgage or a credit card, or who gets invited to a job interview. If the historic data contains patterns, such as a higher concentration of men in senior leadership roles, then the AI is going to base its decisions on those patterns.
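To make that mechanism concrete, here is a minimal sketch (not from the article) of how a standard classifier trained on historically skewed hiring decisions reproduces the skew. The dataset, features and numbers are all invented for illustration; the point is only that a model learns whatever pattern its labels encode, including a historic preference for one gender.

```python
# Hypothetical illustration: a classifier trained on historically biased
# hiring decisions learns to reproduce that bias. All data is invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Features: years of relevant experience, and gender encoded as 0/1 (toy encoding).
experience = rng.normal(5, 2, n)
gender = rng.integers(0, 2, n)

# Historic labels: past interviewers favoured experience *and* men,
# so gender leaks into the "ground truth" the model is asked to learn.
logits = 0.8 * (experience - 5) - 1.5 * gender
hired = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

model = LogisticRegression().fit(np.column_stack([experience, gender]), hired)

# Two otherwise identical candidates, differing only in the gender field:
candidates = np.array([[6.0, 0], [6.0, 1]])
print(model.predict_proba(candidates)[:, 1])
# The model assigns a noticeably lower "hire" probability to the second
# candidate, simply because the historic data did the same.
```

Removing the gender column is not, by itself, a fix: other features can act as proxies for it, which is why the underlying data, and not just the model, has to be examined.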

Bias can also be introduced to AI by the people working on it. I don’t believe it’s malicious, but if teams are not diverse then bias emerges, at a very low level, during the design process, and this affects the end product. The statistics on gender equality in AI are fairly depressing – women make up about 12 per cent of the workforce.

Take the growing number of voice assistant devices powered by AI. Voice assistants such as Alexa, Siri, Cortana and Google Assistant have female voices or feminine personalities. And they do mundane tasks, such as switching your lights on and off, ordering your shopping and playing your favourite music. The “male” AIs, such as IBM Watson and Salesforce Einstein, are the ones designed to make important business decisions.


It’s not just a question of gender, either. There is evidence that facial recognition systems are biased against ethnic minorities and women, because the algorithms were trained on certain kinds of faces. And background, too, makes a difference. People’s social mobility and education level impact the kinds of problems that they are interested in solving with AI, and this affects the question of who the technology is being designed for. It would certainly be a shame if the greatest technological advancement of our times wasn’t used for social good – improving healthcare for all, providing high-quality education, reducing inequality and so on.


I’m optimistic, because policymakers and legislators are now deeply interested in this topic. I find that very encouraging and refreshing. But I do feel that businesses need to take more responsibility. I genuinely believe that the future of our society should not be designed just by geeks like me. We need to allow a wider range of people – people concerned with law, ethics, anthropology and the humanities – to take part in the AI movement, rather than have it simply happen to them.

Kriti Sharma is the founder of AI for Good and technology advisor to the UK government and the United Nations.
