
Science & Tech
13 July 2015

Twitter’s new porn-spotting robot moderators

The social networking site has introduced new artificial intelligence systems that can spot and delete sexual and violent images – and spare human moderators in the process. 

By Barbara Speed

Under the compelling headline “The labourers who keep dick pics and beheadings out of your Facebook feed”, journalist Adrian Chen delved last year into the little-known world of social media’s content moderators. These thousands of workers, most based in Asia, trawl through social networking sites in order to delete or flag offensive content. In the process, they are exposed to the very worst the internet has to offer – beheadings, violent pornography, images of abuse – all for wages as low as $300 a month.

But this month, Twitter has taken a first step towards automating this process, and thus sparing a huge unseen workforce from their daily bombardment of horrors. Almost exactly a year ago, Twitter bought start-up Madbits, which offers, in the words of its co-founders, a “visual intelligence technology that automatically understands, organises and extracts relevant information from raw media”. 

At the time, tech websites speculated that Madbits’ technology would be used to develop facial recognition or photo tagging on Twitter. But in fact, the start-up’s first task was very different: it was instructed by Alex Roetter, Twitter’s head of engineering, to build a system which could find and filter out offensive images, defined by the company as “not safe for work”.

This month, Wired reported that these artificial intelligence (AI) moderators are now up and running. Roetter claims the new moderator-bots can filter out 99 per cent of offensive imagery. They also misidentify about 7 per cent of acceptable images as offensive – but, the company reasons, better safe than sorry.
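Those two figures pull in different directions: because the overwhelming majority of images posted are acceptable, even a 7 per cent false-positive rate produces far more wrongly flagged images than correctly caught ones. A rough back-of-the-envelope illustration, assuming (purely for the sake of the arithmetic – the article gives no base rate) that 1 in 100 posted images is actually offensive:

```python
# Illustrative arithmetic using the figures Roetter gave Wired.
# The 1-in-100 base rate of offensive images is an assumption, not
# something reported in the article.
total = 1_000_000
offensive = total // 100            # assumed: 1% of images are offensive
acceptable = total - offensive

caught = offensive * 99 // 100      # 99% of offensive images filtered out
missed = offensive - caught
false_positives = acceptable * 7 // 100   # 7% of acceptable images wrongly flagged

print(caught, missed, false_positives)   # 9900 100 69300
```

On these assumed numbers, roughly seven wrongly flagged images for every offensive one caught – which is why “better safe than sorry” is a genuine trade-off rather than a free win.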

Like other machine-learning systems, the moderator “learns” how to spot offensive imagery by analysing reams of pornography and gore, and then applies those learned patterns to new material. Over time, the system continues to learn, getting even better at spotting NSFW images. Soon, systems like these could replace content-moderation farms altogether.
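The learn-from-labelled-examples loop described above can be sketched in miniature. The article doesn’t describe Madbits’ actual model (modern systems use deep neural networks trained on raw pixels), so this toy uses a simple nearest-centroid classifier over synthetic feature vectors, with a deliberately permissive threshold to mirror the “better safe than sorry” policy; all names and numbers here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for image feature vectors: one cluster per class.
# (A real system would learn features from raw pixels with a deep network.)
X_nsfw = rng.normal(loc=1.0, size=(500, 16))   # labelled NSFW examples
X_ok = rng.normal(loc=-1.0, size=(500, 16))    # labelled acceptable examples

# "Training": learn what a typical example of each class looks like.
centroid_nsfw = X_nsfw.mean(axis=0)
centroid_ok = X_ok.mean(axis=0)

def nsfw_score(x):
    """Higher when x looks more like the NSFW training examples."""
    return np.linalg.norm(x - centroid_ok) - np.linalg.norm(x - centroid_nsfw)

# "Better safe than sorry": flag anything whose score clears a deliberately
# low bar, accepting some false positives to catch nearly everything offensive.
threshold = -0.5
new_images = rng.normal(loc=1.0, size=(10, 16))  # unseen NSFW-like examples
flagged = [nsfw_score(x) > threshold for x in new_images]
print(sum(flagged), "of", len(flagged), "flagged")
```

Feeding the system more labelled examples moves the centroids – that is the sense in which it “continues to learn” as moderation decisions accumulate.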


In cases like these, it’s worth remembering those whose jobs may be lost as the robots advance, especially in developing countries. But considering the psychological damage brought on by endless exposure to violent images, we can only hope that Twitter and sites like it will offer these workers less distressing moderation jobs (and higher salaries) instead.