27 August 2017

Bots on the ground: can the rise of “killer robots” be halted?

Such weapons are already being developed by national militaries and terrorist groups.

By Sanjana Varghese

Killer robots – the phrase sounds as if it were lifted from a sci-fi film. But this is no mere fantasy.

On 20 August, 116 of the world’s leading artificial intelligence (AI) and robotics researchers signed an open letter urging the United Nations to ban the development and use of killer robots. The letter, signed by tech leaders such as Tesla’s Elon Musk and Mustafa Suleyman of Alphabet’s DeepMind, cited fears that the robots would unleash a “third revolution in warfare” and start a new arms race.

Public discussion of robotics and AI has become widespread following recent technological advances. Even so, the idea of murderous machines can seem far-fetched. Yet such weapons are already being developed by national militaries as well as terrorist groups. In the absence of international action, the world risks becoming still more hazardous.

A killer robot in the vein of RoboCop remains a distant prospect. Depending on the definition, a landmine, technically the oldest automatically triggered weapon, could fall under the umbrella. In their current iterations, however, killer robots, or lethal autonomous weapons (Laws), are military robots that can identify and attack targets without human intervention.

Most Laws still maintain a human “on the loop”, but many experts have predicted that fully autonomous ones could become a reality as soon as 2040. Laws come in many guises, for both attack and defence purposes, and they are already being used by countries with advanced military capabilities, such as the US, Germany and India.

Israel’s Iron Dome is a defensive Law, with autonomous targeting and firing capabilities, and is renowned for its efficiency in destroying incoming rockets. The US navy has used radar-guided guns to identify and attack incoming threats since the 1970s. A 2016 Wired article revealed that the Pentagon is currently researching at least 21 projects involving autonomous weapons. The UK, meanwhile, is working with the arms company BAE Systems to develop the Taranis drone, which would be almost invisible to radar and could incorporate full autonomy by the end of testing.

The Campaign to Stop Killer Robots (CSKR) has been calling for a pre-emptive ban on Law development and use since 2013. The weapons, the group argued, should be added to the list of those banned in 1983 by the UN Convention on Certain Conventional Weapons. Leading tech companies, researchers and organisations such as Human Rights Watch have also been successful in drawing international attention to the issue of killer robots.

In 2015, CSKR organised an open letter, also signed by Musk and Stephen Hawking, which served as the catalyst for the agreement to hold UN talks. The group was also instrumental in timing the publication of the latest letter to coincide with the International Joint Conference on Artificial Intelligence in Melbourne.

Not all tech experts believe that a ban on Laws is justified. In a research paper for Stanford University, Kenneth Anderson and Matthew Waxman argued, “Just as increased automation in many fields is inevitable, automation in weapons will occur, and is occurring, incrementally” – echoing the sentiment of many other legal researchers.

Military experts have highlighted the difficulty of distinguishing autonomous weapons from other armaments. Meanwhile, the Ministry of Defence responded to the recent open letter by stating that the British government does not support a pre-emptive ban on Laws (though it does not intend to develop the technology). Yet others maintain that a comprehensive ban is the only safe course.

The legal, ethical and humanitarian implications of Laws are profound. Experts have warned that some harmful consequences may not be clear until it is too late. As the latest open letter stated: “They [the weapons] will permit armed conflict to be fought at a scale greater than ever and at timescales faster than humans can comprehend.”

Human Rights Watch has warned of the difficulty of ascribing individual responsibility in the event of an accidental death or a misfire, as well as the danger of “undesirable hacking”. The weapons also risk being deployed by terrorists and extremists of all kinds.

Some of those fears have already been confirmed. A lengthy report on drones by the New America think tank stated that Isis created an “Unmanned Aircraft of the Mujahedeen” unit in 2017. Other terrorist groups such as Hezbollah have long used drones and other unmanned vehicles to carry out surveillance and counter-attacks.

At the end of 2016, 123 countries agreed to hold talks on Laws under the auspices of the UN Convention on Certain Conventional Weapons. But the meetings have yet to take place; the most recent was delayed owing to member states’ unpaid fees. Time is short to halt the rise of “killer robots”. As the open letter warned: “Once this Pandora’s box is opened, it will be hard to close.”

This article appears in the 21 Feb 2018 issue of the New Statesman, Sunni vs Shia