10 November 2018
Britain funds research into drones that decide who they kill, says report
by Jamie Doward
The Guardian


American military aerial drones. Photograph: Northrop Grumman/EPA

Technologies that could unleash a generation of lethal weapons systems requiring little or no human interaction are being funded by the Ministry of Defence, according to a new report.

The development of autonomous military systems – dubbed “killer robots” by campaigners opposed to them – is deeply contentious. Earlier this year, Google withdrew from the Pentagon’s Project Maven, which uses machine learning to analyse video feeds from drones, after ethical objections from the tech giant’s staff.

The government insists it “does not possess fully autonomous weapons and has no intention of developing them”. But, since 2015, the UK has declined to support proposals put forward at the UN to ban them. Now, using government data, Freedom of Information requests and open-source information, a year-long investigation reveals that the MoD and defence contractors are funding dozens of artificial intelligence programmes for use in conflict.

“Despite public statements that the UK has no intention of developing lethal autonomous weapon systems, there is tangible evidence that the MoD, military contractors and universities in the UK are actively engaged in research and the development of the underpinning technology with the aim of using it in military applications,” said Peter Burt, author of the new report Off the Leash: The Development of Autonomous Military Drones in the UK, produced by Drone Wars UK, which campaigns against the development of unmanned systems.

An RAF Reaper UAV drone. Photograph: Cpl Steve Bain ABIPP/MoD/Crown C/PA

In one example, the report claims the MoD is trialling a “predictive cognitive control system” that has been deployed in live operations at the Joint Forces Intelligence Centre at RAF Wyton. The system takes huge quantities of highly complex data, beyond the comprehension of analysts, and uses deep learning neural networks to make predictions about future events and outcomes that will be of “direct operational relevance” to the armed forces.

This raises concerns about what happens if a future weapon system is fed erroneous data or its links to human command, which can block the system’s use of lethal force, are disrupted. Such a scenario is not too far off, Drone Wars believes.

“We have already seen the development of drones in Britain which have advanced autonomous capabilities, such as the Taranis stealth drone developed by BAE Systems, and the development of a truly autonomous lethal drone in the foreseeable future is now a real possibility,” Burt said.

The Taranis supersonic stealth aircraft is an experimental drone which, according to BAE, can “hold an adversary at continuous risk of attack ... penetrate deep inside hostile territory, find a target, facilitate either kinetic or non-kinetic influence upon it, assess the effect achieved, and provide intelligence back to commanders”.

It has been described by the MoD as a “fully autonomous” aircraft. Lord Drayson, a former minister for defence procurement, has said it would have “almost no need for operator input”.

The government appears to concede that the development of autonomous weapons systems is inevitable. Gavin Williamson, secretary of state for defence, has spoken of “taking our intelligence, surveillance and reconnaissance capability to the next level” using artificial intelligence.

The Ministry of Defence has claimed that unmanned aircraft “will eventually have the ability to independently locate and attack mobile targets, with appropriate proportionality and discrimination, but probably not much before 2030.”

In testimony to the all-party parliamentary group on drones, Professor Stuart Russell, professor of electrical engineering and computer sciences at the University of California, claimed that an improvised small armed autonomous drone was something that a competent group could develop and build in large numbers within 18 months to two years.

Researchers at Arizona State University have catalogued 273 weapon systems around the world which already have some autonomous functions. Nineteen of the systems are described as unmanned aerial vehicles.

A spokesman for the MoD said: “There is no intent within the MOD to develop weapon systems that operate entirely without human input. Our weapons will always be under human control as an absolute guarantee of oversight, authority and accountability.”
