A UNSW Canberra military ethicist examines the arguments for and against the adoption of so-called “killer robots” in a new book.
Associate Professor Deane-Peter Baker said his time on the International Panel on the Regulation of Autonomous Weapons (IPRAW) from 2017-19, along with subsequent policy work, informed Should We Ban Killer Robots?, released on Monday.
Professor Baker said he had no doubt more robots would take part in future wars (the UN believes such weapons claimed their first victims in the Yemen conflict), but he thought they were “unlikely to play much more than a supplementary role for some time to come.”
Lethal autonomous weapons systems (LAWS) operate without a person “in the loop”. Professor Baker believes that sophisticated versions will be expensive and rare in the medium term, while simpler systems will be limited in what they can do, leading him to conclude that a Terminator-like scenario is a long way off, if it ever arrives at all.
Artificial intelligence researchers have been warning for decades about the development of LAWS, which are able to identify, track and destroy targets, wrote UNSW AI Professor Toby Walsh last year.
According to Professor Baker, the two main arguments for a ban are that robots would be unable to follow the laws of armed conflict, killing indiscriminately and without any sense of proportion, and that machines should never have the agency to decide whether a person lives or dies.

An argument against a ban is that killer robots could save lives as well as take them.
“For example, there is the claim that robots can be sent to do ‘dull, dangerous and dirty’ jobs without having to risk a human soldier, sailor or aviator – far better for a machine to get destroyed than for a member of the armed forces to be killed or maimed,” the ethicist said.
“They also argue that LAWS will be less prone to using indiscriminate force, because they don’t get scared, angry or confused in the way that human combatants can in the midst of combat.”
Do you know more? Contact James Riley via email.