The development of lethal autonomous weapons systems (LAWS) that select targets without human intervention could violate fundamental principles of human dignity, according to one AI expert.
Stuart Russell, professor of computer science at the University of California, Berkeley, highlighted in the journal Nature the ethical decision faced by the artificial intelligence (AI) and robotics communities about whether to oppose or support the development of such systems.
"LAWS could violate fundamental principles of human dignity by allowing machines to choose whom to kill," Russell said. "For example, they might be tasked to eliminate anyone exhibiting 'threatening behaviour'.
"Despite the limits imposed by physics, one can expect platforms deployed in the millions, the agility and lethality of which will leave humans utterly defenceless. This is not a desirable future."
LAWS could include armed quadcopters or self-driving tanks capable of tracking and eliminating enemy combatants.
This potential to transform modern warfare so profoundly has led some to describe LAWS as the third revolution in warfare, after gunpowder and nuclear arms.
The technology needed to make LAWS is just a few years away, according to Russell, with two programmes from the US Defense Advanced Research Projects Agency (DARPA) already foreshadowing planned uses of such automated systems.
Currently, international humanitarian law makes no specific provision for autonomous military weapons. However, the United Nations has held meetings on the implications of LAWS and is expected to continue doing so until a decision about their future is made.
Russell provided expert testimony at the UN's Convention on Certain Conventional Weapons (CCW) in April, during which different countries took opposing stances on what measures should be taken.
While Germany said that it would "not accept that the decision over life and death is taken solely by an autonomous system", the UK, US, and Israel said that an international treaty on the use of LAWS is unnecessary.
"Debates should be organized at scientific meetings; arguments studied by ethics committees; position papers written for society publications; and votes taken by society members," Russell said. "Doing nothing is a vote in favour of continued development and deployment."