Human Rights Watch has released a report entitled Losing Humanity, which calls for a ban on fully autonomous military weapons systems.

An MQ-1 Predator unmanned aircraft, which has been used by the US military in Afghanistan and Turkey.

Judgement Day was meant to happen on August 29, 1997. At 2.14am Eastern Time, the machines were meant to become self-aware and launch a nuclear strike against Russia, thus triggering a global nuclear war and the beginning of the end of the human race.

This of course never happened, because the Terminator and Skynet were only horrifying visions of the future dreamed up by James Cameron in his 1984 film.

However, a real judgement day may not be that far off, as governments around the world pour billions of pounds into the development of autonomous weapons systems which are slowly but surely taking control out of the hands of humans.

Playing the role of Sarah Connor and attempting to warn the world of impending doom is Human Rights Watch (HRW), which has just released a report called Losing Humanity - The Case against Killer Robots.

Prohibit

The report calls for all states to "prohibit the development, production, and use of fully autonomous weapons through an international legally binding instrument."

It also wants the adoption of national laws and policies to prevent this happening and a review to be undertaken "of technologies and components that could lead to fully autonomous weapons."

It calls on roboticists and others involved in the development of robotic weapons to draft a professional code of conduct governing research and development of autonomous robotic weapons "especially those capable of becoming fully autonomous."

However, with the US government currently spending $6bn a year on autonomous weapons and their development, it is unlikely to be willing to throw that money away, particularly when it is espousing the benefits of these automated systems.

The main advantage of these systems is a reduction in military casualties. This is a powerful and emotive tool in trying to sell the concept to the public. It is hard to argue with something which aims to reduce the death toll of war.

Fully autonomous

While we are still some way off having fully autonomous weapons in the wild, the HRW report details precursors which are already in operation and prototypes which claim to have the capability to remove human interaction completely.

The US military has openly stated that it is seeking to reduce human influence on its weapons systems, adding that it is looking to make its ground vehicles "fully autonomous."

The UK Ministry of Defence, however, stated in 2011 that it "currently has no intention to develop systems that operate without human intervention in the weapon command and control chain."

The recent conflict in Gaza has brought one of these precursors to the attention of the world, in the shape of Israel's automated defence system known as the Iron Dome. This may be seen as a defensive system rather than an offensive one, but there are still concerns about the amount of human oversight of these systems.

Within a second of detecting an incoming threat, the Iron Dome sends a recommended response to an operator. For the system to be effective, the operator must decide almost immediately whether or not to give the command to fire.

Robotic warfare expert Peter W. Singer has voiced concerns about the amount of human supervision these systems need:

"The human is certainly part of the decision making but mainly in the initial programming of the robot. During the actual operation of the machine, the operator really only exercises veto power, and a decision to override a robot's decision must be made in only half a second, with few willing to challenge what they view as the better judgment of the machine."

Israel is one of a number of countries which employ automated defence systems like this; the US has used its Phalanx Close-In Weapon System on aircraft carriers since as far back as 1980.

Unmanned aircraft

While the UK may not employ an automated defence system such as Iron Dome or Phalanx, it is heavily invested in another step on the road to fully automated weapons - unmanned aircraft.

These aircraft are moving beyond the capabilities of the Predator drones currently being employed in Afghanistan by the US military.

An image of the Taranis, described as "an autonomous and stealthy unmanned aircraft" that aims to strike "targets with real precision at long range, even in another continent."

Back in 2010, the UK's Ministry of Defence (MoD) unveiled a prototype of its Taranis combat aircraft (above), which designers described as "an autonomous and stealthy unmanned aircraft" that aims to strike "targets with real precision at long range, even in another continent."

The MoD said the Taranis would remain what is called a human-in-the-loop weapon, meaning it would stay under human control: "Should such systems enter into service, they will at all times be under the control of highly trained military crews on the ground."

Robotics professor Noel Sharkey has his doubts about the validity of this statement: "We need to know if this means the robot planes will choose their own targets and destroy them - because they certainly will not have the intelligence to discriminate between civilians and combatants."

Other countries are not so shy about talking up the potential of their unmanned planes.

Israel's Harpy has been described as a combination of an unmanned aerial vehicle and a cruise missile. It is designed to fly "autonomously to the patrol area." Once there, it seeks to detect hostile radar signals and then destroy a target with a high explosive warhead.
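The Harpy illustrates the step beyond a veto window: once launched, the sense-decide-act loop contains no operator at all. Below is a minimal sketch of that control flow, again with every name (RadarContact, detect_hostile_radar, patrol) invented purely for illustration.

```python
import random
from dataclasses import dataclass
from typing import Optional


@dataclass
class RadarContact:
    bearing_deg: float
    signal_strength: float


def detect_hostile_radar() -> Optional[RadarContact]:
    """Stand-in for the seeker; here it just simulates a detection."""
    if random.random() < 0.1:
        return RadarContact(bearing_deg=random.uniform(0, 360),
                            signal_strength=random.random())
    return None


def patrol(max_loiter_steps: int) -> str:
    """The entire engagement decision happens inside this loop --
    there is no operator console and no veto window."""
    for _ in range(max_loiter_steps):
        contact = detect_hostile_radar()
        if contact is not None:
            return f"ENGAGE bearing {contact.bearing_deg:.0f}"
    return "RETURN_TO_BASE"  # loiter limit reached, no emitter found


print(patrol(max_loiter_steps=1000))
```

Nothing in the loop waits on a human; the only inputs are the seeker and a loiter limit.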

Singer sums up the belief that this process is only just beginning: "Predators are merely the first generation - the equivalent of the Model T Ford or the Wright Brothers' Flyer."

Accountability

Whatever the advances in technology and in these weapons' ability to do the job, the real question they throw up is: who is going to take responsibility for the killing they carry out?

The report says robots with complete autonomy would be incapable of meeting international humanitarian law standards.

"The rules of distinction, proportionality, and military necessity are especially important tools for protecting civilians from the effects of war, and fully autonomous weapons would not be able to abide by those rules," the report states.

For example, distinguishing between a fearful civilian and a threatening enemy combatant requires a soldier to understand the intentions behind a human's actions, something a robot could not do.

In addition, fully autonomous weapons would likely contravene the Martens Clause, which prohibits weapons that run counter to the "dictates of public conscience."

The authors of the report believe this could lead to "emotionless robots" serving as tools of repressive dictators seeking to crack down on their own people without fear that their troops would turn on them.

"Although relying on machines to fight war would reduce military casualties-a laudable goal-it would also make it easier for political leaders to resort to force since their own troops would not face death or injury."

Think about it this way.

If a fully autonomous weapon carried out an unlawful killing, who is responsible? Options include the military commander, the programmer, the manufacturer, and even the robot itself, but none of these options is satisfactory.

The report says: "Since there is no fair and effective way to assign legal responsibility for unlawful acts committed by fully autonomous weapons, granting them complete control over targeting decisions would undermine yet another tool for promoting civilian protection."