Killer robots should be banned before they can be developed, the United Nations has been told.
Under current international law, military commanders who use fully autonomous weapons would escape liability for deaths caused, a report by Harvard Law School says.
The report – called The Lack of Accountability for Killer Robots – was co-authored by Human Rights Watch and released ahead of a UN meeting in Geneva on 13 April.
The current generation of drones always has an individual overseeing its actions and deciding whether to fire a missile, a point cited in their defence by the likes of the Ministry of Defence.
But newer machines may be able to select their own targets, a move which concerns some human rights campaigners and legal experts.
The report says: “Fully autonomous weapons do not yet exist but technology is moving in their direction, and precursors are already in use or development.
“For example, many countries use weapons defence systems – such as the Israeli Iron Dome and the US Phalanx and C-RAM – that are programmed to respond automatically to threats from incoming munitions.”
It adds: “The lack of meaningful human control places fully autonomous weapons in an ambiguous and troubling position.
“On the one hand, while traditional weapons are tools in the hands of human beings, fully autonomous weapons, once deployed, would make their own determinations about the use of lethal force.”
The report authors call for the development and production of such weapons to be banned internationally.
Source: Sky News