United Nations to Host Conference on the Use of Autonomous Weapons

Tesla’s Elon Musk and the Campaign to Stop Killer Robots have called for a ban on the use of militarized robots and algorithms.

The United Nations is set to open its first official talks on the use of autonomous weapons, but a treaty governing so-called killer robots remains far off, the ambassador leading the discussions said Friday.

Activists and tech leaders including Tesla's Elon Musk have called on the UN to ban fully automated weapons systems, which they say could revolutionize warfare while putting civilians at heightened risk.

The Conference on Disarmament will on Monday open five days of talks on the weaponry, but those calling for a ban will not be satisfied, said Indian ambassador Amandeep Gill, who is chairing the meeting.

"It would be very easy to just legislate a ban but I think... rushing ahead in a very complex subject is not wise," he told reporters. "We are just at the starting line."

He said the discussion, which will also include civil society and technology companies, will be partly focused on understanding the types of weapons in the pipeline.

Proponents of a ban, including the Campaign to Stop Killer Robots pressure group, insist that human beings must ultimately be responsible for the final decision to kill or destroy.

They argue that any weapons system that delegates the decision on an individual strike to an algorithm is by definition illegal, because computers cannot be held accountable under international humanitarian law.

Gill said there was agreement among nations that "human beings have to remain responsible for decisions that involve life and death."

But, he said, there are varying opinions on the mechanics through which "human control" must govern deadly weapons.

The International Committee of the Red Cross, which is mandated to safeguard the laws of conflict, has also not called for a ban, but has underscored the need to place limits on autonomous weapons.

"Our bottom line is that machines can't apply the law and you can't transfer responsibility for legal decisions to machines," Neil Davison of the ICRC's arms unit told AFP.

He highlighted the problem posed by weapons systems with major variables in the timing or location of an attack, such as a system deployed for multiple hours and programmed to strike whenever it detects an enemy target.

"Where you have a degree of unpredictability or uncertainty in what's going to happen when you activate this weapons system then you are going to start to have problems for legal compliance," he said.
