Harvard and Human Rights Watch Call For A Ban On “Killer Robots” Amid Geneva Talks

By Joseph Jankowski

A new report from the Harvard Law School International Human Rights Clinic and Human Rights Watch is calling for a ban on lethal autonomous weapons, fearing the implications of handing life or death situations over to “killer robots.”

“Machines have long served as instruments of war, but historically humans have directed how they are used,” Bonnie Docherty, senior researcher in the arms division at Human Rights Watch, said in a statement.

“Now there is a real threat that humans would relinquish their control and delegate life-and-death decisions to machines,” she added.

The report suggests that states should adopt an international, legally binding instrument that prohibits the development, production, and use of fully autonomous weapons.

It also calls for national laws or policies that establish similar prohibitions on fully autonomous weapons.

“A requirement to maintain human control over the use of weapons would eliminate many of the problems associated with fully autonomous weapons,” states the report. “Such a requirement would protect the dignity of human life, facilitate compliance with international humanitarian and human rights law, and promote accountability for unlawful acts.”

This week, member states of the United Nations are meeting in Geneva to discuss the legality of lethal autonomous weapons.

The talks are focused on understanding and controlling the way artificial intelligence technology is improving, and the implications that could have for weapons like Predator and Reaper drones.

Concern over the implications of giving robots the ability to kill has been voiced by many experts, including Alan Winfield, an electronic engineer at the University of the West of England.

“It means that humans are deprived of moral responsibility,” Winfield told attendees at the World Economic Forum in Davos in January. “When you put a robot in a chaotic environment, it behaves chaotically.”

Roger Carr, chairman of the British aerospace and defense group BAE Systems, also believes autonomous killer robots pose a grave danger to humans.

“If you remove ethics and judgment and morality from human endeavor whether it is in peace or war, you will take humanity to another level which is beyond our comprehension,” Carr warned at Davos. “You equally cannot put something into the field that, if it malfunctions, can be very destructive with no control mechanism from a human. That is why the umbilical link, man to machine, is not only to decide when to deploy the weapon but it is also the ability to stop the process. Both are equally important.”

To get a glimpse of where lethal autonomous weapons currently stand, consider the robotic “war balls” being tested with the Marines.

From Defense One:

A research team from Stamford, Conn. has developed an amphibious drone that they are currently testing with the Marines. The GuardBot is a robot ball that swims over water at about 4 miles per hour and then rolls along the beach, at as much as a 30-degree incline and 20 miles per hour.

It uses nine-axis stabilization and a “pendulum motion” propulsion system, which moves the bot forward by shifting its center of gravity back and forth, along with a variety of steering algorithms.

It took creator Peter Muhlrad some seven years to develop, but now that it’s complete, Muhlrad says it can be rapidly produced in various sizes. Company documents suggest it can be scaled to units as small as 10 cm or as large as nine feet. The company is planning to develop a prototype that’s 6 feet in diameter.

Muhlrad’s company, GuardBot Inc., has a cooperative research and development agreement, or CRADA, with the Navy. A CRADA is a legal framework that allows private companies or researchers to use government facilities, research, and resources to build things that are mutually beneficial to both parties. The information that the researcher discovers is protected for up to five years. Under many CRADAs the researcher does not receive money from the government but has the right to commercialize what he or she produces. The government retains a use license.
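For readers curious how a “pendulum motion” drive works in principle, the toy Python sketch below is purely illustrative and is not GuardBot’s software: an internal mass tilts to shift the ball’s center of gravity off its contact point, gravity then rolls the sphere toward the offset, and a sideways swing of the same mass nudges the heading. Every equation, gain, and name in it is an assumption made for explanation only.

```python
# Illustrative sketch only: a toy "pendulum drive" model for a spherical robot,
# loosely inspired by the description above. Not GuardBot's actual control code;
# all dynamics and parameters are assumptions.

import math
from dataclasses import dataclass


@dataclass
class SphereBotState:
    x: float = 0.0          # position along the ground (m)
    speed: float = 0.0      # forward speed (m/s)
    heading: float = 0.0    # heading angle (rad)


def pendulum_drive_step(state: SphereBotState,
                        pendulum_angle: float,
                        steer_angle: float,
                        dt: float = 0.05) -> SphereBotState:
    """Advance the toy model by one time step.

    Tilting the internal pendulum forward (pendulum_angle > 0) shifts the
    center of gravity ahead of the contact point, producing a torque that
    rolls the sphere forward. Swinging the pendulum sideways (steer_angle)
    nudges the heading. The dynamics are deliberately simplified
    (linear drag, idealized torque).
    """
    GRAVITY_GAIN = 2.0      # assumed: how strongly the CoG offset accelerates the ball
    DRAG = 0.8              # assumed: rolling-resistance coefficient
    STEER_GAIN = 0.5        # assumed: heading response to a lateral pendulum swing

    accel = GRAVITY_GAIN * math.sin(pendulum_angle) - DRAG * state.speed
    new_speed = state.speed + accel * dt
    new_heading = state.heading + STEER_GAIN * steer_angle * new_speed * dt
    new_x = state.x + new_speed * dt
    return SphereBotState(x=new_x, speed=new_speed, heading=new_heading)


if __name__ == "__main__":
    # Hold a constant forward tilt and watch the speed settle to a steady value,
    # the balance between gravity-induced torque and rolling resistance.
    state = SphereBotState()
    for _ in range(200):
        state = pendulum_drive_step(state, pendulum_angle=0.4, steer_angle=0.0)
    print(f"position={state.x:.1f} m, speed={state.speed:.2f} m/s")
```

Run as a script, the model settles to a steady cruising speed, a loose analogue of the constant speeds quoted for the real robot above.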

According to Peter Muhlrad, the GuardBot could carry explosives.


Joseph Jankowski is a contributor for PlanetFreeWill.com. His work has been published by recognizable alternative news sites like GlobalResearch.ca, ActivistPost.com and Intellihub.com.

Follow Planet Free Will on Twitter @ twitter.com/PlanetFreeWill

