Monday, November 13, 2017

Killer Robots

Technology now exists to create autonomous weapons that can select and kill human targets without supervision. The military has been one of the largest funders and adopters of artificial intelligence technology. The same computing techniques help robots fly, navigate terrain, and patrol beneath the seas. Hooked up to a camera feed, image recognition algorithms can scan video footage for targets better than a human can. An automated sentry guarding South Korea’s border with the North draws on this technology to spot and track targets up to 4km away.
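As a rough illustration of how off-the-shelf this kind of image recognition has become, here is a minimal sketch (my own, not from the Guardian piece) that runs a pretrained object detector from the open-source torchvision library over a single video frame; the file name frame.jpg and the 0.8 confidence threshold are placeholders, and a torchvision version of 0.13 or later is assumed.

import torch
from PIL import Image
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import to_tensor

# Load a COCO-pretrained detector; no custom training is required.
model = fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

# "frame.jpg" stands in for a still grabbed from any camera feed.
frame = to_tensor(Image.open("frame.jpg").convert("RGB"))

with torch.no_grad():
    result = model([frame])[0]  # dict with "boxes", "labels", "scores"

# COCO class 1 is "person"; keep only reasonably confident detections.
for label, score, box in zip(result["labels"], result["scores"], result["boxes"]):
    if label.item() == 1 and score.item() > 0.8:
        print(f"person at {[round(v) for v in box.tolist()]} "
              f"(confidence {score.item():.2f})")

The point is not this particular model but that person-level detection in video now takes a few lines of freely available code, which is part of why critics worry the underlying technology is so cheap to reproduce.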

Stuart Russell, a leading AI scientist at the University of California, Berkeley, warned that the manufacture and use of autonomous weapons, such as drones, tanks and automated machine guns, would be devastating for human security and freedom, and that the window to halt their development is closing fast. While military drones have long been flown remotely for surveillance and attacks, autonomous weapons armed with explosives and target recognition systems are now within reach and could locate and strike without deferring to a human controller. Because AI-powered machines are relatively cheap to manufacture, critics fear that autonomous weapons could be mass produced and fall into the hands of rogue nations or terrorists, who could use them to suppress populations and wreak havoc.

“Pursuing the development of lethal autonomous weapons would drastically reduce international, national, local, and personal security,” Russell said. Scientists used a similar argument to convince presidents Lyndon Johnson and Richard Nixon to renounce the US biological weapons programme and ultimately bring about the Biological Weapons Convention. In August, more than 100 of the world’s leading robotics and AI pioneers called on the UN to ban the development and use of killer robots. The open letter, signed by Tesla’s chief executive, Elon Musk, and Mustafa Suleyman, a co-founder of Alphabet’s DeepMind AI unit, warned that an urgent ban was needed to prevent a “third revolution in warfare”, after gunpowder and nuclear arms. So far, 19 countries have called for a ban, including Argentina, Egypt and Pakistan.

Noel Sharkey, emeritus professor of AI at Sheffield University and chair of the International Committee on Robot Arms Control, who warned about the dangers of autonomous weapons 10 years ago, explained: “There is an emerging arms race among the hi-tech nations to develop autonomous submarines, fighter jets, battleships and tanks that can find their own targets and apply violent force without the involvement of meaningful human decisions. It will only take one major war to unleash these new weapons, with tragic humanitarian consequences and destabilisation of global security.”

In 2015, the UK government opposed an international ban on killer robots. The Foreign Office said it saw no need for a prohibition because international humanitarian law already regulated the area. According to the Campaign to Stop Killer Robots, a number of nations, including the US, China, Russia, Israel, South Korea and the UK, are moving toward systems that would give “greater combat autonomy” to machines.

https://www.theguardian.com/science/2017/nov/13/ban-on-killer-robots-urgently-needed-say-scientists
