Leading AI researchers, scientists call for a ban on autonomous weapons

July 27, 2015
Ban offensive autonomous weapons before they start an arms race or a war: That's the demand of artificial intelligence and robotics researchers who joined more than 1,800 people in signing an open letter published Monday.

Physicist Stephen Hawking, Tesla CEO Elon Musk and Apple co-founder Steve Wozniak are among the signatories of the letter, published at the International Joint Conference on Artificial Intelligence, which runs through Thursday in Buenos Aires.

They warned that weapons that select and destroy targets autonomously, without human intervention, would be likely to start a new global arms race -- and that such weapons could become feasible within a few years, given the current state of AI research.

Some consider autonomous weapons to be the third revolution in warfare, after the inventions of gunpowder and nuclear bombs. Such weapons might include armed quadcopters able to search for and eliminate people meeting certain pre-defined criteria. Existing weapons such as cruise missiles or remotely piloted drones are not considered autonomous, because humans choose their targets; the missing piece is artificial intelligence software capable of making those targeting decisions on its own.

If we accept that wars must be waged, there are arguments both for and against autonomous weapons, the letter said: replacing human soldiers with machines reduces casualties for the side that deploys them, but it also lowers the threshold for going to battle.

The key question humanity faces today is whether to start a global AI arms race or to prevent it from starting, the scientists wrote, warning that it will only take one major military power developing AI weapons to make a global arms race almost inevitable.

"The endpoint of this technological trajectory is obvious: Autonomous weapons will become the Kalashnikovs of tomorrow," they wrote.

With no need for the costly, hard-to-obtain radioactive isotopes that nuclear weapons require, autonomous weapons will be easy and cheap to mass-produce, and will inevitably end up on the black market and in the hands of terrorists, dictators and warlords, they warned. Such weapons, they added, will be ideal for assassinations, destabilizing nations, subduing populations and ethnic cleansing.

"We therefore believe that a military AI arms race would not be beneficial for humanity," the signatories said, concluding that it "should be prevented by a ban on offensive autonomous weapons beyond meaningful human control."

The full text of the letter, and its signatories, can be found on the website of the Boston-based Future of Life Institute, a think-tank that says it is working to mitigate existential risks facing humanity. Its founders include Skype co-founder Jaan Tallinn, and its scientific advisory board includes Hawking, Musk and actors Alan Alda and Morgan Freeman.

Peter Sayer covers general technology breaking news for IDG News Service, with a special interest in open source software and related European intellectual property legislation. Send comments and news tips to Peter at peter_sayer@idg.com.