Posted: 2017-08-20T14:31:38Z | Updated: 2017-08-21T15:02:24Z

By Ariel Conn

Founders of AI/robotics companies, including Elon Musk (Tesla, SpaceX, OpenAI) and Demis Hassabis and Mustafa Suleyman (Google's DeepMind), call for an autonomous weapons ban as the United Nations delays negotiations.

Leaders from AI and robotics companies around the world have released an open letter calling on the United Nations to ban autonomous weapons, often referred to as "killer robots."

Founders and CEOs of nearly 100 companies from 26 countries signed the letter, which warns:

Lethal autonomous weapons threaten to become the third revolution in warfare. Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend.

In December, 123 member nations of the U.N. agreed to move forward with formal discussions about autonomous weapons, with 19 members already calling for an outright ban. However, the next stage of discussions, originally scheduled to begin on Aug. 21, the release date of the open letter, was postponed because a small number of nations hadn't paid their fees.

The letter was organized and announced by Toby Walsh, a prominent AI researcher at the University of New South Wales in Sydney, Australia. In an email, he noted that, sadly, the U.N. did not begin its formal deliberations around lethal autonomous weapons today. Walsh continued:

There is, however, a real urgency to take action here and prevent a very dangerous arms race. This open letter demonstrates clear concern and strong support for this from the Robotics & AI industry.

The open letter included such signatories as:

  • Elon Musk, founder of Tesla, SpaceX and OpenAI (U.S.)
  • Demis Hassabis, founder and CEO at Google's DeepMind (U.K.)
  • Mustafa Suleyman, founder and Head of Applied AI at Google's DeepMind (U.K.)
  • Esben Østergaard, founder & CTO of Universal Robotics (Denmark)
  • Jérôme Monceaux, founder of Aldebaran Robotics, makers of Nao and Pepper robots (France)
  • Jürgen Schmidhuber, leading deep learning expert and founder of Nnaisense (Switzerland)
  • Yoshua Bengio, leading deep learning expert and founder of Element AI (Canada)

In reference to the signatories, the press release for the letter added: "Their companies employ tens of thousands of researchers, roboticists and engineers, are worth billions of dollars and cover the globe from North to South, East to West: Australia, Canada, China, Czech Republic, Denmark, Estonia, Finland, France, Germany, Iceland, India, Ireland, Italy, Japan, Mexico, Netherlands, Norway, Poland, Russia, Singapore, South Africa, Spain, Switzerland, U.K., United Arab Emirates and USA."

Bengio explained why he signed, saying, "the use of AI in autonomous weapons hurts my sense of ethics." He added that the development of autonomous weapons "would be likely to lead to a very dangerous escalation," and that "it would hurt the further development of AI's good applications." He concluded his statement to the Future of Life Institute saying that "this is a matter that needs to be handled by the international community, similarly to what has been done in the past for some other morally wrong weapons (biological, chemical, nuclear)."

Stuart Russell, another of the world's preeminent AI researchers and founder of Bayesian Logic Inc., added:

Unless people want to see new weapons of mass destruction in the form of vast swarms of lethal microdrones spreading around the world, it's imperative to step up and support the United Nations' efforts to create a treaty banning lethal autonomous weapons. This is vital for national and international security.

Ryan Gariepy, founder & CTO of Clearpath Robotics, was the first to sign the letter. For the press release, he noted, "Autonomous weapons systems are on the cusp of development right now and have a very real potential to cause significant harm to innocent people along with global instability."

The open letter ends with similar concerns. It states:

These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora's box is opened, it will be hard to close. We therefore implore the High Contracting Parties to find a way to protect us all from these dangers.

The letter was announced in Melbourne, Australia at the International Joint Conference on Artificial Intelligence (IJCAI), which draws many of the world's top artificial intelligence researchers. Two years ago, at the last IJCAI meeting, Walsh released another open letter, which called on countries to avoid engaging in an AI arms race. To date, that previous letter has been signed by over 20,000 people, including over 3,100 AI/robotics researchers.


Read the letter here.