Is An AI Arms Race Inevitable?
Posted: March 9, 2017
Unlike nuclear weapons, this new class can potentially target by traits like race or even by what people have liked on social media.
Replicas of a North Korean Scud-B missile (C) and South Korean Hawk surface-to-air missiles are displayed at the Korean War Memorial in Seoul on March 6, 2017.
JUNG YEON-JE via Getty Images

By Ariel Conn

AI Arms Race Principle: An arms race in lethal autonomous weapons should be avoided.*

Perhaps the scariest aspect of the Cold War was the nuclear arms race. At its peak, the U.S. and Russia together held over 70,000 nuclear weapons, a mere fraction of which would have been enough to kill every person on Earth. As the race to create increasingly powerful artificial intelligence accelerates, and as governments increasingly test AI capabilities in weapons, many AI experts worry that an equally terrifying AI arms race may already be underway.

In fact, at the end of 2015, the Pentagon requested $12-$15 billion in its 2017 budget for AI and autonomous weaponry, and Robert Work, the deputy defense secretary at the time, said he wanted “our competitors to wonder what’s behind the black curtain.” Work also said that the new technologies were “aimed at ensuring a continued military edge over China and Russia.”

But the U.S. does not have a monopoly on this technology, and many fear that countries with lower safety standards could quickly pull ahead. Without adequate safety measures in place, autonomous weapons could be more difficult to control, could pose even greater risk of harm to innocent civilians, and could more easily fall into the hands of terrorists, dictators, reckless states, or others with nefarious intentions.

Anca Dragan, an assistant professor at UC Berkeley, described the possibility of such an AI arms race as “the equivalent of very cheap and easily accessible nuclear weapons.”

“And that would not fare well for us,” Dragan added.

Unlike nuclear weapons, this new class of WMD can potentially target by traits like race or even by what people have liked on social media.

Lethal Autonomous Weapons

Toby Walsh, a professor at UNSW Australia, took the lead on the 2015 autonomous weapons open letter, which calls for a ban on lethal autonomous weapons and has been signed by over 20,000 people. With regard to that letter and the AI Arms Race Principle, Walsh explained:

“One reason that I got involved in these discussions is that there are some topics I think are very relevant today, and one of them is the arms race that’s happening amongst militaries around the world already, today. This is going to be very destabilizing. It’s going to upset the current world order when people get their hands on these sorts of technologies. It’s actually stupid AI that they’re going to be fielding in this arms race to begin with, and that’s actually quite worrying: these are technologies that aren’t going to be able to distinguish between combatants and civilians, aren’t able to act in accordance with international humanitarian law, and will be used by despots and terrorists and hacked to behave in ways that are completely undesirable. And that’s something that’s happening today.”

When asked for his take on this Principle, University of Montreal professor Yoshua Bengio pointed out that he had signed the autonomous weapons open letter, which basically “says it all” about his concerns regarding a potential AI arms race.

Details and Definitions

In addition to worrying about the risks of a race, Dragan also expressed a concern over “what to do about it and how to avoid it.”

“I assume international treaties would have to occur here,” she said.

Dragan’s not the only one expecting international treaties. The U.N. recently agreed to begin formal discussions that will likely lead to negotiations on an autonomous weapons ban or restrictions. However, as with so many things, the devil will be in the details.

In reference to an AI arms race, Cornell professor Bart Selman stated, “It should be avoided.” But he also added, “There’s a difference between it ‘should’ be avoided and ‘can’ it be avoided; that may be a much harder question.”

Selman would like to see “the same kinds of discussions as there were around atomic weapons or biological weapons, where people actually start to look at the tradeoffs and the risks of an arms race.”

“That discussion has to be had,” he said, “and it may actually bring people together in a positive way. Countries could get together and say this is not a good development and we should limit it and avoid it. So to bring it out as a principle, I think the main value there is that we need to have the discussion as a society and with other countries.”

Dan Weld, a professor at the University of Washington, also worries that simply saying an arms race should be avoided is insufficient.

“I fervently hope we don’t see an arms race in lethal autonomous weapons,” Weld explained. “That said, this principle bothered me, because it doesn’t seem to have any operational form. Specifically, an arms race is a dynamic phenomenon that happens when you’ve got multiple agents interacting. It takes two people to race. So whose fault is it if there is a race? I’m worried that both participants will point a finger at the other and say, ‘Hey, I’m not racing! Let’s not have a race, but I’m going to make my weapons more accurate and we can avoid a race if you just relax.’ So what force does the principle have?”

General Consensus

Though preventing an AI arms race may be tricky, there seems to be general consensus that a race would be bad and should be avoided.

“Weaponized AI is a weapon of mass destruction and an AI arms race is likely to lead to an existential catastrophe for humanity,” said Roman Yampolskiy, a professor at the University of Louisville.

Kay Firth-Butterfield, the Executive Director of AI-Austin.org, explained, “Any arms race should be avoided but particularly this one where the stakes are so high and the possibility of such weaponry, if developed, being used within domestic policing is so terrifying.”

But Stanford professor Stefano Ermon may have summed it up best when he said, “Even just with the capabilities we have today it’s not hard to imagine how [AI] could be used in very harmful ways. I don’t want my contributions to the field and any kind of techniques that we’re all developing to do harm to other humans or to develop weapons or to start wars or to be even more deadly than what we already have.”

What do you think?

Is an AI arms race inevitable? How can it be prevented? Can we keep autonomous weapons out of the hands of dictators and terrorists? How can companies and governments work together to build beneficial AI without allowing the technology to be used to create what could be the deadliest weapons the world has ever seen?

This article is part of a weekly series on the 23 Asilomar AI Principles.

The Principles offer a framework to help artificial intelligence benefit as many people as possible. But, as AI expert Toby Walsh said of the Principles, “Of course, it’s just a start. … a work in progress.” The Principles represent the beginning of a conversation, and now we need to follow up with broad discussion about each individual principle. You can read the weekly discussions about previous principles here.

*The AI Arms Race Principle specifically addresses lethal autonomous weapons. Later in the series, we’ll discuss the Race Avoidance Principle, which will look at the risks of companies racing to create AI technology.
