A.I. or Nuclear Weapons: Can You Tell These Quotes Apart?

The comparison seems to be everywhere these days. “It’s like nuclear weapons,” a pioneering artificial intelligence researcher has said. Top A.I. executives have likened their product to nuclear energy. And a group of industry leaders warned last week that A.I. technology could pose an existential threat to humanity, on par with nuclear war.

People have been analogizing A.I. advances to splitting the atom for years. But the comparison has become more stark amid the release of A.I. chatbots and A.I. creators’ calls for national and international regulation — much as scientists called for guardrails governing nuclear arms in the 1950s. Some experts worry that A.I. will eliminate jobs or spread disinformation in the short term; others fear hyper-intelligent systems could eventually learn to write their own computer code, slip the bonds of human control and, perhaps, decide to wipe us out.

“The creators of this technology are telling us they are worried,” said Rachel Bronson, the president of the Bulletin of the Atomic Scientists, which tracks man-made threats to civilization. “The creators of this technology are asking for governance and regulation. The creators of this technology are telling us we need to pay attention.”

Not every expert thinks the comparison fits. Some note that the destructiveness of atomic energy is kinetic and demonstrated, whereas A.I.’s danger to humanity remains highly speculative. Others argue that almost every technology, including A.I. and nuclear energy, has upsides and risks. “Tell me a technology that cannot be used for something evil, and I’ll tell you a completely useless technology that cannot be used for anything,” said Julian Togelius, a computer scientist at N.Y.U. who works with A.I.

But the comparisons have become fast and frequent enough that it can be hard to know whether doomsayers and defenders alike are talking about A.I. or nuclear technology. Take the quiz below to see if you can tell the difference.

1 of 12

“____ is the future not only of Russia but of all of mankind. There are huge opportunities, but also threats that are difficult to foresee today. Whoever becomes the leader in this sphere will become the ruler of the world.”

2 of 12

“Certainly, measures of control can be devised which will eliminate the fear of sudden, total devastation which is what we’re headed for now.”

3 of 12

“My worst fears are that we cause significant — we, the field, the technology, the industry — cause significant harm to the world.”

4 of 12

“It’s kind of our Promethean moment.”

5 of 12

“Let us realize that all we can gain in wealth, economic security or improved health will be useless if our nation is to live in continuous dread of sudden annihilation.”

6 of 12

“We have placed our civilization and our species in jeopardy. Fortunately, it is not yet too late. We can safeguard the planetary civilization and the human family if we so choose. There is no more important or more urgent issue.”

7 of 12

“If we go ahead on this, everyone will die, including children who did not choose this and did not do anything wrong.”

8 of 12

“If any major military power pushes ahead with development, a global arms race is virtually inevitable.”

9 of 12

“Nothing is riskless, but when one weighs the risks overall, the advantages of ____ exceed the risks.”

10 of 12

“We are drifting toward a catastrophe beyond comparison.”

11 of 12

“The rise of ____ will be either the best or the worst thing ever to happen to humanity. We do not yet know which.”

12 of 12

“I console myself with the normal excuse: If I hadn’t done it, somebody else would have.”

The above quotes are only a slice of the responses to — and the debate over — A.I. and nuclear technology. They capture parallels, but also some notable differences: the fear of imminent, fiery destruction is particular to atomic weapons, and today’s advances in A.I. are mostly the work of private companies rather than governments.

But in both cases, some of the same people who brought the technology into the world are sounding the alarm the loudest. “It’s about managing the risks of science’s advancement,” Ms. Bronson, the Bulletin of the Atomic Scientists president, said of A.I. “This is a huge scientific advancement that requires attention, and there’s so many lessons to learn from the nuclear space on that. And you don’t have to equate them to learn from them.”
