BuzzFeed News


Elon Musk, Stephen Hawking Want To Save Us From The Terminators

No fate but what you make: Leading artificial intelligence researchers plead for a ban on autonomous weapons.

Posted on July 27, 2015, at 2:16 p.m. ET


First came gunpowder. Then, nuclear bombs. The next evolution of warfare, experts fear, will be defined by robots, or more precisely, by the absence of human decision-making.

In an open letter presented Monday at the International Joint Conference on Artificial Intelligence in Buenos Aires, a who's who of robotics and AI researchers calls for a ban on autonomous weapons systems, arguing that the current trajectory of weapons science and recent legal arguments used by the United States to justify novel forms of extrajudicial killing may herald an abysmal and dystopian future.

"If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable," states the letter, published by the Future of Life Institute and co-signed by SpaceX and Tesla founder Elon Musk, theoretical physicist Stephen Hawking, and MIT professor Noam Chomsky, as well as Steve Wozniak, the co-founder of Apple, and Demis Hassabis of Google DeepMind.

Musk has previously described AI as "potentially more dangerous than nukes," worrying that by unleashing the power of robots, human beings will act as a "biological boot loader for digital super-intelligence." These comments echo those of Hawking, who told the BBC last year that "artificial intelligence could spell the end of the human race."

The letter states that unlike nuclear arms, which require enormous financial investment and rare raw materials to develop, automated weapons are relatively cheap, lending themselves to widespread manufacture. The experts argue that these weapons may become so ubiquitous that it's only a matter of time before they are sold on the black market, offering up robot killers to dictators, warlords, and terrorists.

"Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group. We therefore believe that a military AI arms race would not be beneficial for humanity," the letter explains.

While automated weapons might reduce human casualties for the aggressor, proponents argue, they also lower the threshold for waging war. What's more, such weapons could also negatively affect future AI research, the letter's signatories say.

"Just as most chemists and biologists have no interest in building chemical or biological weapons, most AI researchers have no interest in building AI weapons — and do not want others to tarnish their field by doing so, potentially creating a major public backlash against AI that curtails its future societal benefits."
