
Superintelligent AI Could Wipe Out Humanity, If We're Not Ready for It

“Success in creating AI would be the biggest event in human history,” wrote Hawking. “Unfortunately, it might also be the last, unless we learn how to avoid the risks.”

Whatever specific task an intelligent machine is set to do, it’s reasonable to expect that it could accomplish that task more effectively by protecting itself from real-world threats. Hence Dewey’s invocation of “extinction by side-effect”: a super-intelligent AI may drive humanity to extinction not because it is smarter and better fit to rule, but simply as a consequence of carrying out its basic functions.

“A super-intelligent AI—if it turns its power to gathering resources or protecting itself—would have an immense impact on the world,” he said. “It could co-opt our existing infrastructure, or could invent techniques and technologies we don't yet know how to make, like general-purpose nanotechnology. It could eventually take over resources that we depend on to stay alive, or it could consider us enough of a danger to its task completion that it decides the best course is to remove us from the picture. Either one of those scenarios could result in human extinction.”
