Elon Musk calls for artificial intelligence regulation
Elon Musk has called for regulation in the artificial intelligence (AI) industry before "it's too late".

The Tesla CEO thinks the new technology presents a "fundamental risk to the existence of human civilisation".

Speaking at the National Governors Association, he said: "I have access to the very most cutting edge AI, and I think people should be really concerned about it ... [AI is] the greatest risk we face as a civilisation.

"AI's a rare case where we need to be proactive in regulation, instead of reactive. Because by the time we are reactive with AI regulation, it's too late.

"AI is a fundamental risk to the existence of human civilisation, in a way that car accidents, airplane crashes, faulty drugs, or bad food were not. I'm against overregulation for sure. But man, I think we've got to get on that with AI, pronto."

And Musk worries there will come a time when robots can do "everything".

He added: "When I say everything, the robots will do everything, bar nothing ... [They] could start a war by doing fake news and spoofing email accounts and fake press releases, and just by manipulating information. The pen is mightier than the sword."

However, Oren Etzioni, a computer science professor at the University of Washington and CEO of the Allen Institute for Artificial Intelligence, claims Musk is "obsessed" with AI and is distracting from the real concerns.

He said: "Elon Musk's obsession with AI as an existential threat for humanity is a distraction from the real concern about AI's impact on jobs and weapons systems.

"What the public needs is good information about the actual consequences of AI, both positive and negative. We have to distinguish between science and science fiction."