Edited by @Jeby.
According to Elon Musk, the unchecked development of AI could pose serious risks to humanity, potentially worse than those posed by nuclear weapons. He made the remarks at the South by Southwest (SXSW) conference in Austin, and it is not the first time he has spoken on the subject.
“I am not normally an advocate of regulation and oversight — I think one should generally err on the side of minimizing those things — but this is a case where you have a very serious danger to the public,” said Musk.
“It needs to be a public body that has insight and then oversight to confirm that everyone is developing AI safely. This is extremely important. I think the danger of AI is much greater than the danger of nuclear warheads by a lot and nobody would suggest that we allow anyone to build nuclear warheads if they want. That would be insane,” he said at SXSW.
“And mark my words, AI is far more dangerous than nukes. Far. So why do we have no regulatory oversight? This is insane.”