Which would you prefer: death by supercomputer, or nuclear annihilation? While there isn't yet a historical precedent for a genocidal AI, tech billionaire Elon Musk is suggesting that sophisticated artificial intelligences may eventually be just as dangerous as atomic weaponry.
Musk's fears seem to stem from a reading of Superintelligence: Paths, Dangers, Strategies by Nick Bostrom. "Worth reading Superintelligence by Bostrom," tweeted Musk. "We need to be super careful with AI. Potentially more dangerous than nukes."
Of course, as The Terminator franchise has taught us, fear of a marauding AI and fear of a nuclear holocaust need not be mutually exclusive. So, given his influential position within the tech industry, should we expect a more AI-supportive stance from Musk? He does, after all, hold a stake in AI startup Vicarious, so he's not exactly John Connor himself. Or is the Tesla chief justified in his worries? The last thing he needs is HAL going mental on one of his SpaceX flights. [@elonmusk]