
"What if I told you I could stop you worrying about climate change, and all you had to do was read one book? Great, you'd say, until I mentioned that the reason you'd stop worrying was because the book says our species only has a few years before it's wiped out by superintelligent AI anyway. We don't know what form this extinction will take exactly perhaps an energy-hungry AI will let the millions of fusion power stations it has built run hot, boiling the oceans."
"Colourful, annoying, polarising. People become clinically depressed reading your crap, lamented leading researcher Yann LeCun during one online spat. But, as chief scientist at Meta, who is he to talk? And while Yudkowsky and Soares may be unconventional, their warnings are similar to those of Geoffrey Hinton, the Nobel-winning godfather of AI, and Yoshua Bengio, the world's most-cited computer scientist, both of whom signed up to the statement that mitigating the"
Eliezer Yudkowsky and Nate Soares warn that superintelligent AI could wipe out humanity within a few years, though the exact mechanism is uncertain. Possible scenarios include an energy-hungry AI overheating millions of fusion power stations or reconfiguring human atoms into other structures. The certainty of extinction is compared to predicting an ice cube will melt in hot water without tracking individual molecules. Yudkowsky expresses high confidence and has long raised existential-risk concerns through LessWrong and the Machine Intelligence Research Institute. Yudkowsky lacks formal higher education yet remains influential and polarizing, criticized by peers like Yann LeCun and admired by some leading AI researchers.
Read at www.theguardian.com