If AGI arrives during Trump's next term, 'none of the other stuff matters'
Briefly

The open letter generated tremendous publicity for the case that AGI could spiral out of human control unless safeguards were put in place before they were needed. It was one of many initiatives from the Future of Life Institute designed to cultivate a conversation around AI's risks and the best ways to steer the technology in a responsible direction.
"My goal with spearheading that pause letter was not that I thought that there was going to be a pause. The goal was to mainstream the conversation and make it socially safe for people to voice their concerns. I actually feel that's been a massive success."
Read at Fast Company