
""Hunger Strike: Day 15,""
""Trying to build AGI - human-level, or beyond, systems, superintelligence - this is the goal of all these frontier companies,""
""And I think it's insane. It's risky. Incredibly risky. And I think it should stop now.""
""My chance that something goes quite catastrophically wrong on the scale of human civilization might be somewhere between 10 and 25 percent,""
Guido Reichstadter began a hunger strike on August 31 and has stood outside Anthropic’s San Francisco headquarters daily from about 11AM to 5PM with a chalkboard sign calling on Anthropic to stop the race to artificial general intelligence (AGI) and to cease building human-level or superintelligent systems. He describes AGI development as incredibly risky and demands that it stop now, citing Anthropic CEO Dario Amodei’s own estimate that the chance of a civilization-scale catastrophe could be between 10 and 25 percent. Reichstadter rejects as self-serving the industry’s claim that AGI is inevitable and will be developed responsibly; other activists are protesting as well.
Read at The Verge