#ilya-sutskever

#openai
Artificial intelligence
from Futurism
11 hours ago

OpenAI's Top Scientist Wanted to "Build a Bunker Before We Release AGI"

Ilya Sutskever envisions a world where AGI's emergence triggers extreme protective measures, underscoring serious risks perceived by OpenAI leadership.
Artificial intelligence
from New York Post
3 days ago

OpenAI co-founder wanted to build doomsday bunker to protect company scientists from 'rapture': book

OpenAI co-founder proposed a bunker for researchers amid fears of chaos following the release of advanced AI.
Artificial intelligence
from TechCrunch
1 month ago

OpenAI co-founder Ilya Sutskever's Safe Superintelligence reportedly valued at $32B | TechCrunch

SSI, which is focused on developing safe superintelligence, has raised $2 billion, reaching a reported valuation of $32 billion.