Researcher Estimates 99.9 Percent Chance AI Will Destroy Humankind
Briefly

If we create general superintelligences, I don't see a good outcome long term for humanity. The only way to win this game is not to play it.
Superintelligence will come up with something completely new, completely super. We may not even recognize it as a possible path to achieving the goal of ending everyone.
The chances of AI doing just that may not reach 100 percent, but it could get pretty close.
Read at Futurism