Longtermism and its Limits
Briefly

"We are currently facing a constellation of existential risks. Climate change may wipe us out. Or perhaps nuclear war or another global pandemic. But even if we deal with those threats, there is the possibility of unaligned artificial intelligence taking over and deciding that things are better off without us. There is also, unfortunately, the non-zero probability of a giant asteroid striking Earth and causing a mass extinction event."
"Longtermists say that we should direct a significant portion of our attention and resources towards making things go best for future generations. What counts as a significant portion? Longtermists disagree on the details. But according to one variant of the view-call it strong longtermism-there is no morally relevant difference between future people and the living, and since future people vastly outnumber the living, when considering what we ought to do, the interests of future people always trump the interests of the living."
"Strong longtermism has radical implications. Much of our talk about what we owe to each other often focuses on those who are temporally close to us: the living, or perhaps the next generation, or, in rare cases, the next couple of generations. Yet if each person counts for one and only one on the long timeline of human history, then we ought to radically revise our moral and political goals. We ought to spend much more on reducing the risk of human extinction."
Humanity faces multiple existential risks, including climate change, nuclear war, global pandemics, unaligned artificial intelligence, and asteroid impacts. These risks create a significant probability of human extinction if rapid action is not taken. Longtermism recommends allocating substantial attention and resources toward improving outcomes for future generations. Strong longtermism holds that future people have equal moral status to present people and vastly outnumber them, so their interests dominate moral decision-making. Strong longtermism therefore implies radical shifts in moral and political priorities, favoring measures such as reducing extinction risk even when that requires major trade-offs with the welfare of the living.
Read at Apaonline