Highlights

  • Longtermism is a quasi-religious worldview, influenced by transhumanism and utilitarian ethics, which asserts that there could be so many digital people living in vast computer simulations millions or billions of years in the future that one of our most important moral obligations today is to take actions that ensure as many of these digital people come into existence as possible.
  • Bostrom also takes seriously the idea that we already live in a giant computer simulation that could get shut down at any moment (yet another idea that Musk seems to have gotten from Bostrom).
  • This is, of course, straight out of the handbook of eugenics,
  • More recently, Hanson became embroiled in controversy after he seemed to advocate for “sex redistribution” along the lines of “income redistribution,” following a domestic terrorist attack carried out by a self-identified “incel.”
  • One implication of Beckstead’s view is that, to quote him, since “saving lives in poor countries may have significantly smaller ripple effects than saving and improving lives in rich countries, … it now seems more plausible to me that saving a life in a rich country is substantially more important than saving a life in a poor country, other things being equal.”
    • Note: Along the lines of the typical "a rising tide lifts all boats" argument
  • Morality, in this view, is all about crunching the numbers; as the longtermist Eliezer Yudkowsky once put it, “Just shut up and multiply.”
  • it seems to have made little effort to foster diversity or investigate alternative visions of the future that aren’t Baconian, pro-capitalist fever-dreams built on the privileged perspectives of white men in the Global North.
  • more technology and morality is reduced to a computational exercise