EP03: X Risks

Humanity could have an extremely long future ahead of it, potentially stretching billions of years and spreading out across the universe. All of those planets and stars and all that energy out there could be used for amazing projects. We might digitize human consciousness and upload ourselves onto servers where we can simulate paradise. We might end scarcity, so that every person alive has everything they could ever want. It’s basically beyond our ken to imagine what future humans will come up with, but we can imagine it will be pretty great to be alive in the eons ahead. The thing is, those humans to come are depending on us for that future. Those of us alive today are entering what may be the most dangerous period in the entire span – past or future – of the human race. We are beginning to face existential risks – threats to the very existence of our species, threats big enough to actually drive humanity to extinction. And if we go extinct, not only do we die, but that whole bright future and all the quadrillions of humans to come will be lost forever too. (Original score by Point Lobo www.pointlobo.com.)

Interviewees:

  • Nick Bostrom, Oxford University philosopher and founder of the Future of Humanity Institute
  • David Pearce, philosopher and co-founder of the World Transhumanist Association (Humanity+)
  • Robin Hanson, George Mason University economist (creator of the Great Filter hypothesis)
  • Toby Ord, Oxford University philosopher
  • Sebastian Farquhar, Oxford University philosopher