

An existential risk is a risk that is both global (affects all of humanity) and terminal (destroys or irreversibly cripples the target). Nick Bostrom defines an existential risk as a risk "where an adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential."

If an existential catastrophe destroys a civilization, the entire course of history changes. If that civilization was flourishing and spreading wellbeing throughout the universe, then there is tremendous utility in keeping it alive. If it was instead causing suffering to all the sentient beings it came into contact with, then arguably the opposite is true. Which of these better characterizes humanity is an open question. Even more important to utilitarians is which will better characterize the humans of the future, because if there is a catastrophe, it is these future humans who will never be born.

Should we prevent existential risks?

This depends on whether humans and animals have, on balance, a positive future. One concern about preventing existential risk is that post-humans might go on to multiply wild-animal suffering.

Another reason to be concerned about existential risk is the Astronomical Waste argument: an enormous number of good future lives (or experiences) would never come to exist.
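To see why this argument carries so much weight in expected-value terms, here is a minimal sketch in Python. The figures for potential future lives and achievable risk reduction are illustrative placeholders, not numbers taken from this article or from Bostrom's papers; the point is only that multiplying an astronomically large future population by even a tiny probability shift yields an enormous expected value.

    # A minimal sketch of the expected-value reasoning behind the Astronomical
    # Waste argument. All numbers are illustrative placeholders.

    def expected_lives_saved(potential_future_lives: float,
                             risk_reduction: float) -> float:
        """Expected number of future lives preserved by lowering the
        probability of an existential catastrophe by `risk_reduction`."""
        return potential_future_lives * risk_reduction

    # Assumption: ~1e35 potential future lives if civilization spreads
    # through the accessible universe (placeholder order of magnitude).
    POTENTIAL_LIVES = 1e35

    # Even a one-in-a-billion reduction in extinction probability...
    delta = 1e-9

    print(f"{expected_lives_saved(POTENTIAL_LIVES, delta):.2e}")
    # -> 1.00e+26 expected lives, dwarfing the ~1e10 people alive today.

On this accounting, even minute reductions in extinction probability can dominate benefits confined to the present generation, which is the core of the utilitarian case for prioritizing existential risk.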

Taxonomy

The following taxonomy is taken verbatim from Nick Bostrom's "Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards" (2002).

  • Bangs – Earth-originating intelligent life goes extinct in relatively sudden disaster resulting from either an accident or a deliberate act of destruction.
  • Crunches – The potential of humankind to develop into posthumanity is permanently thwarted although human life continues in some form.
  • Shrieks – Some form of posthumanity is attained but it is an extremely narrow band of what is possible and desirable.
  • Whimpers – A posthuman civilization arises but evolves in a direction that leads gradually but irrevocably to either the complete disappearance of the things we value or to a state where those things are realized to only a minuscule degree of what could have been achieved.

Bangs

  • Deliberate misuse of nanotechnology
  • Nuclear holocaust
  • We’re living in a simulation and it gets shut down
  • Badly programmed superintelligence
  • Genetically engineered biological agent
  • Accidental misuse of nanotechnology (“gray goo”)
  • Something unforeseen
  • Physics disasters
  • Naturally occurring disease
  • Asteroid or comet impact
  • Runaway global warming

Crunches

  • Resource depletion or ecological destruction
  • Misguided world government or another static social equilibrium stops technological progress
  • “Dysgenic” pressures
  • Technological arrest
  • Something unforeseen

Shrieks

  • Take-over by a transcending upload
  • Flawed superintelligence
  • Repressive totalitarian global regime
  • Something unforeseen

Whimpers

  • Our potential or even our core values are eroded by evolutionary development
  • Killed by an extraterrestrial civilization
  • Something unforeseen

Further Reading

Existential Risks: The original paper coining the term. It gives an overview of the possible risks, biases that might cause us to underestimate them, and policy implications.

Astronomical Waste: Argues that utilitarians should minimize existential risk, on the grounds that an astronomically large number of happy lives could be sustained in the observable universe.

Videos:
  • Nick Bostrom on existential risk
  • Nick Bostrom's TED talk
  • Martin Rees' TED talk

