Existential and global catastrophic risks
Why work on reducing existential and catastrophic risks?
Existential and global catastrophic risk research focuses on understanding and reducing the likelihood of events that could damage well-being on a global scale, potentially even causing the extinction of humanity or a drastic and permanent curtailment of its potential.
Prioritising the reduction of these risks is sometimes informed by an ethical stance known as longtermism. Longtermism holds that because the likely number of our descendants, and the potential for their flourishing in the aeons to come, is so astronomically large, we must strongly prioritise not squandering that future.
Until recently, existential threats have mainly come from the natural world – supervolcanoes, asteroids, or naturally occurring pandemics. These are still comparatively impactful areas to research.
Currently, the greatest threats are likely ones we might bring about ourselves – ‘anthropogenic’ risks. Nuclear war and an ensuing nuclear winter, or particularly severe climate change scenarios, are some of the most salient examples, and excellent areas for research. However, it seems the very greatest threats come from engineered pandemics and the possibility of developing advanced artificial intelligence that is misaligned with human values. Given the huge commercial incentives driving the development of the relevant technologies, the impossibility of perfect global regulation, and their potential necessity in solving other major problems, hoping simply to halt all progress in these fields seems unrealistic. We need our best minds at the table shaping their trajectory, which as it stands is a pretty grim one. In Toby Ord’s 2020 analysis, these two risks dominate the aggregate risk of an existential or catastrophic event in the next century, which he puts at a sobering 1 in 6.
So some of our greatest threats come from nascent and quickly developing technologies about which there is still much to learn – clearly, this places huge importance on thoughtful research. And this may in fact be the most significant time to be doing it: because these technologies are new, there are many technical questions still to resolve, and the social norms and policy surrounding them remain early-stage and malleable.
Contributors: This introduction was last updated 21/01/2023. Thanks to Will Fenning for originally writing this introduction.