GLOBAL PRIORITIES INSTITUTE

Foundational academic research on how to do the most good.

UNIVERSITY OF OXFORD

The Global Priorities Institute is an interdisciplinary research centre at the University of Oxford.

Our aim is to conduct foundational research that informs the decision-making of individuals and institutions seeking to do as much good as possible. We use the tools of multiple academic disciplines, especially philosophy, economics and psychology, to explore the issues at stake.

We prioritise projects whose contributions are unlikely to be made in the normal run of academic research, and that speak directly to the most crucial considerations such decision-makers must confront.

How cost-effective are efforts to detect near-Earth objects? – Toby Newberry (Future of Humanity Institute, University of Oxford)

Near-Earth objects (NEOs) include asteroids and comets whose orbits bring them into close proximity with Earth. NEOs are well known to have impacted Earth in the past, sometimes to catastrophic effect…


Do not go gentle: why the Asymmetry does not support anti-natalism – Andreas Mogensen (Global Priorities Institute, University of Oxford)

According to the Asymmetry, adding lives that are not worth living to the population makes the outcome pro tanto worse, but adding lives that are well worth living to the population does not make the outcome pro tanto better. It has been argued that the Asymmetry entails the desirability of human extinction. However, this argument rests on a misunderstanding of the kind of neutrality attributed to the addition of lives worth living by the Asymmetry. A similar misunderstanding is shown to underlie Benatar’s case for anti-natalism.


Existential risks from a Thomist Christian perspective – Stefan Riedener (University of Zurich)

Let’s say with Nick Bostrom that an ‘existential risk’ (or ‘x-risk’) is a risk that ‘threatens the premature extinction of Earth-originating intelligent life or the permanent and drastic destruction of its potential for desirable future development’ (2013, 15). There are a number of such risks: nuclear wars, developments in biotechnology or artificial intelligence, climate change, pandemics, supervolcanoes, asteroids, and so on (see e.g. Bostrom and Ćirković 2008). …
