A paradox for tiny probabilities and enormous values

Nick Beckstead (Open Philanthropy Project) and Teruji Thomas (Global Priorities Institute, Oxford University)

GPI Working Paper No. 7-2021, published in Noûs

We show that every theory of the value of uncertain prospects must have one of three unpalatable properties. Reckless theories recommend risking arbitrarily great gains at arbitrarily long odds for the sake of enormous potential; timid theories permit passing up arbitrarily great gains to prevent a tiny increase in risk; non-transitive theories deny the principle that, if A is better than B and B is better than C, then A must be better than C. While non-transitivity has been much discussed, we draw out the costs and benefits of recklessness and timidity when it comes to axiology, decision theory, and moral uncertainty.
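The trilemma can be put a little more formally. The following is a rough LaTeX sketch under illustrative notation, not the paper's own definitions: here $\succ$ abbreviates "is a better prospect than", $c$ is a guaranteed payoff, and $[v,p]$ stands for a gamble yielding value $v$ with probability $p$ and nothing otherwise.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Rough sketch only; notation ($\succ$, $c$, $[v,p]$) is illustrative,
% not the paper's official formulation.
\noindent\textbf{Recklessness} (roughly): for any sure payoff $c$ and any
probability $p>0$, some payoff $v$ is great enough that the long shot is better:
\[ \forall c \;\; \forall p>0 \;\; \exists v : \; [v,p] \succ c. \]
\textbf{Timidity} (roughly): for some sure payoff $c$ and some $p>0$,
no payoff $v$, however great, makes the long shot better:
\[ \exists c \;\; \exists p>0 \;\; \forall v : \; [v,p] \nsucc c. \]
\textbf{Transitivity}: if $A \succ B$ and $B \succ C$, then $A \succ C$.
\end{document}

On this rough rendering, timidity is simply the denial of recklessness, so any theory ranking prospects must exhibit one of the two; the paper's result is that avoiding both comes at the cost of transitivity.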

Other working papers

Doomsday and objective chance – Teruji Thomas (Global Priorities Institute, Oxford University)

Lewis’s Principal Principle says that one should usually align one’s credences with the known chances. In this paper I develop a version of the Principal Principle that deals well with some exceptional cases related to the distinction between metaphysical and epistemic modality. I explain how this principle gives a unified account of the Sleeping Beauty problem and chance-based principles of anthropic reasoning…

What power-seeking theorems do not show – David Thorstad (Vanderbilt University)

Recent years have seen increasing concern that artificial intelligence may soon pose an existential risk to humanity. One leading ground for concern is that artificial agents may be power-seeking, aiming to acquire power and in the process disempowering humanity. A range of power-seeking theorems seek to give formal articulation to the idea that artificial agents are likely to be power-seeking. I argue that leading theorems face five challenges, then draw lessons from this result.

Is Existential Risk Mitigation Uniquely Cost-Effective? Not in Standard Population Models – Gustav Alexandrie (Global Priorities Institute, University of Oxford) and Maya Eden (Brandeis University)

What socially beneficial causes should philanthropists prioritize if they give equal ethical weight to the welfare of current and future generations? Many have argued that, because human extinction would result in a permanent loss of all future generations, extinction risk mitigation should be the top priority given this impartial stance. Using standard models of population dynamics, we challenge this conclusion. We first introduce a theoretical framework for quantifying undiscounted cost-effectiveness over…