Future Suffering and the Non-Identity Problem
Theron Pummer (University of St Andrews)
GPI Working Paper No. 17-2024
I present and explore a new version of the Person-Affecting View, according to which reasons to do an act depend wholly on what would be said for or against that act from the points of view of particular individuals. According to my view, (i) there is a morally requiring reason not to bring about lives insofar as they contain suffering (negative welfare), (ii) there is no morally requiring reason to bring about lives insofar as they contain happiness (positive welfare), but (iii) there is a permitting reason to bring about lives insofar as they contain happiness. I show how my view solves the non-identity problem while retaining the procreation asymmetry and avoiding implausible forms of antinatalism. We can be morally required to ensure that the quality of life of future people is higher rather than lower, when doing so involves bringing about lives worth living that would contain less suffering rather than bringing about different lives worth living that would contain more suffering.
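As a schematic gloss on clauses (i)–(iii), and in notation of my own rather than the paper's, the view can be rendered as follows, where a is an act, i an individual whose existence a would bring about, s_i(a) ≥ 0 and h_i(a) ≥ 0 the suffering and happiness that i's life would contain, and f and g increasing reason-strength functions:

% Schematic formalization of clauses (i)-(iii); the notation is
% illustrative and not drawn from the paper itself.
\begin{align*}
  &\text{(i)}   && \text{a requiring reason against } a \text{ of strength } f\bigl(s_i(a)\bigr),\\
  &\text{(ii)}  && \text{no requiring reason for } a \text{, however large } h_i(a) \text{ is},\\
  &\text{(iii)} && \text{a permitting reason for } a \text{ of strength } g\bigl(h_i(a)\bigr).
\end{align*}
% Non-identity case: if acts a and b would bring about different lives
% worth living with s_i(a) < s_j(b), the requiring reason against b is
% stronger than the one against a, so doing a rather than b can be
% required even though both lives would be worth living.

On this sketch, clause (i) alone drives the non-identity verdict, while clause (iii) is what leaves bringing about lives worth living permissible, in line with the abstract's claim to avoid implausible forms of antinatalism.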
Theron Pummer gave the Parfit Memorial Lecture 2024, Future Suffering and the Non-Identity Problem, on 12 June 2024.
Other working papers
The Shutdown Problem: An AI Engineering Puzzle for Decision Theorists – Elliott Thornley (Global Priorities Institute, University of Oxford)
I explain and motivate the shutdown problem: the problem of designing artificial agents that (1) shut down when a shutdown button is pressed, (2) don’t try to prevent or cause the pressing of the shutdown button, and (3) otherwise pursue goals competently. I prove three theorems that make the difficulty precise. These theorems suggest that agents satisfying some innocuous-seeming conditions will often try to prevent or cause the pressing of the shutdown button, even in cases where it’s costly to do so. I end by noting that…
Existential risks from a Thomist Christian perspective – Stefan Riedener (University of Zurich)
Let’s say with Nick Bostrom that an ‘existential risk’ (or ‘x-risk’) is a risk that ‘threatens the premature extinction of Earth-originating intelligent life or the permanent and drastic destruction of its potential for desirable future development’ (2013, 15). There are a number of such risks: nuclear wars, developments in biotechnology or artificial intelligence, climate change, pandemics, supervolcanoes, asteroids, and so on (see e.g. Bostrom and Ćirković 2008). …
The asymmetry, uncertainty, and the long term – Teruji Thomas (Global Priorities Institute, University of Oxford)
The Asymmetry is the view in population ethics that, while we ought to avoid creating additional bad lives, there is no requirement to create additional good ones. The question is how to embed this view in a complete normative theory, and in particular one that treats uncertainty in a plausible way. After reviewing…