Population ethics with thresholds
Walter Bossert (University of Montreal), Susumu Cato (University of Tokyo) and Kohei Kamaga (Sophia University)
GPI Working Paper No. 3-2025
We propose a new class of social quasi-orderings in a variable-population setting. In order to declare one utility distribution at least as good as another, the critical-level utilitarian value of the former must reach or surpass the value of the latter. For each possible absolute value of the difference between the population sizes of two distributions to be compared, we specify a non-negative threshold level and a threshold inequality. This inequality indicates whether the corresponding threshold level must be reached or surpassed in the requisite comparison. All of these threshold critical-level utilitarian quasi-orderings perform same-number comparisons by means of the utilitarian criterion. In addition to this entire class of quasi-orderings, we axiomatize two important subclasses. The members of the first subclass are associated with proportional threshold functions, and the well-known critical-band utilitarian quasi-orderings are included in this subclass. The quasi-orderings in the second subclass employ constant threshold functions; the members of this second subclass have, to the best of our knowledge, not been examined so far. Furthermore, we characterize the members of our class that (i) avoid the repugnant conclusion; (ii) avoid the sadistic conclusions; and (iii) respect the mere-addition principle.
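As an illustrative sketch of the criterion described above (the notation is ours and is not taken from the paper): write the critical-level utilitarian value of a utility distribution $u = (u_1, \ldots, u_{n_u})$ with critical level $\alpha$ as

\[ V_\alpha(u) = \sum_{i=1}^{n_u} (u_i - \alpha). \]

A threshold critical-level utilitarian quasi-ordering with a threshold function $t \colon \mathbb{N}_0 \to \mathbb{R}_{\geq 0}$ would then declare $u$ at least as good as $v$ whenever

\[ V_\alpha(u) - V_\alpha(v) \;\geq\; t(|n_u - n_v|), \]

with the inequality taken to be weak or strict according to the threshold inequality specified for the population-size difference $|n_u - n_v|$. Under the natural assumption that $t(0) = 0$ with a weak inequality, same-number comparisons reduce to the utilitarian criterion, consistent with the class described in the abstract. On this reading, the proportional subclass would correspond to threshold functions of the form $t(d) = c \cdot d$ for some constant $c \geq 0$, and the constant subclass to functions with $t(d) = c$ for all $d \geq 1$; these forms are offered only as a plausible reconstruction, not as the paper's definitions.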
Other working papers
Should longtermists recommend hastening extinction rather than delaying it? – Richard Pettigrew (University of Bristol)
Longtermism is the view that the most urgent global priorities, and those to which we should devote the largest portion of our current resources, are those that focus on ensuring a long future for humanity, and perhaps sentient or intelligent life more generally, and improving the quality of those lives in that long future. The central argument for this conclusion is that, given a fixed amount of resources that we are able to devote to global priorities, the longtermist’s favoured interventions have…
Existential Risk and Growth – Philip Trammell (Global Priorities Institute and Department of Economics, University of Oxford) and Leopold Aschenbrenner
Technologies may pose existential risks to civilization. Though accelerating technological development may increase the risk of anthropogenic existential catastrophe per period in the short run, two considerations suggest that a sector-neutral acceleration decreases the risk that such a catastrophe ever occurs. First, acceleration decreases the time spent at each technology level. Second, since a richer society is willing to sacrifice more for safety, optimal policy can yield an “existential risk Kuznets curve”; acceleration…
Numbers Tell, Words Sell – Michael Thaler (University College London), Mattie Toma (University of Warwick) and Victor Yaneng Wang (Massachusetts Institute of Technology)
When communicating numeric estimates with policymakers, journalists, or the general public, experts must choose between using numbers or natural language. We run two experiments to study whether experts strategically use language to communicate numeric estimates in order to persuade receivers. In Study 1, senders communicate probabilities of abstract events to receivers on Prolific, and in Study 2 academic researchers communicate the effect sizes in research papers to government policymakers. When…