Calibration dilemmas in the ethics of distribution

Jacob M. Nebel (University of Southern California) and H. Orri Stefánsson (Stockholm University and Swedish Collegium for Advanced Study)

GPI Working Paper No. 10-2021, published in Economics & Philosophy

This paper was the basis for the Parfit Memorial Lecture 2021.
This paper presents a new kind of problem in the ethics of distribution. The problem takes the form of several “calibration dilemmas,” in which intuitively reasonable aversion to small-stakes inequalities requires leading theories of distribution to recommend intuitively unreasonable aversion to large-stakes inequalities—e.g., inequalities in which half the population would gain an arbitrarily large quantity of well-being or resources. We first lay out a series of such dilemmas for a family of broadly prioritarian theories. We then consider a widely endorsed family of egalitarian views and show that, despite avoiding the dilemmas for prioritarianism, they are subject to even more forceful calibration dilemmas. Next, we show how our results challenge common utilitarian accounts of the badness of inequalities in resources (e.g., wealth inequality). These dilemmas leave us with a few options, all of which we find unpalatable. We conclude by laying out these options and suggesting avenues for further research.
