Moral uncertainty and public justification
Jacob Barrett (Global Priorities Institute, University of Oxford) and Andreas T Schmidt (University of Groningen)
GPI Working Paper No. 15-2021, forthcoming at Philosophers' Imprint
Moral uncertainty and disagreement pervade our lives. Yet we still need to make decisions and act, in both individual and political contexts. So, what should we do? The moral uncertainty approach provides a theory of what individuals morally ought to do when they are uncertain about morality. Public reason liberals, in contrast, provide a theory of how societies should deal with reasonable disagreements about morality. They defend the public justification principle: state action is permissible only if it can be justified to all reasonable people. In this article, we bring these two approaches together. Specifically, we investigate whether the moral uncertainty approach supports public reason liberalism: given our own moral uncertainty, should we favor public justification? We argue that while the moral uncertainty approach cannot vindicate an exceptionless public justification principle, it does give us reason to adopt public justification as a pro tanto institutional commitment. Furthermore, it provides new answers to some intramural debates among public reason liberals and new responses to some common objections to public reason liberalism.