Doomsday rings twice
Andreas Mogensen (Global Priorities Institute, Oxford University)
GPI Working Paper No. 1-2019
This paper considers the argument that, because we should regard it as a priori very unlikely that we are among the most important people who will ever exist, we should increase our confidence that the human species will not persist beyond the current historical era, which seems to represent a crucial juncture in human history and perhaps even in the history of life on Earth. The argument is a descendant of the Carter-Leslie Doomsday Argument, but I show that it does not inherit the crucial flaw of its immediate ancestor. Nonetheless, we are not forced to follow the argument where it leads if we instead significantly decrease our confidence that we can affect the long-run future of humanity.
Other working papers
How should risk and ambiguity affect our charitable giving? – Lara Buchak (Princeton University)
Suppose we want to do the most good we can with a particular sum of money, but we cannot be certain of the consequences of the different ways of making use of it. This paper explores how our attitudes towards risk and ambiguity bear on what we should do. It shows that risk-avoidance and ambiguity-aversion can each provide good reason to divide our money between various charitable organizations rather than to give it all to the most promising one…
Are we living at the hinge of history? – William MacAskill (Global Priorities Institute, Oxford University)
In the final pages of On What Matters, Volume II, Derek Parfit comments: ‘We live during the hinge of history… If we act wisely in the next few centuries, humanity will survive its most dangerous and decisive period… What now matters most is that we avoid ending human history.’ This passage echoes Parfit’s comment, in Reasons and Persons, that ‘the next few centuries will be the most important in human history’. …
Crying wolf: Warning about societal risks can be reputationally risky – Lucius Caviola (Global Priorities Institute, University of Oxford) et al.
Society relies on expert warnings about large-scale risks like pandemics and natural disasters. Across ten studies (N = 5,342), we demonstrate people’s reluctance to warn about unlikely but large-scale risks because they are concerned about being blamed for being wrong. In particular, warners anticipate that if the risk does not occur, they will be perceived as overly alarmist and held responsible for wasting societal resources. This phenomenon appears in the context of natural, technological, and financial risks…