In defence of fanaticism
Hayden Wilkinson (Australian National University)
GPI Working Paper No. 4-2020, published in Ethics
Which is better: a guarantee of a modest amount of moral value, or a tiny probability of arbitrarily large value? To prefer the latter seems fanatical. But, as I argue, avoiding such fanaticism brings severe problems. To do so, we must (1) decline intuitively attractive trade-offs; (2) rank structurally identical pairs of lotteries inconsistently, or else admit absurd sensitivity to tiny probability differences; (3) have rankings depend on remote, unaffected events (including events in ancient Egypt); and often (4) neglect to rank lotteries as we already know we would if we learned more. Compared to these implications, fanaticism is highly plausible.
Other working papers
Egyptology and Fanaticism – Hayden Wilkinson (Global Priorities Institute, University of Oxford)
Various decision theories share a troubling implication. They imply that, for any finite amount of value, it would be better to wager it all for a vanishingly small probability of some greater value. Counterintuitive as it might be, this fanaticism has seemingly compelling independent arguments in its favour. In this paper, I consider perhaps the most prima facie compelling of these arguments: an Egyptology argument (an analogue of the Egyptology argument from population ethics). …
Longtermism, aggregation, and catastrophic risk – Emma J. Curran (University of Cambridge)
Advocates of longtermism point out that interventions which focus on improving the prospects of people in the very far future will, in expectation, bring about a significant amount of good. Indeed, in expectation, such long-term interventions bring about far more good than their short-term counterparts. As such, longtermists claim, we have compelling moral reason to prefer long-term interventions. …
Existential risk and growth – Leopold Aschenbrenner (Columbia University)
Human activity can create or mitigate risks of catastrophes, such as nuclear war, climate change, pandemics, or artificial intelligence run amok. These could even imperil the survival of human civilization. What is the relationship between economic growth and such existential risks? In a model of directed technical change, with moderate parameters, existential risk follows a Kuznets-style inverted U-shape. …