The epistemic challenge to longtermism
Christian Tarsney (Global Priorities Institute, University of Oxford)
GPI Working Paper No. 3-2022, published in Synthese
Longtermists claim that what we ought to do is mainly determined by how our actions might affect the very long-run future. A natural objection to longtermism is that these effects may be nearly impossible to predict – perhaps so close to impossible that, despite the astronomical importance of the far future, the expected value of our present actions is mainly determined by near-term considerations. This paper aims to precisify and evaluate one version of this epistemic objection to longtermism. To that end, I develop two simple models for comparing ‘longtermist’ and ‘neartermist’ interventions, incorporating the idea that it is harder to make a predictable difference to the further future. These models yield mixed conclusions: if we simply aim to maximize expected value, and don’t mind premising our choices on minuscule probabilities of astronomical payoffs, the case for longtermism looks robust. But on some prima facie plausible empirical worldviews, the expectational superiority of longtermist interventions depends heavily on these ‘Pascalian’ probabilities. So the case for longtermism may depend either on plausible but non-obvious empirical claims or on a tolerance for Pascalian fanaticism.
Other working papers
Staking our future: deontic long-termism and the non-identity problem – Andreas Mogensen (Global Priorities Institute, University of Oxford)
Greaves and MacAskill argue for axiological longtermism, according to which, in a wide class of decision contexts, the option that is ex ante best is the option that corresponds to the best lottery over histories from t onwards, where t is some date far in the future. They suggest that a stakes-sensitivity argument…
Prediction: The long and the short of it – Antony Millner (University of California, Santa Barbara) and Daniel Heyen (ETH Zurich)
Commentators often lament forecasters’ inability to provide precise predictions of the long-run behaviour of complex economic and physical systems. Yet their concerns often conflate the presence of substantial long-run uncertainty with the need for long-run predictability; short-run predictions can partially substitute for long-run predictions if decision-makers can adjust their activities over time. …
Dispelling the Anthropic Shadow – Teruji Thomas (Global Priorities Institute, University of Oxford)
There are some possible events that we could not possibly discover in our past. We could not discover an omnicidal catastrophe, an event so destructive that it permanently wiped out life on Earth. Had such a catastrophe occurred, we wouldn’t be here to find out. This space of unobservable histories has been called the anthropic shadow. Several authors claim that the anthropic shadow leads to an ‘observation selection bias’, analogous to survivorship bias, when we use the historical record to estimate catastrophic risks. …