Tiny probabilities and the value of the far future
Petra Kosonen (Population Wellbeing Initiative, University of Texas at Austin)
GPI Working Paper No. 1-2023
Morally speaking, what matters most is the far future, at least according to Longtermism. The reason the far future is of utmost importance is that our acts' expected influence on the value of the world is mainly determined by their consequences in the far future. The case for Longtermism is straightforward: given the enormous number of people who might exist in the far future, even a tiny probability of affecting how the far future goes outweighs the importance of our acts' near-term consequences. However, there seems to be something wrong with a theory that lets very small probabilities of huge payoffs dictate one's course of action. If, instead, we discount very small probabilities down to zero, we may have a response to Longtermism, provided that its truth depends on tiny probabilities of vast value. Contrary to this, I will argue that discounting small probabilities does not undermine Longtermism.
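As a rough sketch of the expected-value reasoning at issue (the symbols below are illustrative assumptions, not the paper's own notation): suppose a long-term intervention has a tiny probability $p$ of securing a far future containing $N$ people at average value $v$ each, while a near-term alternative yields a sure benefit $u$. Then
\[
\mathbb{E}[\text{long-term}] = p \, N \, v \qquad \text{vs.} \qquad \mathbb{E}[\text{near-term}] = u,
\]
and for a sufficiently large $N$ the long-term act dominates even though $p$ is minuscule: for instance, $p = 10^{-9}$, $N = 10^{16}$, and $v = 1$ give $p N v = 10^{7}$, which exceeds any modest $u$. Discounting small probabilities would instead treat $p$ as zero whenever it falls below some threshold $t$ (say $t = 10^{-6}$), so that $\mathbb{E}[\text{long-term}] = 0$ and the near-term act wins whenever $p < t$. The paper's claim is that Longtermism survives even under such a discounting policy.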
Other working papers
Longtermism, aggregation, and catastrophic risk – Emma J. Curran (University of Cambridge)
Advocates of longtermism point out that interventions which focus on improving the prospects of people in the very far future will, in expectation, bring about a significant amount of good. Indeed, in expectation, such long-term interventions bring about far more good than their short-term counterparts. As such, longtermists claim we have compelling moral reason to prefer long-term interventions. …
Doomsday and objective chance – Teruji Thomas (Global Priorities Institute, University of Oxford)
Lewis’s Principal Principle says that one should usually align one’s credences with the known chances. In this paper I develop a version of the Principal Principle that deals well with some exceptional cases related to the distinction between metaphysical and epistemic modality. I explain how this principle gives a unified account of the Sleeping Beauty problem and chance-based principles of anthropic reasoning…
High risk, low reward: A challenge to the astronomical value of existential risk mitigation – David Thorstad (Global Priorities Institute, University of Oxford)
Many philosophers defend two claims: the astronomical value thesis that it is astronomically important to mitigate existential risks to humanity, and existential risk pessimism, the claim that humanity faces high levels of existential risk. It is natural to think that existential risk pessimism supports the astronomical value thesis. In this paper, I argue that precisely the opposite is true. Across a range of assumptions, existential risk pessimism significantly reduces the value of existential risk mitigation…