The case for strong longtermism
Hilary Greaves and William MacAskill (Global Priorities Institute, University of Oxford)
GPI Working Paper No. 5-2021
A striking fact about the history of civilisation is just how early we are in it. There are 5,000 years of recorded history behind us, but how many years are still to come? If we merely last as long as the typical mammalian species, we still have over 200,000 years to go (Barnosky et al. 2011); there could be a further one billion years until the Earth is no longer habitable for humans (Wolf and Toon 2015); and trillions of years until the last conventional star formation (Adams and Laughlin 1999:34). Even on the most conservative of these timelines, we have progressed through only a tiny fraction of history. If humanity’s saga were a novel, we would be on the very first page.