Longtermist institutional reform
Tyler M. John (Rutgers University) and William MacAskill (Global Priorities Institute, Oxford University)
GPI Working Paper No. 14-2020, published in The Long View: Essays on Policy, Philanthropy, and the Long-Term Future
There is a vast number of people who will live in the centuries and millennia to come. Even if Homo sapiens survives merely as long as a typical species, we have hundreds of thousands of years ahead of us. And our future potential could be much greater than that again: it will be hundreds of millions of years until the Earth is sterilized by the expansion of the Sun, and many trillions of years before the last stars die out. In all probability, future generations will outnumber us by thousands or millions to one; of all the people whom we might affect with our actions, the overwhelming majority are yet to come. [...]
Other working papers
The end of economic growth? Unintended consequences of a declining population – Charles I. Jones (Stanford University)
In many models, economic growth is driven by people discovering new ideas. These models typically assume either a constant or growing population. However, in high income countries today, fertility is already below its replacement rate: women are having fewer than two children on average. It is a distinct possibility — highlighted in the recent book, Empty Planet — that global population will decline rather than stabilize in the long run. …
Tiny probabilities and the value of the far future – Petra Kosonen (Population Wellbeing Initiative, University of Texas at Austin)
Morally speaking, what matters the most is the far future – at least according to Longtermism. The reason why the far future is of utmost importance is that our acts’ expected influence on the value of the world is mainly determined by their consequences in the far future. The case for Longtermism is straightforward: Given the enormous number of people who might exist in the far future, even a tiny probability of affecting how the far future goes outweighs the importance of our acts’ consequences…
Three mistakes in the moral mathematics of existential risk – David Thorstad (Global Priorities Institute, University of Oxford)
Longtermists have recently argued that it is overwhelmingly important to do what we can to mitigate existential risks to humanity. I consider three mistakes that are often made in calculating the value of existential risk mitigation: focusing on cumulative risk rather than period risk; ignoring background risk; and neglecting population dynamics. I show how correcting these mistakes pushes the value of existential risk mitigation substantially below leading estimates, potentially low enough to…