Existential Risk and Growth
Leopold Aschenbrenner and Philip Trammell (Global Priorities Institute and Department of Economics, University of Oxford)
GPI Working Paper No. 13-2024
Technology increases consumption but can create or mitigate existential risk to human civilization. Though accelerating technological development may increase the hazard rate (the risk of existential catastrophe per period) in the short run, two considerations suggest that acceleration typically decreases the risk that such a catastrophe ever occurs. First, acceleration decreases the time spent at each technology level. Second, given a policy option to sacrifice consumption for safety, acceleration motivates greater sacrifices by decreasing the marginal utility of consumption and increasing the value of the future. Under broad conditions, optimal policy thus produces an “existential risk Kuznets curve”, in which the hazard rate rises and then falls with the technology level and acceleration pulls forward a future in which risk is low. The negative impacts of acceleration on risk are offset only given policy failures, or direct contributions of acceleration to cumulative risk, that are sufficiently extreme.
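The abstract's first mechanism — that acceleration reduces the time spent at each technology level, and hence the cumulative hazard accrued along the way — can be illustrated with a toy numerical sketch. This is not the paper's model: the inverted-U hazard function, the exponential technology path, and all parameter values below are invented purely for illustration.

```python
import math

def hazard(T, a=0.02, b=0.5):
    # Illustrative inverted-U ("Kuznets") hazard in technology level T:
    # risk per period rises at low tech levels, then falls at high ones.
    # Functional form and parameters are assumptions, not the paper's.
    return a * T * math.exp(-b * T)

def ever_risk(g, T0=1.0, T1=200.0, steps=100000):
    # Suppose technology grows exponentially, T(t) = T0 * exp(g t).
    # Then the time spent near level T is dt = dT / (g T), so the
    # cumulative hazard is the integral of h(T) / (g T) dT, and the
    # probability a catastrophe ever occurs is 1 - exp(-cumulative hazard).
    dT = (T1 - T0) / steps
    H = 0.0
    for i in range(steps):
        T = T0 + (i + 0.5) * dT  # midpoint rule
        H += hazard(T) / (g * T) * dT
    return 1.0 - math.exp(-H)

slow = ever_risk(g=0.01)  # slower technological growth
fast = ever_risk(g=0.02)  # faster technological growth
print(f"ever-risk at g=1%: {slow:.3f}; at g=2%: {fast:.3f}")
```

Because growth speed g enters the cumulative hazard as a 1/g factor here, doubling g halves the cumulative hazard, so faster growth yields a strictly lower probability that a catastrophe ever occurs — the "less time at each risky level" effect in isolation, before any of the paper's policy-response channel is considered.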
An earlier version of this paper was published as GPI Working Paper No. 6-2020.
Other working papers
Minimal and Expansive Longtermism – Hilary Greaves (University of Oxford) and Christian Tarsney (Population Wellbeing Initiative, University of Texas at Austin)
The standard case for longtermism focuses on a small set of risks to the far future, and argues that in a small set of choice situations, the present marginal value of mitigating those risks is very great. But many longtermists are attracted to, and many critics of longtermism worried by, a farther-reaching form of longtermism. According to this farther-reaching form, there are many ways of improving the far future, which determine the value of our options in all or nearly all choice situations…
Non-additive axiologies in large worlds – Christian Tarsney and Teruji Thomas (Global Priorities Institute, Oxford University)
Is the overall value of a world just the sum of values contributed by each value-bearing entity in that world? Additively separable axiologies (like total utilitarianism, prioritarianism, and critical level views) say ‘yes’, but non-additive axiologies (like average utilitarianism, rank-discounted utilitarianism, and variable value views) say ‘no’…
Meaning, medicine and merit – Andreas Mogensen (Global Priorities Institute, Oxford University)
Given the inevitability of scarcity, should public institutions ration healthcare resources so as to prioritize those who contribute more to society? Intuitively, we may feel that this would be somehow inegalitarian. I argue that the egalitarian objection to prioritizing treatment on the basis of patients’ usefulness to others is best thought…