How cost-effective are efforts to detect near-Earth objects?

Toby Newberry (Future of Humanity Institute, University of Oxford)

GPI Technical Report No. T1-2021

Near-Earth objects (NEOs) include asteroids and comets with orbits that bring them into close proximity with Earth. NEOs are well known to have impacted Earth in the past, sometimes to catastrophic effect. Over the past few decades, humanity has taken steps to detect any NEOs on impact trajectories, and, in doing so, we have significantly improved our estimate of the risk that an impact will occur over the next century. This report estimates the cost-effectiveness of such detection efforts. The remainder of this section sets out the context of the report...

Other working papers

Estimating long-term treatment effects without long-term outcome data – David Rhys Bernard (Rethink Priorities), Jojo Lee and Victor Yaneng Wang (Global Priorities Institute, University of Oxford)

The surrogate index method allows policymakers to estimate long-run treatment effects before long-run outcomes are observable. We meta-analyse this approach over nine long-run RCTs in development economics, comparing surrogate estimates to estimates from actual long-run RCT outcomes. We introduce the M-lasso algorithm for constructing the surrogate approach’s first-stage predictive model and compare its performance with other surrogate estimation methods. …
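The two-stage logic of the surrogate index method can be sketched in a few lines. The toy below is illustrative only, not the paper's M-lasso algorithm or data: it fits a first-stage model predicting a long-run outcome from short-run surrogates on a sample where the long-run outcome is observed, then applies that model to an experimental sample (with a hypothetical simulated treatment effect) where the long-run outcome is not yet observed.

```python
# Illustrative sketch of the surrogate index approach (NOT the paper's M-lasso).
# Stage 1: on a sample where long-run outcomes ARE observed, fit a predictive
#          model of the long-run outcome given short-run surrogates.
# Stage 2: on an experimental sample where long-run outcomes are NOT observed,
#          apply the model and compare predicted outcomes across arms.
import numpy as np

rng = np.random.default_rng(0)

# --- Observational sample: surrogates S and long-run outcome Y both observed ---
n_obs = 1000
S_obs = rng.normal(size=(n_obs, 3))        # three short-run surrogate outcomes
beta = np.array([0.5, 0.3, 0.2])           # assumed surrogate-outcome relationship
Y_obs = S_obs @ beta + rng.normal(scale=0.1, size=n_obs)

# Stage 1: plain least squares stands in for a flexible predictive model
coef, *_ = np.linalg.lstsq(S_obs, Y_obs, rcond=None)

# --- Experimental sample: treatment shifts the surrogates; Y is unobserved ---
n_exp = 1000
treat = rng.integers(0, 2, size=n_exp)
S_exp = rng.normal(size=(n_exp, 3)) + 0.4 * treat[:, None]

# Stage 2: surrogate-index estimate of the long-run treatment effect
Y_hat = S_exp @ coef
effect = Y_hat[treat == 1].mean() - Y_hat[treat == 0].mean()
# By construction the true long-run effect here is 0.4 * (0.5 + 0.3 + 0.2) = 0.4
print(f"surrogate-index estimate: {effect:.3f}")
```

The key assumption (surrogacy) is that the surrogates capture the full channel through which treatment affects the long-run outcome; the paper's meta-analysis across nine RCTs is, in effect, a test of how well estimates like `effect` track the effects measured from actual long-run outcomes.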

The long-run relationship between per capita incomes and population size – Maya Eden (University of Zurich) and Kevin Kuruc (Population Wellbeing Initiative, University of Texas at Austin)

The relationship between the human population size and per capita incomes has long been debated. Two competing forces feature prominently in these discussions. On the one hand, a larger population means that limited natural resources must be shared among more people. On the other hand, more people means more innovation and faster technological progress, other things equal. We study a model that features both of these channels. A calibration suggests that, in the long run, (marginal) increases in population would…

AI takeover and human disempowerment – Adam Bales (Global Priorities Institute, University of Oxford)

Some take seriously the possibility of AI takeover, where AI systems seize power in a way that leads to human disempowerment. Assessing the likelihood of takeover requires answering empirical questions about the future of AI technologies and the context in which AI will operate. In many cases, philosophers are poorly placed to answer these questions. However, some prior questions are more amenable to philosophical techniques. What does it mean to speak of AI empowerment and human disempowerment? …