How cost-effective are efforts to detect near-Earth objects?
Toby Newberry (Future of Humanity Institute, University of Oxford)
GPI Technical Report No. T1-2021
Near-Earth objects (NEOs) include asteroids and comets with orbits that bring them into close proximity with Earth. NEOs are well known to have impacted Earth in the past, sometimes to catastrophic effect. Over the past few decades, humanity has taken steps to detect any NEOs on impact trajectories, and, in doing so, we have significantly improved our estimate of the risk that an impact will occur over the next century. This report estimates the cost-effectiveness of such detection efforts. The remainder of this section sets out the context of the report...
Other working papers
Moral uncertainty and public justification – Jacob Barrett (Global Priorities Institute, University of Oxford) and Andreas T Schmidt (University of Groningen)
Moral uncertainty and disagreement pervade our lives. Yet we still need to make decisions and act, both in individual and political contexts. So, what should we do? The moral uncertainty approach provides a theory of what individuals morally ought to do when they are uncertain about morality…
In search of a biological crux for AI consciousness – Bradford Saad (Global Priorities Institute, University of Oxford)
Whether AI systems could be conscious is often thought to turn on whether consciousness is closely linked to biology. The rough thought is that if consciousness is closely linked to biology, then AI consciousness is impossible, and if consciousness is not closely linked to biology, then AI consciousness is possible—or, at any rate, it’s more likely to be possible. A clearer specification of the kind of link between consciousness and biology that is crucial for the possibility of AI consciousness would help organize inquiry into…
Should longtermists recommend hastening extinction rather than delaying it? – Richard Pettigrew (University of Bristol)
Longtermism is the view that the most urgent global priorities, and those to which we should devote the largest portion of our current resources, are those that focus on ensuring a long future for humanity, and perhaps sentient or intelligent life more generally, and improving the quality of those lives in that long future. The central argument for this conclusion is that, given a fixed amount of resources that we are able to devote to global priorities, the longtermist’s favoured interventions have…