How should risk and ambiguity affect our charitable giving?

Lara Buchak (Princeton University)

GPI Working Paper No. 8 - 2022, published in Utilitas

Suppose we want to do the most good we can with a particular sum of money, but we cannot be certain of the consequences of different ways of making use of it. This paper explores how our attitudes towards risk and ambiguity bear on what we should do. It shows that risk-avoidance and ambiguity-aversion can each provide good reason to divide our money between various charitable organizations rather than giving it all to the most promising one. It also shows how different attitudes towards risk and ambiguity affect whether we should give to an organization which does a small amount of good for certain or to one which does a large amount of good with some small, unknown probability.

Other working papers

Meaning, medicine and merit – Andreas Mogensen (Global Priorities Institute, Oxford University)

Given the inevitability of scarcity, should public institutions ration healthcare resources so as to prioritize those who contribute more to society? Intuitively, we may feel that this would be somehow inegalitarian. I argue that the egalitarian objection to prioritizing treatment on the basis of patients’ usefulness to others is best thought…

A paradox for tiny probabilities and enormous values – Nick Beckstead (Open Philanthropy Project) and Teruji Thomas (Global Priorities Institute, Oxford University)

We show that every theory of the value of uncertain prospects must have one of three unpalatable properties. Reckless theories recommend risking arbitrarily great gains at arbitrarily long odds for the sake of enormous potential; timid theories recommend passing up arbitrarily great gains to prevent a tiny increase in risk; nontransitive theories deny the principle that, if A is better than B and B is better than C, then A must be better than C.

Tiny probabilities and the value of the far future – Petra Kosonen (Population Wellbeing Initiative, University of Texas at Austin)

Morally speaking, what matters the most is the far future – at least according to Longtermism. The reason why the far future is of utmost importance is that our acts’ expected influence on the value of the world is mainly determined by their consequences in the far future. The case for Longtermism is straightforward: Given the enormous number of people who might exist in the far future, even a tiny probability of affecting how the far future goes outweighs the importance of our acts’ consequences…