{"title":"长期主义者是否应该建议加快灭绝而不是推迟灭绝?","authors":"Richard Pettigrew","doi":"10.1093/monist/onae003","DOIUrl":null,"url":null,"abstract":"\n Longtermists argue we should devote much of our resources to raising the probability of a long happy future for sentient beings. But most interventions that raise that probability also raise the probability of a long miserable future, even if they raise the latter by a smaller amount. If we choose by maximising expected utility, this isn’t a problem; but, if we use a risk-averse decision rule, it is. I show that, with the same probabilities and utilities, a risk-averse decision theory tells us to hasten human extinction, not delay it. What’s more, I argue that morality requires us to use a risk-averse decision theory. I present this not as an argument for hastening extinction, but as a challenge to longtermism.","PeriodicalId":516548,"journal":{"name":"The Monist","volume":"9 8","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-03-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Should Longtermists Recommend Hastening Extinction Rather Than Delaying It?\",\"authors\":\"Richard Pettigrew\",\"doi\":\"10.1093/monist/onae003\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"\\n Longtermists argue we should devote much of our resources to raising the probability of a long happy future for sentient beings. But most interventions that raise that probability also raise the probability of a long miserable future, even if they raise the latter by a smaller amount. If we choose by maximising expected utility, this isn’t a problem; but, if we use a risk-averse decision rule, it is. I show that, with the same probabilities and utilities, a risk-averse decision theory tells us to hasten human extinction, not delay it. What’s more, I argue that morality requires us to use a risk-averse decision theory. I present this not as an argument for hastening extinction, but as a challenge to longtermism.\",\"PeriodicalId\":516548,\"journal\":{\"name\":\"The Monist\",\"volume\":\"9 8\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-03-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"The Monist\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1093/monist/onae003\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"The Monist","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1093/monist/onae003","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Should Longtermists Recommend Hastening Extinction Rather Than Delaying It?
Longtermists argue we should devote much of our resources to raising the probability of a long happy future for sentient beings. But most interventions that raise that probability also raise the probability of a long miserable future, even if they raise the latter by a smaller amount. If we choose by maximising expected utility, this isn’t a problem; but, if we use a risk-averse decision rule, it is. I show that, with the same probabilities and utilities, a risk-averse decision theory tells us to hasten human extinction, not delay it. What’s more, I argue that morality requires us to use a risk-averse decision theory. I present this not as an argument for hastening extinction, but as a challenge to longtermism.
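To make the contrast in the abstract concrete, here is a minimal sketch (not taken from the paper) in Python. It compares plain expected utility with one standard risk-averse rule, a Buchak-style risk-weighted expected utility with a convex risk function, on toy probabilities and utilities assumed purely for illustration. The point it illustrates is the abstract's: an intervention that raises the probability of a long happy future while also slightly raising the probability of a long miserable one can beat near-term extinction by expected utility yet lose to it under the risk-averse rule.

def expected_utility(lottery):
    # lottery: list of (probability, utility) pairs whose probabilities sum to 1
    return sum(p * u for p, u in lottery)

def risk_weighted_eu(lottery, r=lambda p: p ** 2):
    # Buchak-style risk-weighted expected utility: order outcomes from worst
    # to best and weight each utility increment by r(probability of doing at
    # least that well). A convex r, here r(p) = p**2, encodes risk aversion.
    outcomes = sorted(lottery, key=lambda pu: pu[1])   # worst to best
    probs = [p for p, _ in outcomes]
    utils = [u for _, u in outcomes]
    value = utils[0]
    for i in range(1, len(outcomes)):
        value += r(sum(probs[i:])) * (utils[i] - utils[i - 1])
    return value

# Assumed toy utilities: long miserable future = -100, near-term extinction = 0,
# long happy future = +100. "delay" is an intervention that raises the chance of
# the happy future but also, by less, the chance of the miserable one; "hasten"
# is stylised as making near-term extinction certain.
delay = [(0.15, -100), (0.35, 0), (0.50, 100)]
hasten = [(1.00, 0)]

print(expected_utility(delay), expected_utility(hasten))   # 35.0 vs 0.0
print(risk_weighted_eu(delay), risk_weighted_eu(hasten))   # about -2.75 vs 0.0

With these assumed numbers, expected utility favours the delaying intervention (35 vs 0), while the risk-weighted rule reverses the ranking (about -2.75 vs 0), which is the kind of reversal the abstract describes; the specific probabilities, utilities, and risk function here are illustrative choices, not the paper's.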