Just Resource Allocation? How Algorithmic Predictions and Human Notions of Justice Interact
Amanda Kube, Sanmay Das, P. Fowler, Yevgeniy Vorobeychik
Proceedings of the 23rd ACM Conference on Economics and Computation
Published: 2022-07-12 · DOI: 10.1145/3490486.3538305
Citations: 5
Abstract
We examine justice in data-aided decisions in the context of a scarce societal resource allocation problem. Non-experts (recruited on Amazon Mechanical Turk) have to determine which homeless households to serve with limited housing assistance. We empirically elicit decision-maker preferences for whether to prioritize more vulnerable households or households who would best take advantage of more intensive interventions. We present three main findings. (1) When vulnerability or outcomes are quantitatively conceptualized and presented, humans (at a single point in time) are remarkably consistent in making either vulnerability- or outcome-oriented decisions. (2) Prior exposure to quantitative outcome predictions has a significant effect and changes the preferences of human decision-makers from vulnerability-oriented to outcome-oriented about one-third of the time. (3) Presenting algorithmically-derived risk predictions in addition to household descriptions reinforces decision-maker preferences. Among the vulnerability-oriented, presenting the risk predictions leads to a significant increase in allocations to the more vulnerable household, whereas among the outcome-oriented it leads to a significant decrease in allocations to the more vulnerable household. These findings emphasize the importance of explicitly aligning data-driven decision aids with system-wide allocation goals.