Crowdsourcing as a Tool for Research: Methodological, Fair, and Political Considerations
Stephen C. Rea, Hanzelle Kleeman, Qin Zhu, Benjamin Gilbert, Chuan Yue
Bulletin of Science, Technology and Society, 40(1), 40–53. https://doi.org/10.1177/02704676211003808
Citations: 5
Abstract
Crowdsourcing platforms are powerful tools for academic researchers. Proponents claim that crowdsourcing helps researchers quickly and affordably recruit enough human subjects with diverse backgrounds to generate significant statistical power, while critics raise concerns about unreliable data quality, labor exploitation, and unequal power dynamics between researchers and workers. We examine these concerns along three dimensions: methods, fairness, and politics. We find that researchers offer vastly different compensation rates for crowdsourced tasks, and address potential concerns about data validity by using platform-specific tools and user verification methods. Additionally, workers depend upon crowdsourcing platforms for a significant portion of their income, are motivated more by fear of losing access to work than by specific compensation rates, and are frustrated by a lack of transparency and occasional unfair treatment from job requesters. Finally, we discuss critical computing scholars’ proposals to address crowdsourcing’s problems, challenges with implementing these resolutions, and potential avenues for future research.