Creating community-based tech policy: case studies, lessons learned, and what technologists and communities can do together

Authors: Hannah Sassaman, Jennifer Lee, Jenessa Irvine, Shankar Narayan
DOI: 10.1145/3351095.3375689
Published: 2020-01-22, Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency
Citations: 6
Abstract
What are the core ways the field of data science can center community voice and power throughout all the processes involved in conceptualizing, creating, and disseminating technology? What are the most feasible and most urgent ways communities can shape the field of algorithmic decision-making to center community power in the next few years? This interactive workshop will highlight lessons learned through our combined experience engaging with communities challenging technology in Seattle and Philadelphia, two cities in the United States. We will discuss the historical context of technology's disproportionate impacts on marginalized and vulnerable communities; case studies including criminal justice risk assessments, face surveillance technologies, and surveillance regulations; and work in small-group and break-out sessions to engage questions about when and where technologists hold power, serve as gatekeepers, and can work in accountable partnership with impacted communities.
By the end of the session, we hope that participants will learn how to actively center diverse communities in creating technology by examining successes, challenges, and ongoing work in Seattle and Philadelphia, through the following lessons we have learned:

• that communities, policy-makers, and technologists need to work intimately together to lift up each other's goals;

• that communities need to gain data justice and data literacy to understand and independently audit how a system is impacting them;

• that scientific analyses of algorithmic bias are powerful but heard most clearly when lifted up by local community members and stakeholders in decisions where algorithms might be deployed;

• that anecdotal stories of harm are most impactful on decision-makers when tied to rigorous scientific analysis and to examples from other communities that amplify and ground those stories;

• that communities and their goals and standards often go unheard in conversations between data scientists and those who deploy algorithms, as well as in decision-makers' conversations about what policy should look like;

• and that we need to begin to craft what it means for those with the least power in conversations about algorithmic fairness — those judged by these tools — to have far more power, or even the most power, in the future of their design and implementation.