Race and Computer Vision
Author: A. Monea
Journal: The Democratization of Artificial Intelligence
Publication date: 2019-06-25
DOI: 10.33767/osf.io/xza9q (https://doi.org/10.33767/osf.io/xza9q)
Citations: 6
Abstract
-- Forthcoming in Andreas Sudmann (ed.) The Democratization of AI. Berlin, Germany: Transcript. -- This article examines how attempts to make computer vision systems accessible to users with darker skin tones have led either to the hypervisibility of phenotypic racial traits, particularly morphological features like hair texture and lip size, or to the invisibility of race. Drawing on critical race theory and the problematic history of racial representation in photographic media, the author demonstrates how racial biases pervade the visual datasets on which many contemporary computer vision algorithms are trained, essentially hardcoding these biases into computer vision technologies such as Google Photos. The most frequent industry reaction to these hardcoded racial biases is to render race invisible in the system, as was done with Google Photos. He further shows how the invisibility of race in computer vision reproduces the familiar problems of 'color blindness,' now expressed in new media. The author argues that these constitute fundamental problems for the potential democratization of AI and outlines some concrete steps that we might take to more strongly demand egalitarian computer vision systems.