Race and Computer Vision

A. Monea
{"title":"种族与计算机视觉","authors":"A. Monea","doi":"10.33767/osf.io/xza9q","DOIUrl":null,"url":null,"abstract":"-- Forthcoming in Andreas Sudmann (ed.) The Democratization of AI. Berlin, Germany: Transcript. -- This article examines how attempts to make computer vision systems accessible to users with darker skin tones has led to either the hypervisibility of phenotypic racial traits, particularly morphological features like hair texture and lip size, or the invisibility of race. Drawing on critical race theory and the problematic history of racial representation in photographic media, he demonstrates how racial biases are prevalent in the visual datasets that many contemporary computer vision algorithms are trained on, essentially hardcoding these biases into our computer vision technologies, like Google Photos. The most frequent industry reaction to these hardcoded racial biases is to render race invisible in the system, as was done with Google Photos. He further shows how the invisibility of race in computer vision leads to the familiar problems of ‘color blindness,’ only expressed in new media. The author argues that these constitute fundamental problems for the potential democratization of AI and outlines some concrete steps that we might take to more strongly demand egalitarian computer vision systems.","PeriodicalId":151185,"journal":{"name":"The Democratization of Artificial Intelligence","volume":"23 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"Race and Computer Vision\",\"authors\":\"A. Monea\",\"doi\":\"10.33767/osf.io/xza9q\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"-- Forthcoming in Andreas Sudmann (ed.) The Democratization of AI. Berlin, Germany: Transcript. -- This article examines how attempts to make computer vision systems accessible to users with darker skin tones has led to either the hypervisibility of phenotypic racial traits, particularly morphological features like hair texture and lip size, or the invisibility of race. Drawing on critical race theory and the problematic history of racial representation in photographic media, he demonstrates how racial biases are prevalent in the visual datasets that many contemporary computer vision algorithms are trained on, essentially hardcoding these biases into our computer vision technologies, like Google Photos. The most frequent industry reaction to these hardcoded racial biases is to render race invisible in the system, as was done with Google Photos. He further shows how the invisibility of race in computer vision leads to the familiar problems of ‘color blindness,’ only expressed in new media. 
The author argues that these constitute fundamental problems for the potential democratization of AI and outlines some concrete steps that we might take to more strongly demand egalitarian computer vision systems.\",\"PeriodicalId\":151185,\"journal\":{\"name\":\"The Democratization of Artificial Intelligence\",\"volume\":\"23 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-06-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"The Democratization of Artificial Intelligence\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.33767/osf.io/xza9q\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"The Democratization of Artificial Intelligence","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.33767/osf.io/xza9q","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 6

Abstract

-- Forthcoming in Andreas Sudmann (ed.) The Democratization of AI. Berlin, Germany: Transcript. -- This article examines how attempts to make computer vision systems accessible to users with darker skin tones have led to either the hypervisibility of phenotypic racial traits, particularly morphological features like hair texture and lip size, or the invisibility of race. Drawing on critical race theory and the problematic history of racial representation in photographic media, the author demonstrates how racial biases are prevalent in the visual datasets on which many contemporary computer vision algorithms are trained, essentially hardcoding these biases into computer vision technologies such as Google Photos. The most frequent industry reaction to these hardcoded racial biases is to render race invisible in the system, as was done with Google Photos. He further shows how the invisibility of race in computer vision leads to the familiar problems of ‘color blindness,’ only expressed in new media. The author argues that these constitute fundamental problems for the potential democratization of AI and outlines some concrete steps that we might take to more strongly demand egalitarian computer vision systems.