Diagnosing Gender Bias in Image Recognition Systems.

Carsten Schwemmer, Carly Knight, Emily D Bello-Pardo, Stan Oklobdzija, Martijn Schoonvelde, Jeffrey W Lockhart
{"title":"图像识别系统中的性别偏见诊断。","authors":"Carsten Schwemmer,&nbsp;Carly Knight,&nbsp;Emily D Bello-Pardo,&nbsp;Stan Oklobdzija,&nbsp;Martijn Schoonvelde,&nbsp;Jeffrey W Lockhart","doi":"10.1177/2378023120967171","DOIUrl":null,"url":null,"abstract":"<p><p>Image recognition systems offer the promise to learn from images at scale without requiring expert knowledge. However, past research suggests that machine learning systems often produce biased output. In this article, we evaluate potential gender biases of commercial image recognition platforms using photographs of U.S. members of Congress and a large number of Twitter images posted by these politicians. Our crowdsourced validation shows that commercial image recognition systems can produce labels that are correct and biased at the same time as they selectively report a subset of many possible true labels. We find that images of women received three times more annotations related to physical appearance. Moreover, women in images are recognized at substantially lower rates in comparison with men. We discuss how encoded biases such as these affect the visibility of women, reinforce harmful gender stereotypes, and limit the validity of the insights that can be gathered from such data.</p>","PeriodicalId":513351,"journal":{"name":"Socius: Sociological Research for a Dynamic World","volume":" ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2020-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1177/2378023120967171","citationCount":"42","resultStr":"{\"title\":\"Diagnosing Gender Bias in Image Recognition Systems.\",\"authors\":\"Carsten Schwemmer,&nbsp;Carly Knight,&nbsp;Emily D Bello-Pardo,&nbsp;Stan Oklobdzija,&nbsp;Martijn Schoonvelde,&nbsp;Jeffrey W Lockhart\",\"doi\":\"10.1177/2378023120967171\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Image recognition systems offer the promise to learn from images at scale without requiring expert knowledge. However, past research suggests that machine learning systems often produce biased output. In this article, we evaluate potential gender biases of commercial image recognition platforms using photographs of U.S. members of Congress and a large number of Twitter images posted by these politicians. Our crowdsourced validation shows that commercial image recognition systems can produce labels that are correct and biased at the same time as they selectively report a subset of many possible true labels. We find that images of women received three times more annotations related to physical appearance. Moreover, women in images are recognized at substantially lower rates in comparison with men. 
We discuss how encoded biases such as these affect the visibility of women, reinforce harmful gender stereotypes, and limit the validity of the insights that can be gathered from such data.</p>\",\"PeriodicalId\":513351,\"journal\":{\"name\":\"Socius: Sociological Research for a Dynamic World\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://sci-hub-pdf.com/10.1177/2378023120967171\",\"citationCount\":\"42\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Socius: Sociological Research for a Dynamic World\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1177/2378023120967171\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2020/11/11 0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Socius: Sociological Research for a Dynamic World","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1177/2378023120967171","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2020/11/11 0:00:00","PubModel":"Epub","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 42

Abstract

Image recognition systems offer the promise to learn from images at scale without requiring expert knowledge. However, past research suggests that machine learning systems often produce biased output. In this article, we evaluate potential gender biases of commercial image recognition platforms using photographs of U.S. members of Congress and a large number of Twitter images posted by these politicians. Our crowdsourced validation shows that commercial image recognition systems can produce labels that are correct and biased at the same time as they selectively report a subset of many possible true labels. We find that images of women received three times more annotations related to physical appearance. Moreover, women in images are recognized at substantially lower rates in comparison with men. We discuss how encoded biases such as these affect the visibility of women, reinforce harmful gender stereotypes, and limit the validity of the insights that can be gathered from such data.
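The audit the abstract describes can be pictured with a short sketch: request labels for each image from a commercial recognition API, then compare how often images of women versus men receive appearance-related labels. The sketch below is a minimal illustration, not the authors' code; the records, the appearance keyword list, and the label vocabulary are all hypothetical stand-ins for real API output.

```python
# Minimal sketch of a label-bias tally, assuming each image has already
# been run through a commercial labeling API. All data below is hypothetical.
from collections import defaultdict

# Hypothetical API output: one record per image, with the gender of the
# person shown and the labels the API returned.
records = [
    {"gender": "woman", "labels": ["smile", "hairstyle", "spokesperson"]},
    {"gender": "man",   "labels": ["official", "spokesperson", "suit"]},
    {"gender": "woman", "labels": ["girl", "beauty", "event"]},
    {"gender": "man",   "labels": ["business", "official"]},
]

# Illustrative keyword set marking labels about physical appearance.
APPEARANCE = {"smile", "hairstyle", "girl", "beauty", "hair", "skin"}

# Count total labels and appearance-related labels per gender.
counts = defaultdict(lambda: {"appearance": 0, "total": 0})
for rec in records:
    g = rec["gender"]
    counts[g]["total"] += len(rec["labels"])
    counts[g]["appearance"] += sum(label in APPEARANCE for label in rec["labels"])

# Report the share of appearance-related labels for each gender.
for g, c in counts.items():
    rate = c["appearance"] / c["total"]
    print(f"{g}: {c['appearance']}/{c['total']} appearance-related labels ({rate:.0%})")
```

On real data, the comparison of these per-gender rates is what would surface the disparity the abstract reports, with labels that are individually correct but selectively skewed toward appearance for women.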
