Gender Bias in Artificial Intelligence

Q2 Social Sciences
Rosileine Mendonça de Lima, Barbara Pisker, V. S. Corrêa
DOI: 10.18080/jtde.v11n2.690
Journal: Australian Journal of Telecommunications and the Digital Economy
Published: 2023-06-28 (Journal Article)
Citations: 2

Abstract

This study presents a Systematic Literature Review (SLR) of Gender Bias in Artificial Intelligence (AI). The research was conducted using two techniques: a domain-based approach to SLR process providing a bibliometric sample description and in-depth examination of the thematic categories arising from inductive categorization, extracted from reading and interpretation of the final 35 sample articles analyzed. In answering three key research questions on the types, causes, and overcoming (mitigating) strategies of gender bias in artificial intelligence, three thematic treemaps were constructed, enabling systematic overview as an essential contribution to the literature. The main types of gender bias found in AI are categorized as societal, technical, and individual. Societal and socio-technical aspects stand out as the leading causes of bias, while debiasing, dataset design and gender sensitivity were the most frequent among the main strategies for overcoming bias. The study also proposes theoretical, practical and managerial capacity building and policy implications that aim to influence broad socio-technical challenges and refer to changes necessary, aiming to create bias-free artificial intelligence.
Source journal metrics: CiteScore 1.60; self-citation rate 0.00%; articles published: 37.
Journal description: The Journal of Telecommunications and the Digital Economy (JTDE) is an international, open-access, high-quality, peer-reviewed journal, indexed by Scopus and Google Scholar, covering innovative research and practice in Telecommunications, the Digital Economy and Applications. The mission of the JTDE is to further, through publication, the objective of advancing learning, knowledge and research worldwide. The JTDE publishes peer-reviewed papers that may take the following forms:
*Research Paper - a paper making an original contribution to engineering knowledge.
*Special Interest Paper - a report on significant aspects of a major or notable project.
*Review Paper for specialists - an overview of a relevant area intended for specialists in the field covered.
*Review Paper for non-specialists - an overview of a relevant area suitable for a reader with an electrical/electronics background.
*Public Policy Discussion - a paper that identifies or discusses public policy, including investigation of legislation, regulation and developments around the world, including best practice.
*Tutorial Paper - a paper that explains an important subject or clarifies the approach to an area of design or investigation.
*Technical Note - a technical note or letter to the Editors that is not sufficiently developed or extensive in scope to constitute a full paper.
*Industry Case Study - a paper that provides details of industry practices, using a case study to explain what is occurring and how the outcomes have been achieved.
*Discussion - a contribution discussing a published paper, to which the original author's response will be sought.
*Historical - a paper covering a historical topic related to telecommunications or the digital economy.