Eschewing Gender Stereotypes in Voice Assistants to Promote Inclusion

A. Danielescu
{"title":"Eschewing Gender Stereotypes in Voice Assistants to Promote Inclusion","authors":"A. Danielescu","doi":"10.1145/3405755.3406151","DOIUrl":null,"url":null,"abstract":"The wide adoption of conversational voice assistants has shaped how we interact with this technology while simultaneously highlighting and reinforcing negative stereotypes. For example, conversational systems often use female voices in subservient roles. They also exclude marginalized groups, such as non-binary individuals, altogether. Speech recognition systems also have significant gender, race and dialectal biases [14, 15, 19]. Instead, there is an opportunity for these systems to help change gender norms and promote inclusion and diversity as we continue to struggle with gender equality [10], and progress towards LGBTQ+ rights across the globe [13]. However, prior research claims that users strongly dislike voices without clear gender markers or misalignments between a voice and personality [12]. This calls for additional research to understand how voice assistants may be designed to not perpetuate gender bias while promoting user adoption.","PeriodicalId":380130,"journal":{"name":"Proceedings of the 2nd Conference on Conversational User Interfaces","volume":"25 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-07-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"29","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2nd Conference on Conversational User Interfaces","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3405755.3406151","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 29

Abstract

The wide adoption of conversational voice assistants has shaped how we interact with this technology while simultaneously highlighting and reinforcing negative stereotypes. For example, conversational systems often use female voices in subservient roles, and they exclude marginalized groups, such as non-binary individuals, altogether. Speech recognition systems also exhibit significant gender, race, and dialect biases [14, 15, 19]. Instead, these systems present an opportunity to help change gender norms and promote inclusion and diversity as we continue to struggle with gender equality [10] and progress toward LGBTQ+ rights across the globe [13]. However, prior research claims that users strongly dislike voices that lack clear gender markers, as well as misalignments between a voice and a personality [12]. This calls for additional research to understand how voice assistants may be designed so that they do not perpetuate gender bias while still promoting user adoption.