{"title":"Eschewing Gender Stereotypes in Voice Assistants to Promote Inclusion","authors":"A. Danielescu","doi":"10.1145/3405755.3406151","DOIUrl":null,"url":null,"abstract":"The wide adoption of conversational voice assistants has shaped how we interact with this technology while simultaneously highlighting and reinforcing negative stereotypes. For example, conversational systems often use female voices in subservient roles. They also exclude marginalized groups, such as non-binary individuals, altogether. Speech recognition systems also have significant gender, race and dialectal biases [14, 15, 19]. Instead, there is an opportunity for these systems to help change gender norms and promote inclusion and diversity as we continue to struggle with gender equality [10], and progress towards LGBTQ+ rights across the globe [13]. However, prior research claims that users strongly dislike voices without clear gender markers or misalignments between a voice and personality [12]. This calls for additional research to understand how voice assistants may be designed to not perpetuate gender bias while promoting user adoption.","PeriodicalId":380130,"journal":{"name":"Proceedings of the 2nd Conference on Conversational User Interfaces","volume":"25 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-07-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"29","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2nd Conference on Conversational User Interfaces","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3405755.3406151","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 29
Abstract
The wide adoption of conversational voice assistants has shaped how we interact with this technology while simultaneously highlighting and reinforcing negative stereotypes. For example, conversational systems often cast female voices in subservient roles, and they exclude marginalized groups, such as non-binary individuals, altogether. Speech recognition systems likewise exhibit significant gender, racial, and dialectal biases [14, 15, 19]. Yet these systems present an opportunity to help change gender norms and promote inclusion and diversity as we continue to struggle with gender equality [10] and progress toward LGBTQ+ rights across the globe [13]. However, prior research claims that users strongly dislike voices that lack clear gender markers, as well as misalignments between a voice and its personality [12]. This calls for additional research to understand how voice assistants can be designed to promote user adoption without perpetuating gender bias.