{"title":"Risks of Bias in AI-Based Emotional Analysis Technology from Diversity Perspectives","authors":"Sumiko Shimo","doi":"10.1109/ISTAS50296.2020.9462168","DOIUrl":null,"url":null,"abstract":"Emotion AI technology captures emotional reactions in real-time and decodes both verbal and non-verbal emotional behaviors of people, promising better service experience, devices, and technologies. On the other hand, it is observed that emotion AI systems are developed and used to assume that there is little variation in emotional expression across the human population. Insufficient understanding of human diversity and cultural diversity can lower the accuracy in detecting and interpreting underrepresented groups’ emotions, leading to biases, discrimination, and negative consequences in their lives. These growing concerns about emotion AI technology are potentially caused by a lack of workforce diversity in the AI sector and human diversity in the AI system. This paper explores workforce diversity and human diversity in AI systems, along with ethical concerns, and suggests possible paths forward.","PeriodicalId":196560,"journal":{"name":"2020 IEEE International Symposium on Technology and Society (ISTAS)","volume":"20 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE International Symposium on Technology and Society (ISTAS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISTAS50296.2020.9462168","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Emotion AI technology captures emotional reactions in real time and decodes both verbal and non-verbal emotional behaviors of people, promising better service experiences, devices, and technologies. On the other hand, emotion AI systems are often developed and used under the assumption that there is little variation in emotional expression across the human population. Insufficient understanding of human and cultural diversity can reduce the accuracy of detecting and interpreting the emotions of underrepresented groups, leading to bias, discrimination, and negative consequences in their lives. These growing concerns about emotion AI technology potentially stem from a lack of workforce diversity in the AI sector and of human diversity in AI systems. This paper explores workforce diversity and human diversity in AI systems, along with the associated ethical concerns, and suggests possible paths forward.