Modelling Culturally Diverse Smiles Using Data-Driven Methods

Chaona Chen, Oliver G. B. Garrod, P. Schyns, Rachael E. Jack
{"title":"用数据驱动的方法模拟不同文化的微笑","authors":"Chaona Chen, Oliver G. B. Garrod, P. Schyns, Rachael E. Jack","doi":"10.1109/FG57933.2023.10042621","DOIUrl":null,"url":null,"abstract":"Smiling faces are often preferred in daily social interactions. Many socially interactive human-like virtual agents are equipped with the capability to produce standardized smiles that are widely considered to be universal. However, mounting evidence shows that people from different cultures prefer different smiles. To engage a culturally diverse range of human users, socially interactive human-like virtual agents must be equipped with culturally-valid dynamic facial expressions. To develop culturally sensitive smiles, we use data-driven, perception-based methods to model the facial expressions of happy in 60 individuals in two distinct cultures (East Asian and Western European). On each experimental trial, we generated a random facial animation composed of a random sub-set of individual face movements (i.e., AUs), each with a random movement. Each cultural participant categorized 2400 such facial animations according to an emotion label (e.g., happy) if appropriate, otherwise selecting ‘other.’ We derived facial expression models of happy for each cultural participant by measuring the statistical relationship between the dynamic AUs presented on each trial and each participant's responses. Analysis of the facial expression models revealed clear cross-cultural similarity and diversity in smiles–for example, smiling with raised cheeks (AU12-6) is culturally common, while open-mouth smiling (AU25-12) is Western-specific and smiling with eyebrow raising (AU1-2) is East Asian-specific. Analysis of the temporal dynamics of each AU further revealed cultural diversity in smiles. We anticipate that our approach will improve the social signalling capabilities of socially interactive human-like virtual agents and broaden their usability in global market.","PeriodicalId":318766,"journal":{"name":"2023 IEEE 17th International Conference on Automatic Face and Gesture Recognition (FG)","volume":"36 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-01-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Modelling Culturally Diverse Smiles Using Data-Driven Methods\",\"authors\":\"Chaona Chen, Oliver G. B. Garrod, P. Schyns, Rachael E. Jack\",\"doi\":\"10.1109/FG57933.2023.10042621\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Smiling faces are often preferred in daily social interactions. Many socially interactive human-like virtual agents are equipped with the capability to produce standardized smiles that are widely considered to be universal. However, mounting evidence shows that people from different cultures prefer different smiles. To engage a culturally diverse range of human users, socially interactive human-like virtual agents must be equipped with culturally-valid dynamic facial expressions. To develop culturally sensitive smiles, we use data-driven, perception-based methods to model the facial expressions of happy in 60 individuals in two distinct cultures (East Asian and Western European). On each experimental trial, we generated a random facial animation composed of a random sub-set of individual face movements (i.e., AUs), each with a random movement. 
Each cultural participant categorized 2400 such facial animations according to an emotion label (e.g., happy) if appropriate, otherwise selecting ‘other.’ We derived facial expression models of happy for each cultural participant by measuring the statistical relationship between the dynamic AUs presented on each trial and each participant's responses. Analysis of the facial expression models revealed clear cross-cultural similarity and diversity in smiles–for example, smiling with raised cheeks (AU12-6) is culturally common, while open-mouth smiling (AU25-12) is Western-specific and smiling with eyebrow raising (AU1-2) is East Asian-specific. Analysis of the temporal dynamics of each AU further revealed cultural diversity in smiles. We anticipate that our approach will improve the social signalling capabilities of socially interactive human-like virtual agents and broaden their usability in global market.\",\"PeriodicalId\":318766,\"journal\":{\"name\":\"2023 IEEE 17th International Conference on Automatic Face and Gesture Recognition (FG)\",\"volume\":\"36 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-01-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2023 IEEE 17th International Conference on Automatic Face and Gesture Recognition (FG)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/FG57933.2023.10042621\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 IEEE 17th International Conference on Automatic Face and Gesture Recognition (FG)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/FG57933.2023.10042621","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Smiling faces are often preferred in daily social interactions. Many socially interactive human-like virtual agents are equipped with the capability to produce standardized smiles that are widely considered to be universal. However, mounting evidence shows that people from different cultures prefer different smiles. To engage a culturally diverse range of human users, socially interactive human-like virtual agents must be equipped with culturally valid dynamic facial expressions. To develop culturally sensitive smiles, we use data-driven, perception-based methods to model the facial expressions of happy in 60 individuals from two distinct cultures (East Asian and Western European). On each experimental trial, we generated a random facial animation composed of a random subset of individual face movements (i.e., Action Units, AUs), each with a random movement. Each cultural participant categorized 2400 such facial animations according to an emotion label (e.g., happy) if appropriate, otherwise selecting ‘other.’ We derived facial expression models of happy for each cultural participant by measuring the statistical relationship between the dynamic AUs presented on each trial and each participant's responses. Analysis of the facial expression models revealed clear cross-cultural similarity and diversity in smiles: for example, smiling with raised cheeks (AU12-6) is culturally common, while open-mouth smiling (AU25-12) is Western-specific and smiling with eyebrow raising (AU1-2) is East Asian-specific. Analysis of the temporal dynamics of each AU further revealed cultural diversity in smiles. We anticipate that our approach will improve the social signalling capabilities of socially interactive human-like virtual agents and broaden their usability in the global market.
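The abstract describes a reverse-correlation-style procedure: each trial animates a random subset of AUs, the participant labels the animation, and a per-participant model is derived from the statistical relationship between AU presence and the responses. The minimal Python sketch below illustrates that logic only; the AU list, the 50% sampling probability, the simulated participant, and the simple difference-of-proportions weighting are illustrative assumptions, not the authors' actual stimulus generator or analysis.

```python
import numpy as np

# Minimal sketch of the trial-generation and model-derivation logic described in
# the abstract. The AU list, the 50% sampling probability, the simulated
# participant, and the difference-of-proportions weighting are illustrative
# assumptions, not the authors' exact pipeline.

AUS = ["AU1", "AU2", "AU6", "AU12", "AU25"]   # small illustrative set of Action Units
N_TRIALS = 2400                                # trials per participant, as in the abstract
rng = np.random.default_rng(0)

def generate_trial(n_aus=len(AUS)):
    """Pick a random subset of AUs and give each selected AU a random movement amplitude."""
    present = rng.random(n_aus) < 0.5          # which AUs appear on this trial
    amplitude = rng.random(n_aus) * present    # random movement for each selected AU
    return present, amplitude

def derive_model(presence, responses):
    """Relate each AU's presence across trials to the 'happy' responses.

    presence:  (n_trials, n_aus) boolean array of which AUs appeared on each trial.
    responses: (n_trials,) array, 1 = categorized as 'happy', 0 = 'other'.
    Returns one weight per AU: P(AU shown | 'happy') - P(AU shown | 'other').
    """
    happy = responses == 1
    return presence[happy].mean(axis=0) - presence[~happy].mean(axis=0)

# Simulate one hypothetical participant who responds 'happy' when AU6 and AU12 co-occur.
presence = np.zeros((N_TRIALS, len(AUS)), dtype=bool)
responses = np.zeros(N_TRIALS, dtype=int)
for t in range(N_TRIALS):
    shown, _ = generate_trial()
    presence[t] = shown
    responses[t] = int(shown[AUS.index("AU6")] and shown[AUS.index("AU12")])

for au, weight in zip(AUS, derive_model(presence, responses)):
    print(f"{au}: {weight:+.3f}")              # AU6 and AU12 receive the largest weights
```

The abstract additionally analyses each AU's temporal dynamics, which this presence-only sketch omits.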