Pleasure-arousal-dominance driven facial expression simulation
Hana Boukricha, I. Wachsmuth, A. Hofstätter, K. Grammer
2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops, 8 December 2009. DOI: 10.1109/ACII.2009.5349579
Expressing and recognizing affective states through facial expressions is an important aspect of making virtual humans appear more natural and believable. Based on the results of an empirical study, a system for simulating emotional facial expressions for a virtual human has been developed. The system consists of two parts: (1) a control architecture that simulates emotional facial expressions from Pleasure, Arousal, and Dominance (PAD) values, and (2) an expressive output component that animates the virtual human's facial muscle actions, called Action Units (AUs), modeled following the Facial Action Coding System (FACS). The empirical study yields a large face repertoire of about 6000 faces arranged in PAD-space with respect to two dominance values (dominant vs. submissive). Using this face repertoire, an approach to realizing facial mimicry for a virtual human is outlined, based on backward mapping the AUs that display an emotional facial expression onto PAD values. A preliminary evaluation of this first approach is carried out with AUs corresponding to the basic emotions Happy and Angry.
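To make the two mappings concrete, the following Python sketch illustrates one plausible way a PAD-indexed face repertoire could be queried in both directions: interpolating AU intensities for a given pleasure-arousal point within a single dominance plane, and backward mapping an observed AU configuration onto the repertoire to recover a PAD point for mimicry. This is not the authors' implementation; the AU numbers, intensity values, repertoire entries, and the inverse-distance interpolation scheme are hypothetical placeholders chosen only for illustration.

import numpy as np

# Illustrative sketch only: a toy "face repertoire" in which each entry pairs a
# point in Pleasure-Arousal space (for one fixed dominance plane) with a vector
# of Action Unit (AU) intensities. All numbers below are hypothetical, not data
# from the paper.
AU_IDS = [6, 12, 4, 5]  # e.g. cheek raiser, lip corner puller, brow lowerer, upper lid raiser

repertoire_dominant = {
    # (pleasure, arousal): AU intensity vector, each intensity in [0, 1]
    ( 0.8,  0.5): np.array([0.9, 1.0, 0.0, 0.1]),   # happy-like face
    (-0.6,  0.7): np.array([0.0, 0.0, 0.9, 0.8]),   # angry-like face
    ( 0.0,  0.0): np.array([0.0, 0.0, 0.0, 0.0]),   # neutral face
}

def pad_to_aus(pleasure, arousal, repertoire, k=2):
    """Forward mapping: interpolate AU intensities from the k nearest repertoire faces."""
    points = np.array(list(repertoire.keys()))
    aus = np.array(list(repertoire.values()))
    dists = np.linalg.norm(points - np.array([pleasure, arousal]), axis=1)
    nearest = np.argsort(dists)[:k]
    weights = 1.0 / (dists[nearest] + 1e-6)          # inverse-distance weighting
    return (weights[:, None] * aus[nearest]).sum(axis=0) / weights.sum()

def aus_to_pad(observed_aus, repertoire):
    """Backward mapping: return the pleasure-arousal coordinates of the repertoire
    face whose AU vector is closest to the observed one (the basis for mimicry)."""
    points = list(repertoire.keys())
    aus = np.array(list(repertoire.values()))
    idx = int(np.argmin(np.linalg.norm(aus - observed_aus, axis=1)))
    return points[idx]

if __name__ == "__main__":
    print("AUs for (P=0.6, A=0.4):", pad_to_aus(0.6, 0.4, repertoire_dominant))
    smile = np.array([0.8, 0.9, 0.0, 0.0])
    print("Mimicked PA point for an observed smile:", aus_to_pad(smile, repertoire_dominant))

Nearest-neighbour lookup with inverse-distance weighting is only one simple stand-in for the repertoire query; the actual system derives its repertoire and mappings from the empirical study described above.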