{"title":"Your Body Reveals Your Impressions about Others: A Study on Multimodal Impression Detection","authors":"Chen Wang, T. Pun, G. Chanel","doi":"10.1109/ACIIW.2019.8925052","DOIUrl":null,"url":null,"abstract":"Formed impressions are crucial for human-human interaction (e.g. a job interview) and an interaction with a virtual agent/robot, since they can impact people's perceptions and willingness to be involved in the interaction. There are studies on how facial features (e.g. skin color, face shape), acoustic signals and non-verbal behaviors (e.g. gestures, postures) create/leave certain impressions. However there is little research focusing on how our bodies disclose our already formed impression of someone. Forming an impression leads to emotions and behaviors which can be measured. In this paper, we investigate recognition of evoked impression of warmth and competence from the nonverbal behaviors expressed by the person forming the impression. We conducted an experiment in which participants were watching impression stimuli. We measured participant's facial expressions, eye movements and physiological reactions (electrocardiography and galvanic skin response). To recognize impressions, we tested 2 multivariate regression models with the aforementioned multimodal recordings. Our best results demonstrate the possibility to detect impressions along warmth and competence dimensions with a concordance correlation coefficient of 0.838 and 0.864. Facial expressions and eye movements are more reliable for impression detection compared with physiological signals. 
Finally, the higher the Berkeley emotion expressivity scores the participants have, the more accurately the impressions are detected.","PeriodicalId":193568,"journal":{"name":"2019 8th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW)","volume":"25 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 8th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ACIIW.2019.8925052","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1
Abstract
Formed impressions are crucial in human-human interactions (e.g. a job interview) and in interactions with a virtual agent or robot, since they can affect people's perceptions and their willingness to engage in the interaction. Previous studies have examined how facial features (e.g. skin color, face shape), acoustic signals and non-verbal behaviors (e.g. gestures, postures) create or leave certain impressions. However, there is little research on how our bodies disclose impressions we have already formed of someone. Forming an impression gives rise to emotions and behaviors that can be measured. In this paper, we investigate the recognition of evoked impressions of warmth and competence from the nonverbal behaviors expressed by the person forming the impression. We conducted an experiment in which participants watched impression stimuli while we measured their facial expressions, eye movements and physiological reactions (electrocardiography and galvanic skin response). To recognize impressions, we tested two multivariate regression models on these multimodal recordings. Our best results demonstrate that impressions can be detected along the warmth and competence dimensions with concordance correlation coefficients of 0.838 and 0.864, respectively. Facial expressions and eye movements proved more reliable for impression detection than physiological signals. Finally, the higher a participant's Berkeley expressivity score, the more accurately their impressions were detected.
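The evaluation metric reported above is the concordance correlation coefficient (CCC), which, unlike plain Pearson correlation, also penalizes differences in mean and scale between predictions and ground truth. The paper does not give its implementation; the following is a minimal sketch of Lin's CCC as it is commonly computed:

```python
import numpy as np

def ccc(y_true, y_pred):
    """Lin's concordance correlation coefficient between two 1-D arrays.

    CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)
    """
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mean_t, mean_p = y_true.mean(), y_pred.mean()
    # Population variances (ddof=0), matching the population covariance below.
    var_t, var_p = y_true.var(), y_pred.var()
    cov = np.mean((y_true - mean_t) * (y_pred - mean_p))
    return 2 * cov / (var_t + var_p + (mean_t - mean_p) ** 2)

# Perfect agreement yields CCC = 1; a constant offset lowers it
# even when the Pearson correlation is 1.
print(ccc([1, 2, 3], [1, 2, 3]))  # → 1.0
```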