{"title":"动态面部情绪表达自我呈现预测自尊。","authors":"Xinlei Zang, Juan Yang","doi":"10.3390/bs15050709","DOIUrl":null,"url":null,"abstract":"<p><p>There is a close relationship between self-esteem and emotions. However, most studies have relied on self-report measures, which primarily capture retrospective and generalized emotional tendencies, rather than spontaneous, momentary emotional expressions in real-time social interactions. Given that self-esteem also shapes how individuals regulate and express emotions in social contexts, it is crucial to examine whether and how self-esteem manifests in dynamic emotional expressions during self-presentation. In this study, we recorded the performances of 211 participants during a public self-presentation task using a digital video camera and measured their self-esteem scores with the Rosenberg Self-Esteem Scale. Facial Action Units (AUs) scores were extracted from each video frame using OpenFace, and four basic emotions-happiness, sadness, disgust, and fear-were quantified based on the basic emotion theory. Time-series analysis was then employed to capture the multidimensional dynamic features of these emotions. Finally, we applied machine learning and explainable AI to identify which dynamic emotional features were closely associated with self-esteem. The results indicate that all four basic emotions are closely associated with self-esteem. Therefore, this study introduces a new perspective on self-esteem assessment, highlighting the potential of nonverbal behavioral indicators as alternatives to traditional self-report measures.</p>","PeriodicalId":8742,"journal":{"name":"Behavioral Sciences","volume":"15 5","pages":""},"PeriodicalIF":2.5000,"publicationDate":"2025-05-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Dynamic Facial Emotional Expressions in Self-Presentation Predicted Self-Esteem.\",\"authors\":\"Xinlei Zang, Juan Yang\",\"doi\":\"10.3390/bs15050709\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>There is a close relationship between self-esteem and emotions. However, most studies have relied on self-report measures, which primarily capture retrospective and generalized emotional tendencies, rather than spontaneous, momentary emotional expressions in real-time social interactions. Given that self-esteem also shapes how individuals regulate and express emotions in social contexts, it is crucial to examine whether and how self-esteem manifests in dynamic emotional expressions during self-presentation. In this study, we recorded the performances of 211 participants during a public self-presentation task using a digital video camera and measured their self-esteem scores with the Rosenberg Self-Esteem Scale. Facial Action Units (AUs) scores were extracted from each video frame using OpenFace, and four basic emotions-happiness, sadness, disgust, and fear-were quantified based on the basic emotion theory. Time-series analysis was then employed to capture the multidimensional dynamic features of these emotions. Finally, we applied machine learning and explainable AI to identify which dynamic emotional features were closely associated with self-esteem. The results indicate that all four basic emotions are closely associated with self-esteem. 
Therefore, this study introduces a new perspective on self-esteem assessment, highlighting the potential of nonverbal behavioral indicators as alternatives to traditional self-report measures.</p>\",\"PeriodicalId\":8742,\"journal\":{\"name\":\"Behavioral Sciences\",\"volume\":\"15 5\",\"pages\":\"\"},\"PeriodicalIF\":2.5000,\"publicationDate\":\"2025-05-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Behavioral Sciences\",\"FirstCategoryId\":\"102\",\"ListUrlMain\":\"https://doi.org/10.3390/bs15050709\",\"RegionNum\":3,\"RegionCategory\":\"心理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"PSYCHOLOGY, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Behavioral Sciences","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.3390/bs15050709","RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"PSYCHOLOGY, MULTIDISCIPLINARY","Score":null,"Total":0}
Dynamic Facial Emotional Expressions in Self-Presentation Predicted Self-Esteem.
There is a close relationship between self-esteem and emotions. However, most studies have relied on self-report measures, which primarily capture retrospective, generalized emotional tendencies rather than spontaneous, momentary emotional expressions in real-time social interactions. Given that self-esteem also shapes how individuals regulate and express emotions in social contexts, it is crucial to examine whether and how self-esteem manifests in dynamic emotional expressions during self-presentation. In this study, we recorded 211 participants performing a public self-presentation task with a digital video camera and measured their self-esteem with the Rosenberg Self-Esteem Scale. Facial Action Unit (AU) scores were extracted from each video frame using OpenFace, and four basic emotions (happiness, sadness, disgust, and fear) were quantified based on basic emotion theory. Time-series analysis was then employed to capture the multidimensional dynamic features of these emotions. Finally, we applied machine learning and explainable AI to identify which dynamic emotional features were most closely associated with self-esteem. The results indicate that all four basic emotions are closely associated with self-esteem. This study therefore introduces a new perspective on self-esteem assessment, highlighting the potential of nonverbal behavioral indicators as alternatives to traditional self-report measures.
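To make the described pipeline concrete, the sketch below shows one plausible way to go from OpenFace output to self-esteem prediction. It is illustrative only: the AU-to-emotion composites (EMFACS-style), the summary statistics used as "dynamic features", the random-forest regressor, SHAP as the explainability method, and the file names (openface_output/*.csv, rses_scores.csv, rses_total) are assumptions, not the authors' exact specification.

```python
# Illustrative sketch of the abstract's pipeline, under the assumptions stated above.
import glob

import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# EMFACS-style AU composites for the four emotions analysed in the paper;
# the authors' exact AU weighting may differ.
EMOTION_AUS = {
    "happiness": ["AU06_r", "AU12_r"],
    "sadness":   ["AU01_r", "AU04_r", "AU15_r"],
    "disgust":   ["AU09_r", "AU15_r"],
    "fear":      ["AU01_r", "AU02_r", "AU04_r", "AU05_r", "AU07_r", "AU20_r", "AU26_r"],
}

def emotion_time_series(openface_csv: str) -> pd.DataFrame:
    """Frame-by-frame emotion scores from one participant's OpenFace output."""
    df = pd.read_csv(openface_csv)
    df.columns = df.columns.str.strip()  # OpenFace headers include leading spaces
    return pd.DataFrame({emo: df[aus].mean(axis=1) for emo, aus in EMOTION_AUS.items()})

def dynamic_features(ts: pd.DataFrame) -> dict:
    """Simple multidimensional time-series descriptors (illustrative choices)."""
    feats, t = {}, np.arange(len(ts))
    for emo in ts.columns:
        x = ts[emo].to_numpy()
        feats[f"{emo}_mean"] = x.mean()
        feats[f"{emo}_sd"] = x.std()
        feats[f"{emo}_peak"] = x.max()
        feats[f"{emo}_trend"] = np.polyfit(t, x, 1)[0]  # linear slope over the talk
    return feats

# Hypothetical layout: one OpenFace CSV per participant plus a table of RSES scores.
X = pd.DataFrame([dynamic_features(emotion_time_series(f))
                  for f in sorted(glob.glob("openface_output/*.csv"))])
y = pd.read_csv("rses_scores.csv")["rses_total"]  # Rosenberg Self-Esteem Scale totals

model = RandomForestRegressor(n_estimators=500, random_state=0)
print("CV R^2:", cross_val_score(model, X, y, cv=5, scoring="r2").mean())

# Explainable-AI step: SHAP values indicate which dynamic emotion features
# contribute most to the predicted self-esteem scores.
model.fit(X, y)
shap_values = shap.TreeExplainer(model).shap_values(X)
importance = pd.Series(np.abs(shap_values).mean(axis=0), index=X.columns)
print(importance.sort_values(ascending=False).head(10))
```

Any regressor and attribution method could be substituted here; the point is the overall flow the abstract describes: per-frame AU intensities, emotion composites, time-series features per participant, prediction of self-esteem, and feature-level explanation.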