Analysis of Students' Positive Emotion and Smile Intensity Using Sequence-Relative Key-Frame Labeling and Deep-Asymmetric Convolutional Neural Network

IF 15.3 · CAS Tier 1 (Computer Science) · JCR Q1 (Automation & Control Systems)
Zhenzhen Luo;Xiaolu Jin;Yong Luo;Qiangqiang Zhou;Xin Luo
{"title":"Analysis of Students' Positive Emotion and Smile Intensity Using Sequence-Relative Key-Frame Labeling and Deep-Asymmetric Convolutional Neural Network","authors":"Zhenzhen Luo;Xiaolu Jin;Yong Luo;Qiangqiang Zhou;Xin Luo","doi":"10.1109/JAS.2024.125016","DOIUrl":null,"url":null,"abstract":"Positive emotional experiences can improve learning efficiency and cognitive ability, stimulate students' interest in learning, and improve teacher-student relationships. However, positive emotions in the classroom are primarily identified through teachers' observations and postclass questionnaires or interviews. The expression intensity of students, which is extremely important for fine-grained emotion analysis, is not considered. Hence, a novel method based on smile intensity estimation using sequence-relative key-frame labeling is presented. This method aims to recognize the positive emotion levels of a student in an end-to-end framework. First, the intensity label is generated robustly for each frame in the expression sequence based on the relative key frames to address the lack of annotations for smile intensity. Then, a deep-asymmetric convolutional neural network learns the expression model through dual neural networks, to enhance the stability of the network model and avoid the extreme attention region learned. Further, dual neural networks and the dual attention mechanism are integrated using the intensity label based on the relative key frames as the supervised information. Thus, diverse features are effectively extracted and subtle appearance differences between different smiles are perceived based on different perspectives. Finally, comparative experiments for the convergence speed, model-training parameters, confusion matrix, and classification probability are performed. The proposed method was applied to a real classroom scene to analyze the emotions of students. Numerous experiments validated that the proposed method is promising for analyzing the differences in the positive emotion of students while learning in a classroom.","PeriodicalId":54230,"journal":{"name":"Ieee-Caa Journal of Automatica Sinica","volume":"12 4","pages":"806-820"},"PeriodicalIF":15.3000,"publicationDate":"2025-03-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Ieee-Caa Journal of Automatica Sinica","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10946009/","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AUTOMATION & CONTROL SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

Positive emotional experiences can improve learning efficiency and cognitive ability, stimulate students' interest in learning, and improve teacher-student relationships. However, positive emotions in the classroom are primarily identified through teachers' observations and post-class questionnaires or interviews, and the expression intensity of students, which is essential for fine-grained emotion analysis, is not considered. Hence, a novel method based on smile intensity estimation using sequence-relative key-frame labeling is presented, which recognizes a student's positive emotion level in an end-to-end framework. First, an intensity label is generated robustly for each frame in the expression sequence based on the relative key frames, addressing the lack of annotations for smile intensity. Then, a deep-asymmetric convolutional neural network learns the expression model through dual neural networks, which enhances the stability of the model and avoids learning extreme attention regions. Further, the dual neural networks and a dual attention mechanism are integrated, using the intensity labels based on the relative key frames as supervision, so that diverse features are extracted effectively and subtle appearance differences between smiles are perceived from different perspectives. Finally, comparative experiments on convergence speed, model-training parameters, confusion matrices, and classification probabilities are performed, and the proposed method is applied to a real classroom scene to analyze students' emotions. Extensive experiments validate that the proposed method is promising for analyzing differences in students' positive emotions while learning in a classroom.
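The sequence-relative labeling step can be illustrated with a minimal sketch. The Python snippet below is a hypothetical reconstruction, not the authors' exact rule: it assumes each smile sequence is annotated with an onset (neutral) and an apex (peak) key frame, and that every frame's discrete intensity level is derived from its temporal position relative to those two key frames.

```python
# Minimal sketch of sequence-relative key-frame labeling (hypothetical rule,
# not taken from the paper): frames are labeled by their position relative
# to the onset and apex key frames of a smile sequence.
import numpy as np

def relative_intensity_labels(num_frames: int,
                              onset_idx: int,
                              apex_idx: int,
                              num_levels: int = 4) -> np.ndarray:
    """Assign a discrete smile-intensity level (0..num_levels-1) to every frame."""
    labels = np.zeros(num_frames, dtype=int)
    span = max(apex_idx - onset_idx, 1)
    for t in range(num_frames):
        if t <= onset_idx:          # before the onset key frame: neutral
            rel = 0.0
        elif t >= apex_idx:         # at or after the apex key frame: peak intensity
            rel = 1.0
        else:                       # between key frames: relative progress toward the apex
            rel = (t - onset_idx) / span
        labels[t] = int(round(rel * (num_levels - 1)))
    return labels

# Example: a 20-frame sequence with the onset at frame 3 and the apex at frame 12.
print(relative_intensity_labels(20, onset_idx=3, apex_idx=12))
```

In the paper, these per-frame labels serve as the supervised information for the dual-network model; the exact discretization into positive-emotion levels and the handling of offset frames are design choices the abstract does not specify, so the values above are illustrative only.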
Source journal: IEEE/CAA Journal of Automatica Sinica (Engineering: Control and Systems Engineering)
CiteScore: 23.50
Self-citation rate: 11.00%
Annual publication volume: 880 articles
About the journal: The IEEE/CAA Journal of Automatica Sinica is a reputable journal that publishes high-quality papers in English on original theoretical/experimental research and development in the field of automation. The journal covers a wide range of topics including automatic control, artificial intelligence and intelligent control, systems theory and engineering, pattern recognition and intelligent systems, automation engineering and applications, information processing and information systems, network-based automation, robotics, sensing and measurement, and navigation, guidance, and control. Additionally, the journal is abstracted/indexed in several prominent databases including SCIE (Science Citation Index Expanded), EI (Engineering Index), Inspec, Scopus, SCImago, DBLP, CNKI (China National Knowledge Infrastructure), CSCD (Chinese Science Citation Database), and IEEE Xplore.