The Syncretic Effect of Dual-Source Data on Affective Computing in Online Learning Contexts: A Perspective From Convolutional Neural Network With Attention Mechanism

IF 4.0 · Zone 2 (Education) · Q1 EDUCATION & EDUCATIONAL RESEARCH
Xuesong Zhai, Jiaqi Xu, Nian-Shing Chen, Jun Shen, Yan Li, Yonggu Wang, Xiaoyan Chu, Yumeng Zhu
{"title":"网络学习环境下双源数据对情感计算的融合效应——基于注意机制的卷积神经网络","authors":"Xuesong Zhai, Jiaqi Xu, Nian-Shing Chen, Jun Shen, Yan Li, Yonggu Wang, Xiaoyan Chu, Yumeng Zhu","doi":"10.1177/07356331221115663","DOIUrl":null,"url":null,"abstract":"Affective computing (AC) has been regarded as a relevant approach to identifying online learners’ mental states and predicting their learning performance. Previous research mainly used one single-source data set, typically learners’ facial expression, to compute learners’ affection. However, a single facial expression may represent different affections in various head poses. This study proposed a dual-source data approach to solve the problem. Facial expression and head pose are two typical data sources that can be captured from online learning videos. The current study collected a dual-source data set of facial expressions and head poses from an online learning class in a middle school. A deep learning neural network using AlexNet with an attention mechanism was developed to verify the syncretic effect on affective computing of the proposed dual-source fusion strategy. The results show that the dual-source fusion approach significantly outperforms the single-source approach based on the AC recognition accuracy between the two approaches (dual-source approach using Attention-AlexNet model 80.96%; single-source approach, facial expression 76.65% and head pose 64.34%). This study contributes to the theoretical construction of the dual-source data fusion approach, and the empirical validation of the effect of the Attention-AlexNet neural network approach on affective computing in online learning contexts.","PeriodicalId":47865,"journal":{"name":"Journal of Educational Computing Research","volume":"61 1","pages":"466 - 493"},"PeriodicalIF":4.0000,"publicationDate":"2022-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"The Syncretic Effect of Dual-Source Data on Affective Computing in Online Learning Contexts: A Perspective From Convolutional Neural Network With Attention Mechanism\",\"authors\":\"Xuesong Zhai, Jiaqi Xu, Nian-Shing Chen, Jun Shen, Yan Li, Yonggu Wang, Xiaoyan Chu, Yumeng Zhu\",\"doi\":\"10.1177/07356331221115663\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Affective computing (AC) has been regarded as a relevant approach to identifying online learners’ mental states and predicting their learning performance. Previous research mainly used one single-source data set, typically learners’ facial expression, to compute learners’ affection. However, a single facial expression may represent different affections in various head poses. This study proposed a dual-source data approach to solve the problem. Facial expression and head pose are two typical data sources that can be captured from online learning videos. The current study collected a dual-source data set of facial expressions and head poses from an online learning class in a middle school. A deep learning neural network using AlexNet with an attention mechanism was developed to verify the syncretic effect on affective computing of the proposed dual-source fusion strategy. The results show that the dual-source fusion approach significantly outperforms the single-source approach based on the AC recognition accuracy between the two approaches (dual-source approach using Attention-AlexNet model 80.96%; single-source approach, facial expression 76.65% and head pose 64.34%). 
This study contributes to the theoretical construction of the dual-source data fusion approach, and the empirical validation of the effect of the Attention-AlexNet neural network approach on affective computing in online learning contexts.\",\"PeriodicalId\":47865,\"journal\":{\"name\":\"Journal of Educational Computing Research\",\"volume\":\"61 1\",\"pages\":\"466 - 493\"},\"PeriodicalIF\":4.0000,\"publicationDate\":\"2022-11-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Educational Computing Research\",\"FirstCategoryId\":\"95\",\"ListUrlMain\":\"https://doi.org/10.1177/07356331221115663\",\"RegionNum\":2,\"RegionCategory\":\"教育学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"EDUCATION & EDUCATIONAL RESEARCH\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Educational Computing Research","FirstCategoryId":"95","ListUrlMain":"https://doi.org/10.1177/07356331221115663","RegionNum":2,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 0

Abstract

Affective computing (AC) has been regarded as a relevant approach to identifying online learners’ mental states and predicting their learning performance. Previous research mainly used one single-source data set, typically learners’ facial expression, to compute learners’ affection. However, a single facial expression may represent different affections in various head poses. This study proposed a dual-source data approach to solve the problem. Facial expression and head pose are two typical data sources that can be captured from online learning videos. The current study collected a dual-source data set of facial expressions and head poses from an online learning class in a middle school. A deep learning neural network using AlexNet with an attention mechanism was developed to verify the syncretic effect on affective computing of the proposed dual-source fusion strategy. The results show that the dual-source fusion approach significantly outperforms the single-source approach based on the AC recognition accuracy between the two approaches (dual-source approach using Attention-AlexNet model 80.96%; single-source approach, facial expression 76.65% and head pose 64.34%). This study contributes to the theoretical construction of the dual-source data fusion approach, and the empirical validation of the effect of the Attention-AlexNet neural network approach on affective computing in online learning contexts.
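To make the described fusion strategy more concrete, below is a minimal sketch (assuming PyTorch and torchvision; this is not the authors' published code) of a dual-branch Attention-AlexNet: one AlexNet-style branch encodes the facial-expression crop, another encodes the head-pose crop, a small attention layer produces per-source weights, and a shared classifier predicts the affective state. The class name, the number of affect classes, the feature dimension, and the attention design are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of a dual-source Attention-AlexNet fusion model.
# All hyperparameters (num_affect_classes, feat_dim, attention layout) are
# illustrative assumptions, not the authors' implementation.
import torch
import torch.nn as nn
from torchvision.models import alexnet


class DualSourceAttentionAlexNet(nn.Module):
    def __init__(self, num_affect_classes: int = 4, feat_dim: int = 256):
        super().__init__()
        # Two independent AlexNet backbones, one per data source.
        self.face_branch = alexnet(weights=None)
        self.pose_branch = alexnet(weights=None)
        # Replace the 1000-way ImageNet heads with compact feature projections.
        self.face_branch.classifier[6] = nn.Linear(4096, feat_dim)
        self.pose_branch.classifier[6] = nn.Linear(4096, feat_dim)
        # Attention over the two sources: one softmax weight per branch.
        self.attention = nn.Sequential(
            nn.Linear(2 * feat_dim, 2),
            nn.Softmax(dim=1),
        )
        self.classifier = nn.Linear(feat_dim, num_affect_classes)

    def forward(self, face_img: torch.Tensor, pose_img: torch.Tensor) -> torch.Tensor:
        f_face = self.face_branch(face_img)   # (B, feat_dim)
        f_pose = self.pose_branch(pose_img)   # (B, feat_dim)
        # Learn how much each source should contribute for this sample.
        weights = self.attention(torch.cat([f_face, f_pose], dim=1))      # (B, 2)
        fused = weights[:, 0:1] * f_face + weights[:, 1:2] * f_pose       # (B, feat_dim)
        return self.classifier(fused)          # (B, num_affect_classes)


if __name__ == "__main__":
    # Example forward pass on dummy 224x224 RGB crops.
    model = DualSourceAttentionAlexNet()
    face = torch.randn(8, 3, 224, 224)
    pose = torch.randn(8, 3, 224, 224)
    logits = model(face, pose)
    print(logits.shape)  # torch.Size([8, 4])
```

The sample-wise softmax weighting mirrors the paper's core idea that head pose should modulate how a facial expression is interpreted, though the actual fusion mechanism used in the study may differ.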
Source journal
Journal of Educational Computing Research
CiteScore: 11.90
Self-citation rate: 6.20%
Articles per year: 69
Journal description: The goal of this Journal is to provide an international scholarly publication forum for peer-reviewed interdisciplinary research into the applications, effects, and implications of computer-based education. The Journal features articles useful for practitioners and theorists alike. The terms "education" and "computing" are viewed broadly. "Education" refers to the use of computer-based technologies at all levels of the formal education system, business and industry, home-schooling, lifelong learning, and unintentional learning environments. "Computing" refers to all forms of computer applications and innovations, both hardware and software. For example, this could range from mobile and ubiquitous computing to immersive 3D simulations and games to computing-enhanced virtual learning environments.