Classification of Students' Attentional States Using Attention Mechanism and BiLSTM Fusion

Chen Li, Qing Yang, Ming Li, Dou Wen, Yaqun Wang
DOI: 10.1109/IEIR56323.2022.10050046
Published in: 2022 International Conference on Intelligent Education and Intelligent Research (IEIR)
Publication date: 2022-12-18
Citations: 0

Abstract

At present, most deep learning-based analyses of students' attentional states in class have examined only a single model structure, and recognition accuracy remains insufficient. To address this issue, an attention classification model, FF-BiALSTM, is proposed, which integrates an Attention Mechanism and a bi-directional long short-term memory neural network (Bi-LSTM). The Attention Mechanism is used to better capture global features, and two Bi-LSTM layers are employed to capture time-domain features more effectively. This study defined two attention states to identify whether students are focused or not. Experiments on the Student EEG and Student Reading datasets show that this algorithm can effectively improve student attention classification performance. The experiment obtained 97.77% accuracy on the Student EEG training set and 91.35% on the Student EEG testing set.
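The paper's text does not include an implementation, but the core idea the abstract describes, using an attention mechanism to pool Bi-LSTM hidden states into a single global feature vector, can be sketched. The following is a minimal NumPy illustration of that attention-pooling step only; the function name `attention_pool` and the learned scoring vector `w` are hypothetical placeholders, not details from the paper:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, w):
    """Attention pooling over a sequence of hidden states.

    H : (T, d) array of per-time-step hidden states, e.g. the
        concatenated forward/backward outputs of a Bi-LSTM layer.
    w : (d,) learned scoring vector (randomly initialized here).
    Returns the (d,) global feature vector and the (T,) weights.
    """
    scores = H @ w           # one relevance score per time step
    alpha = softmax(scores)  # attention weights, non-negative, sum to 1
    context = alpha @ H      # weighted sum: a single global feature
    return context, alpha

# Toy example: 8 time steps of 16-dimensional hidden states.
rng = np.random.default_rng(0)
T, d = 8, 16
H = rng.standard_normal((T, d))
w = rng.standard_normal(d)
context, alpha = attention_pool(H, w)
```

In the full model described by the abstract, a vector like `context` would be produced after the two Bi-LSTM layers and fed to a final classifier that outputs one of the two defined attention states (focused / not focused).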