{"title":"基于注意机制和BiLSTM融合的学生注意状态分类","authors":"Chen Li, Qing Yang, Ming Li, Dou Wen, Yaqun Wang","doi":"10.1109/IEIR56323.2022.10050046","DOIUrl":null,"url":null,"abstract":"At present, most deep learning-based analysis of student’s attentional states in class has been studied only for a single model structure, and there is not enough recognition accuracy. To address this issue, an attention classification model FF-BiALSTM is proposed, which integrates an Attention Mechanism and a bi-directional long short-term memory neural network (Bi-LSTM). The Attention Mechanism is used to capture global features better and two Bi-LSTM layers are employed to capture time-domain features more effectively. This study defined two attention states to identify whether students are focused or not. Experiments on the Student EEG and Student Reading datasets show that this algorithm can effectively improve student attention classification performance. This experiment obtained 97.77% accuracy on the Student EEG training set and 91.35% on the Student EEG testing set.","PeriodicalId":183709,"journal":{"name":"2022 International Conference on Intelligent Education and Intelligent Research (IEIR)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Classification of Students’ Attentional States Using Attention Mechanism and BiLSTM Fusion\",\"authors\":\"Chen Li, Qing Yang, Ming Li, Dou Wen, Yaqun Wang\",\"doi\":\"10.1109/IEIR56323.2022.10050046\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"At present, most deep learning-based analysis of student’s attentional states in class has been studied only for a single model structure, and there is not enough recognition accuracy. 
To address this issue, an attention classification model FF-BiALSTM is proposed, which integrates an Attention Mechanism and a bi-directional long short-term memory neural network (Bi-LSTM). The Attention Mechanism is used to capture global features better and two Bi-LSTM layers are employed to capture time-domain features more effectively. This study defined two attention states to identify whether students are focused or not. Experiments on the Student EEG and Student Reading datasets show that this algorithm can effectively improve student attention classification performance. This experiment obtained 97.77% accuracy on the Student EEG training set and 91.35% on the Student EEG testing set.\",\"PeriodicalId\":183709,\"journal\":{\"name\":\"2022 International Conference on Intelligent Education and Intelligent Research (IEIR)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-12-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 International Conference on Intelligent Education and Intelligent Research (IEIR)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IEIR56323.2022.10050046\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 International Conference on Intelligent Education and Intelligent Research (IEIR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IEIR56323.2022.10050046","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Classification of Students’ Attentional States Using Attention Mechanism and BiLSTM Fusion
At present, most deep learning-based analyses of students’ attentional states in class rely on a single model structure, and their recognition accuracy is insufficient. To address this issue, an attention classification model, FF-BiALSTM, is proposed that integrates an attention mechanism with a bi-directional long short-term memory network (Bi-LSTM). The attention mechanism captures global features, while two Bi-LSTM layers capture time-domain features more effectively. The study defines two attention states to identify whether students are focused or not. Experiments on the Student EEG and Student Reading datasets show that the algorithm effectively improves attention classification performance, achieving 97.77% accuracy on the Student EEG training set and 91.35% on the Student EEG testing set.
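The abstract does not give architectural details beyond "two Bi-LSTM layers plus an attention mechanism," so the following is only a minimal NumPy sketch of that data flow, not the authors' implementation: a sequence of EEG feature vectors passes through two stacked bi-directional LSTM layers, an additive-attention layer pools the time steps into one context vector, and a linear layer produces two class probabilities (focused / unfocused). All dimensions (128 time steps, 5 channels, 32 hidden units) and the use of random untrained weights are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_params(input_dim, hidden):
    # One parameter set per direction; gate weights stacked as (i, f, o, g).
    return {
        "W": rng.standard_normal((4 * hidden, input_dim)) * 0.1,
        "U": rng.standard_normal((4 * hidden, hidden)) * 0.1,
        "b": np.zeros(4 * hidden),
    }

def lstm_forward(seq, p, hidden):
    # Standard LSTM recurrence over a (T, input_dim) sequence.
    h = np.zeros(hidden)
    c = np.zeros(hidden)
    outputs = []
    for x in seq:
        z = p["W"] @ x + p["U"] @ h + p["b"]
        i = sigmoid(z[:hidden])                 # input gate
        f = sigmoid(z[hidden:2 * hidden])       # forget gate
        o = sigmoid(z[2 * hidden:3 * hidden])   # output gate
        g = np.tanh(z[3 * hidden:])             # candidate cell state
        c = f * c + i * g
        h = o * np.tanh(c)
        outputs.append(h)
    return np.stack(outputs)                    # (T, hidden)

def bilstm_layer(seq, p_fwd, p_bwd, hidden):
    # Run the sequence forwards and backwards, concatenate per time step.
    fwd = lstm_forward(seq, p_fwd, hidden)
    bwd = lstm_forward(seq[::-1], p_bwd, hidden)[::-1]
    return np.concatenate([fwd, bwd], axis=1)   # (T, 2 * hidden)

def attention_pool(states, Wa, v):
    # Additive attention: score each time step, softmax, weighted sum.
    scores = np.tanh(states @ Wa.T) @ v         # (T,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ states, weights            # context (2*hidden,), weights (T,)

T, D, H = 128, 5, 32   # time steps, EEG channels, hidden units (all assumed)
x = rng.standard_normal((T, D))

layer1 = bilstm_layer(x, lstm_params(D, H), lstm_params(D, H), H)
layer2 = bilstm_layer(layer1, lstm_params(2 * H, H), lstm_params(2 * H, H), H)

Wa = rng.standard_normal((H, 2 * H)) * 0.1
v = rng.standard_normal(H) * 0.1
context, weights = attention_pool(layer2, Wa, v)

Wo = rng.standard_normal((2, 2 * H)) * 0.1      # 2 classes: focused / unfocused
logits = Wo @ context
probs = np.exp(logits - logits.max())
probs /= probs.sum()
```

With trained weights, `probs` would be the model's estimate of the two attention states for one EEG window; the attention `weights` indicate which time steps most influenced the decision.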