Emotion Recognition and Intervention Technology for Autistic Children Based on the Fusion of Neural Networks and Biological Signals

Author: Yifei Wang
DOI: 10.1016/j.procs.2025.04.243
Journal: Procedia Computer Science, Vol. 261, pp. 538-547
Published: 2025-01-01 (Journal Article)
URL: https://www.sciencedirect.com/science/article/pii/S1877050925013456
Citations: 0
Abstract
Children with autism face significant difficulties in emotion recognition, and there is an urgent need for accurate, efficient technical means to improve their social interaction and emotional understanding. This study presents a biosignal-based emotion recognition and intervention technology that integrates a Convolutional Neural Network (CNN) and a Long Short-Term Memory (LSTM) network. First, we collect biosignal data from autistic children in different emotional states, including heart rate, galvanic skin response (GSR), and electroencephalogram (EEG) recordings, then preprocess the data and extract features. Next, we build and train a deep learning model that fuses CNN and LSTM components and classifies the extracted features into emotional states, achieving high-precision emotion recognition. Finally, we design personalized intervention strategies based on the recognition results and deliver emotional guidance and intervention to the children through a real-time feedback system. Experimentally, the proposed fusion model achieves emotion-recognition accuracies of 97.5% on the training set and 94.2% on the validation set, significantly outperforming single-modality signal processing methods. In addition, the personalized intervention strategy based on this model yields a 45% reduction in the amplitude of emotional fluctuations, a 3.8-point gain in emotional regulation ability, and a 4.2-point gain in social behavior, demonstrating the advantages and application potential of multimodal biosignal fusion for emotion recognition and intervention in children with autism.
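The abstract describes a pipeline in which a CNN extracts local features from multimodal biosignal windows and an LSTM models their temporal dynamics before classification. The following is a minimal NumPy sketch of that CNN-then-LSTM architecture, not the authors' code: the window length, layer sizes, random weights, and the four emotion classes are illustrative assumptions, and the three input channels stand in for the heart-rate, GSR, and EEG streams mentioned in the paper.

```python
# Minimal forward-pass sketch (assumed architecture, not the paper's):
# multimodal window -> 1-D CNN features -> LSTM -> softmax over emotions.
import numpy as np

rng = np.random.default_rng(0)

def conv1d_relu(x, w, b):
    """Valid 1-D convolution with ReLU: x (T, C_in), w (K, C_in, C_out)."""
    K = w.shape[0]
    T_out = x.shape[0] - K + 1
    out = np.empty((T_out, w.shape[2]))
    for t in range(T_out):
        # contract the kernel window against the weights
        out[t] = np.tensordot(x[t:t + K], w, axes=([0, 1], [0, 1])) + b
    return np.maximum(out, 0.0)

def lstm_last_hidden(x, Wx, Wh, b):
    """Run a single-layer LSTM over x (T, D); return the final hidden state."""
    H = Wh.shape[0]
    h, c = np.zeros(H), np.zeros(H)
    for t in range(x.shape[0]):
        z = x[t] @ Wx + h @ Wh + b               # gates, shape (4H,)
        i, f, g, o = np.split(z, 4)
        sig = lambda v: 1.0 / (1.0 + np.exp(-v))
        c = sig(f) * c + sig(i) * np.tanh(g)
        h = sig(o) * np.tanh(c)
    return h

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Toy multimodal window: 128 samples x 3 channels (HR, GSR, EEG) --
# real input would be preprocessed, aligned, per-channel-normalized data.
T, C = 128, 3
window = rng.standard_normal((T, C))

# CNN stage: local morphological features from the raw signals.
w1, b1 = rng.standard_normal((5, C, 8)) * 0.1, np.zeros(8)
feat = conv1d_relu(window, w1, b1)               # (124, 8)

# LSTM stage: temporal dynamics over the CNN feature sequence.
H = 16
Wx = rng.standard_normal((8, 4 * H)) * 0.1
Wh = rng.standard_normal((H, 4 * H)) * 0.1
h_T = lstm_last_hidden(feat, Wx, Wh, np.zeros(4 * H))

# Classifier head over 4 hypothetical emotion classes.
Wc = rng.standard_normal((H, 4)) * 0.1
probs = softmax(h_T @ Wc)
print(probs.shape)                               # (4,)
```

In a trained system these weights would of course be learned from labeled recordings; the sketch only shows how the two stages compose, with the CNN shortening the window into a feature sequence that the LSTM summarizes into one vector for classification.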