{"title":"Emotion classification with multi-modal physiological signals using multi-attention-based neural network","authors":"Chengsheng Zou, Zhen Deng, Bingwei He, Maosong Yan, Jie Wu, Zhaoju Zhu","doi":"10.1049/ccs2.12107","DOIUrl":null,"url":null,"abstract":"<p>The ability to effectively classify human emotion states is critically important for human-computer or human-robot interactions. However, emotion classification with physiological signals is still a challenging problem due to the diversity of emotion expression and the characteristic differences in different modal signals. A novel learning-based network architecture is presented that can exploit four-modal physiological signals, electrocardiogram, electrodermal activity, electromyography, and blood volume pulse, and make a classification of emotion states. It features two kinds of attention modules, feature-level, and semantic-level, which drive the network to focus on the information-rich features by mimicking the human attention mechanism. The feature-level attention module encodes the rich information of each physiological signal. While the semantic-level attention module captures the semantic dependencies among modals. The performance of the designed network is evaluated with the open-source Wearable Stress and Affect Detection dataset. The developed emotion classification system achieves an accuracy of 83.88%. Results demonstrated that the proposed network could effectively process four-modal physiological signals and achieve high accuracy of emotion classification.</p>","PeriodicalId":33652,"journal":{"name":"Cognitive Computation and Systems","volume":"6 1-3","pages":"1-11"},"PeriodicalIF":1.2000,"publicationDate":"2024-06-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1049/ccs2.12107","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Cognitive Computation and Systems","FirstCategoryId":"1085","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1049/ccs2.12107","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Abstract
The ability to effectively classify human emotion states is critically important for human-computer and human-robot interaction. However, emotion classification from physiological signals remains a challenging problem due to the diversity of emotion expression and the characteristic differences among signal modalities. A novel learning-based network architecture is presented that exploits four modalities of physiological signals (electrocardiogram, electrodermal activity, electromyography, and blood volume pulse) to classify emotion states. It features two kinds of attention module, feature-level and semantic-level, which drive the network to focus on information-rich features by mimicking the human attention mechanism. The feature-level attention module encodes the rich information within each physiological signal, while the semantic-level attention module captures the semantic dependencies among modalities. The performance of the designed network is evaluated on the open-source Wearable Stress and Affect Detection (WESAD) dataset, where the developed emotion classification system achieves an accuracy of 83.88%. Results demonstrate that the proposed network effectively processes the four-modal physiological signals and achieves high emotion classification accuracy.
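The abstract does not specify the layer-level design, so the following is a minimal PyTorch sketch of the two-stage attention idea it describes: one feature-level attention branch per modality, followed by a semantic-level attention stage across the four modality embeddings. The Conv1d encoders, the use of `nn.MultiheadAttention` for the cross-modality stage, the window length, the three-class output, and the class and parameter names are all illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class FeatureAttention(nn.Module):
    """Feature-level attention for one modality: weights time steps of a signal.

    Assumption: a 1-D conv encoder followed by a learned softmax over time;
    the paper's actual feature-level module may differ.
    """

    def __init__(self, in_channels: int, hidden_dim: int):
        super().__init__()
        self.encoder = nn.Conv1d(in_channels, hidden_dim, kernel_size=5, padding=2)
        self.score = nn.Linear(hidden_dim, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, channels, time)
        h = torch.relu(self.encoder(x)).transpose(1, 2)  # (batch, time, hidden)
        w = torch.softmax(self.score(h), dim=1)          # attention over time steps
        return (w * h).sum(dim=1)                        # (batch, hidden) embedding


class MultiAttentionEmotionNet(nn.Module):
    """Sketch of a four-modality emotion classifier with two attention stages."""

    def __init__(self, hidden_dim: int = 64, num_heads: int = 4, num_classes: int = 3):
        super().__init__()
        # One feature-level branch per modality: ECG, EDA, EMG, BVP.
        self.branches = nn.ModuleList(
            FeatureAttention(1, hidden_dim) for _ in range(4)
        )
        # Semantic-level stage: self-attention across the four modality tokens,
        # capturing dependencies among modalities.
        self.cross = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, ecg, eda, emg, bvp) -> torch.Tensor:  # each: (batch, 1, time)
        tokens = torch.stack(
            [branch(sig) for branch, sig in zip(self.branches, (ecg, eda, emg, bvp))],
            dim=1,
        )                                                # (batch, 4, hidden)
        fused, _ = self.cross(tokens, tokens, tokens)    # cross-modality attention
        return self.classifier(fused.mean(dim=1))        # pooled class logits


# Smoke test with random signal windows (shapes are illustrative only).
if __name__ == "__main__":
    net = MultiAttentionEmotionNet()
    batch = [torch.randn(8, 1, 256) for _ in range(4)]
    print(net(*batch).shape)  # torch.Size([8, 3])
```

The three-class head here reflects a common WESAD protocol (baseline, stress, amusement); the paper may use a different label set, and `num_classes` would be adjusted accordingly.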