Authors: Yong Wang; Chendong Xu; Weirui Na; Dongyu Liu; Jiuqi Yan; Shuai Yao; Qisong Wu
Journal: IEEE Sensors Journal, vol. 25, no. 16, pp. 31290-31301
DOI: 10.1109/JSEN.2025.3587271
Published: 2025-07-15 (https://ieeexplore.ieee.org/document/11080225/)
Hierarchical Transformer With Auxiliary Learning for Subject-Independent Respiration Emotion Recognition
Respiration is modulated by human emotional activity. Emotion recognition using physiological signals has recently gained considerable attention. However, most existing studies primarily focus on using electroencephalogram (EEG) signals for emotion recognition. This article explores the potential of utilizing respiration signals collected by wearable devices for emotion recognition. We propose a hierarchical transformer model to effectively extract emotional information from respiration signals. Furthermore, we introduce gender classification as an auxiliary task to further improve the accuracy of emotion recognition. Specifically, a frame transformer is employed to capture emotional information across frames of respiration signals. The extracted frame-level features are subsequently fused with segment-level embeddings through a specially designed fusion layer. Next, separate segment transformers are employed for emotion and gender to extract segment-level information, with a cosine similarity loss applied to promote shared feature learning. Finally, distinct classifiers are used for emotion and gender classification. To evaluate the effectiveness of the proposed method, extensive experiments are conducted on the DEAP and MAHNOB-HCI datasets under a subject-independent setting. For the DEAP dataset, the average classification accuracies are 72.42% for valence and 73.91% for arousal, while for the MAHNOB-HCI dataset, they are 80.45% and 79.69%, respectively. Compared to other physiological signals, such as EEG, respiration signals exhibit comparable potential for emotion recognition.
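The abstract describes a two-level pipeline: respiration signals are split into frames whose features are later fused with segment-level embeddings, and a cosine similarity loss between the emotion and gender branches encourages shared feature learning. As a rough illustration only — the paper's exact formulation is not reproduced here, so the framing parameters and the 1 − cos(·,·) loss form are assumptions — a minimal NumPy sketch of those two ingredients:

```python
import numpy as np

def frame_signal(x, frame_len, hop):
    """Split a 1-D respiration signal into overlapping frames
    (hypothetical preprocessing; the paper's exact framing is not given here)."""
    n_frames = 1 + (len(x) - frame_len) // hop
    return np.stack([x[i * hop : i * hop + frame_len] for i in range(n_frames)])

def cosine_similarity_loss(a, b, eps=1e-8):
    """Assumed loss form 1 - mean cosine similarity: small when the emotion and
    gender segment-level features point in the same direction."""
    a_n = a / (np.linalg.norm(a, axis=-1, keepdims=True) + eps)
    b_n = b / (np.linalg.norm(b, axis=-1, keepdims=True) + eps)
    return 1.0 - float(np.mean(np.sum(a_n * b_n, axis=-1)))

# Example: a 10 s synthetic respiration trace at 128 Hz (~0.25 Hz breathing),
# framed into 2 s windows with a 1 s hop.
x = np.sin(2 * np.pi * 0.25 * np.arange(1280) / 128)
frames = frame_signal(x, frame_len=256, hop=128)
print(frames.shape)  # (9, 256)

# Identical feature batches give a near-zero loss; the training objective would
# add this term to the emotion and gender classification losses.
emo = np.random.randn(4, 32)
print(round(cosine_similarity_loss(emo, emo), 6))  # 0.0
```

The sketch only illustrates the hierarchy's data layout and the auxiliary-task coupling; the frame and segment transformers themselves would operate on these frames and on pooled segment embeddings, respectively.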
Journal introduction:
The fields of interest of the IEEE Sensors Journal are the theory, design, fabrication, manufacturing, and applications of devices for sensing and transducing physical, chemical, and biological phenomena, with emphasis on the electronics and physics aspects of sensors and integrated sensor-actuators. The IEEE Sensors Journal covers the following topics:
-Sensor Phenomenology, Modelling, and Evaluation
-Sensor Materials, Processing, and Fabrication
-Chemical and Gas Sensors
-Microfluidics and Biosensors
-Optical Sensors
-Physical Sensors: Temperature, Mechanical, Magnetic, and others
-Acoustic and Ultrasonic Sensors
-Sensor Packaging
-Sensor Networks
-Sensor Applications
-Sensor Systems: Signals, Processing, and Interfaces
-Actuators and Sensor Power Systems
-Sensor Signal Processing for high precision and stability (amplification, filtering, linearization, modulation/demodulation) and under harsh conditions (EMC, radiation, humidity, temperature); energy consumption/harvesting
-Sensor Data Processing (soft computing with sensor data, e.g., pattern recognition, machine learning, evolutionary computation; sensor data fusion; processing of wave sensor data, e.g., electromagnetic and acoustic, and of non-wave sensor data, e.g., chemical, gravity, particle, thermal, radiative and non-radiative; detection, estimation, and classification based on sensor data)
-Sensors in Industrial Practice