{"title":"基于s变换和ISDNet的多模态手部运动识别框架","authors":"Lei Shi;Ranran Gui;Qunfeng Niu;Peng Li","doi":"10.1109/JSEN.2025.3542902","DOIUrl":null,"url":null,"abstract":"In the rehabilitation of hand movement disorders, multimodal signal-based hand movement recognition (HMR) plays a crucial role in enhancing therapeutic interventions and improving patient outcomes. However, existing methods face challenges such as suboptimal feature fusion and limited recognition performance. To address these issues, this article proposes a novel multimodal HMR framework. First, a signal fusion algorithm based on Spearman’s rank correlation coefficient (SRCC) is utilized to effectively integrate features from surface electromyography (sEMG) and triaxial acceleration signals (TASs), laying a solid foundation for subsequent feature fusion. Next, a feature fusion algorithm based on S-transform (S-T) and RGB image technology is developed, transforming signals into 3-D time-frequency fusion feature maps (3D-TFTTMs) to more comprehensively capture the time-frequency characteristics of the signals. Subsequently, a deep learning model, inception-SENet-DenseNet (ISDNet), is designed, incorporating both inception and squeeze-and-excitation network (SENet) modules. The inception module extracts fused features, while SENet dynamically adjusts channel weights, significantly enhancing recognition performance. Evaluation on the Ninapro DB2&3, DB5, and DB7 databases demonstrates that ISDNet achieves HMR accuracies of 97.02%, 93.78%, and 95.37%, respectively, significantly outperforming existing multimodal HMR methods. 
The results validate the effectiveness of the proposed framework in multimodal fusion and highlight its potential for advancing HMR technology, with broad application prospects in areas such as prosthetics, rehabilitation, and robotics.","PeriodicalId":447,"journal":{"name":"IEEE Sensors Journal","volume":"25 7","pages":"11672-11682"},"PeriodicalIF":4.3000,"publicationDate":"2025-02-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A Multimodal Hand Movement Recognition Framework Based on S-Transform and ISDNet\",\"authors\":\"Lei Shi;Ranran Gui;Qunfeng Niu;Peng Li\",\"doi\":\"10.1109/JSEN.2025.3542902\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In the rehabilitation of hand movement disorders, multimodal signal-based hand movement recognition (HMR) plays a crucial role in enhancing therapeutic interventions and improving patient outcomes. However, existing methods face challenges such as suboptimal feature fusion and limited recognition performance. To address these issues, this article proposes a novel multimodal HMR framework. First, a signal fusion algorithm based on Spearman’s rank correlation coefficient (SRCC) is utilized to effectively integrate features from surface electromyography (sEMG) and triaxial acceleration signals (TASs), laying a solid foundation for subsequent feature fusion. Next, a feature fusion algorithm based on S-transform (S-T) and RGB image technology is developed, transforming signals into 3-D time-frequency fusion feature maps (3D-TFTTMs) to more comprehensively capture the time-frequency characteristics of the signals. Subsequently, a deep learning model, inception-SENet-DenseNet (ISDNet), is designed, incorporating both inception and squeeze-and-excitation network (SENet) modules. The inception module extracts fused features, while SENet dynamically adjusts channel weights, significantly enhancing recognition performance. 
Evaluation on the Ninapro DB2&3, DB5, and DB7 databases demonstrates that ISDNet achieves HMR accuracies of 97.02%, 93.78%, and 95.37%, respectively, significantly outperforming existing multimodal HMR methods. The results validate the effectiveness of the proposed framework in multimodal fusion and highlight its potential for advancing HMR technology, with broad application prospects in areas such as prosthetics, rehabilitation, and robotics.\",\"PeriodicalId\":447,\"journal\":{\"name\":\"IEEE Sensors Journal\",\"volume\":\"25 7\",\"pages\":\"11672-11682\"},\"PeriodicalIF\":4.3000,\"publicationDate\":\"2025-02-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Sensors Journal\",\"FirstCategoryId\":\"103\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10906063/\",\"RegionNum\":2,\"RegionCategory\":\"综合性期刊\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Sensors Journal","FirstCategoryId":"103","ListUrlMain":"https://ieeexplore.ieee.org/document/10906063/","RegionNum":2,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
A Multimodal Hand Movement Recognition Framework Based on S-Transform and ISDNet
In the rehabilitation of hand movement disorders, multimodal signal-based hand movement recognition (HMR) plays a crucial role in enhancing therapeutic interventions and improving patient outcomes. However, existing methods face challenges such as suboptimal feature fusion and limited recognition performance. To address these issues, this article proposes a novel multimodal HMR framework. First, a signal fusion algorithm based on Spearman’s rank correlation coefficient (SRCC) is utilized to effectively integrate features from surface electromyography (sEMG) and triaxial acceleration signals (TASs), laying a solid foundation for subsequent feature fusion. Next, a feature fusion algorithm based on S-transform (S-T) and RGB image technology is developed, transforming signals into 3-D time-frequency fusion feature maps (3D-TFTTMs) to more comprehensively capture the time-frequency characteristics of the signals. Subsequently, a deep learning model, inception-SENet-DenseNet (ISDNet), is designed, incorporating both inception and squeeze-and-excitation network (SENet) modules. The inception module extracts fused features, while SENet dynamically adjusts channel weights, significantly enhancing recognition performance. Evaluation on the Ninapro DB2&3, DB5, and DB7 databases demonstrates that ISDNet achieves HMR accuracies of 97.02%, 93.78%, and 95.37%, respectively, significantly outperforming existing multimodal HMR methods. The results validate the effectiveness of the proposed framework in multimodal fusion and highlight its potential for advancing HMR technology, with broad application prospects in areas such as prosthetics, rehabilitation, and robotics.
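The abstract does not spell out how the SRCC-based signal fusion is computed; its core quantity, Spearman's rank correlation between an sEMG channel and an acceleration channel, can be sketched as follows (a minimal illustration, not the paper's implementation; the function name and rank-based formulation are assumptions):

```python
import numpy as np

def spearman_rcc(x, y):
    """Spearman's rank correlation coefficient:
    the Pearson correlation of the ranks of x and y."""
    rx = np.argsort(np.argsort(x)).astype(float)  # rank of each sample
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))
```

Because SRCC operates on ranks rather than raw values, it captures any monotone relationship, which makes it robust to the very different amplitude scales of sEMG and accelerometer data.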
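The S-transform step converts each 1-D signal into a time-frequency plane before the RGB feature maps are assembled. A common frequency-domain implementation of the discrete Stockwell transform is sketched below; the paper may use a different normalization or window parameterization, so treat this as an assumed reference form:

```python
import numpy as np

def s_transform(x):
    """Discrete Stockwell (S-) transform via the frequency-domain formulation.
    Returns an (N//2 + 1, N) complex matrix: rows are frequencies, columns time."""
    N = len(x)
    X = np.fft.fft(x)
    S = np.zeros((N // 2 + 1, N), dtype=complex)
    S[0] = np.mean(x)  # zero-frequency row is the signal mean
    m = np.arange(N)
    m_signed = ((m + N // 2) % N) - N // 2  # wrap m to [-N/2, N/2)
    for n in range(1, N // 2 + 1):
        # Gaussian localizing window in the frequency domain; its width
        # scales with the analysis frequency n (the S-transform's hallmark)
        G = np.exp(-2.0 * np.pi ** 2 * m_signed ** 2 / n ** 2)
        S[n] = np.fft.ifft(np.roll(X, -n) * G)
    return S
```

For a pure sinusoid at bin k, the magnitude of row k dominates all other rows, which is what makes |S| usable as a time-frequency image channel.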
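The SENet module's role ("dynamically adjusts channel weights") follows the standard squeeze-and-excitation pattern: global average pooling, a two-layer bottleneck, and a sigmoid gate that rescales each channel. A NumPy sketch with random weights is shown below; the layer sizes and reduction ratio r are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def se_block(feat, w1, b1, w2, b2):
    """Squeeze-and-excitation reweighting of a (C, H, W) feature map."""
    z = feat.mean(axis=(1, 2))                   # squeeze: global average pool -> (C,)
    h = np.maximum(z @ w1 + b1, 0.0)             # excitation FC1 + ReLU -> (C/r,)
    s = 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))     # excitation FC2 + sigmoid -> (C,)
    return feat * s[:, None, None]               # scale each channel by its gate

C, r = 8, 2
feat = rng.standard_normal((C, 16, 16))
w1 = rng.standard_normal((C, C // r)) * 0.1
b1 = np.zeros(C // r)
w2 = rng.standard_normal((C // r, C)) * 0.1
b2 = np.zeros(C)
out = se_block(feat, w1, b1, w2, b2)
```

Since the gates lie in (0, 1), the block can only attenuate channels; training learns which channels to keep near full strength, which is the "dynamic channel weighting" the abstract refers to.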
Journal Introduction:
The fields of interest of the IEEE Sensors Journal are the theory, design, fabrication, manufacturing, and applications of devices for sensing and transducing physical, chemical, and biological phenomena, with emphasis on the electronics and physics aspects of sensors and integrated sensor-actuators. IEEE Sensors Journal deals with the following:
-Sensor Phenomenology, Modelling, and Evaluation
-Sensor Materials, Processing, and Fabrication
-Chemical and Gas Sensors
-Microfluidics and Biosensors
-Optical Sensors
-Physical Sensors: Temperature, Mechanical, Magnetic, and others
-Acoustic and Ultrasonic Sensors
-Sensor Packaging
-Sensor Networks
-Sensor Applications
-Sensor Systems: Signals, Processing, and Interfaces
-Actuators and Sensor Power Systems
-Sensor Signal Processing for high precision and stability (amplification, filtering, linearization, modulation/demodulation) and under harsh conditions (EMC, radiation, humidity, temperature); energy consumption/harvesting
-Sensor Data Processing (soft computing with sensor data, e.g., pattern recognition, machine learning, evolutionary computation; sensor data fusion, processing of wave e.g., electromagnetic and acoustic; and non-wave, e.g., chemical, gravity, particle, thermal, radiative and non-radiative sensor data, detection, estimation and classification based on sensor data)
-Sensors in Industrial Practice