E2FNet: An EEG- and EMG-Based Fusion Network for Hand Motion Intention Recognition

Impact Factor 4.3 · JCR Q1 (Engineering, Electrical & Electronic) · CAS Zone 2 (Multidisciplinary)
Guoqian Jiang;Kunyu Wang;Qun He;Ping Xie
{"title":"E2FNet: An EEG- and EMG-Based Fusion Network for Hand Motion Intention Recognition","authors":"Guoqian Jiang;Kunyu Wang;Qun He;Ping Xie","doi":"10.1109/JSEN.2024.3471894","DOIUrl":null,"url":null,"abstract":"In light of the growing population of individuals with limb disorders, there is an increasing need to address the challenges they face in their daily lives. Existing rehabilitation technologies, often relying on single physiological signals and plagued by poor signal quality, have limitations in their effectiveness. To overcome these constraints, we present E2FNet, a multimodal physiological information fusion network designed for motor intent recognition in individuals with limb disorders. This study involved eight healthy participants who recorded electromyography (EMG) and electroencephalography (EEG) signals during various hand movements. E2FNet utilizes a multiscale convolutional neural network to extract features from EEG and EMG data, focusing on information fusion across different scales. We also introduce a cross-attention mechanism to capture cross-modal information interactions, enhancing EEG and EMG information fusion. Through extensive experiments, E2FNet achieved an impressive 92.08% classification accuracy, and the effectiveness of each module has been verified. Multiscale separable convolution and cross-attention significantly improved EEG and EMG signal fusion, enhancing accuracy and robustness in motion intent recognition. This research promises to enhance the quality of life and independence of individuals with movement disorders, while also advancing the field of rehabilitation robotics and assistive technology.","PeriodicalId":447,"journal":{"name":"IEEE Sensors Journal","volume":"24 22","pages":"38417-38428"},"PeriodicalIF":4.3000,"publicationDate":"2024-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Sensors Journal","FirstCategoryId":"103","ListUrlMain":"https://ieeexplore.ieee.org/document/10706790/","RegionNum":2,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0

Abstract

In light of the growing population of individuals with limb disorders, there is an increasing need to address the challenges they face in their daily lives. Existing rehabilitation technologies, often relying on single physiological signals and plagued by poor signal quality, have limitations in their effectiveness. To overcome these constraints, we present E2FNet, a multimodal physiological information fusion network designed for motor intent recognition in individuals with limb disorders. This study involved eight healthy participants who recorded electromyography (EMG) and electroencephalography (EEG) signals during various hand movements. E2FNet utilizes a multiscale convolutional neural network to extract features from EEG and EMG data, focusing on information fusion across different scales. We also introduce a cross-attention mechanism to capture cross-modal information interactions, enhancing EEG and EMG information fusion. Through extensive experiments, E2FNet achieved an impressive 92.08% classification accuracy, and the effectiveness of each module has been verified. Multiscale separable convolution and cross-attention significantly improved EEG and EMG signal fusion, enhancing accuracy and robustness in motion intent recognition. This research promises to enhance the quality of life and independence of individuals with movement disorders, while also advancing the field of rehabilitation robotics and assistive technology.
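The abstract names two architectural ideas but gives no implementation details: per-modality multiscale (depthwise-separable) convolution encoders and a cross-attention stage that fuses EEG and EMG features. The sketch below is only a minimal PyTorch illustration of those two ideas under stated assumptions; the channel counts, kernel sizes, feature width, number of attention heads, number of motion classes, and mean-pooling classifier head are all hypothetical placeholders, not the authors' published E2FNet architecture.

```python
import torch
import torch.nn as nn

class MultiScaleSeparableConv(nn.Module):
    """Extracts features at several temporal scales with depthwise-separable 1-D convolutions.
    Kernel sizes and channel split are illustrative assumptions, not values from the paper."""
    def __init__(self, in_channels, out_channels, kernel_sizes=(3, 7, 15, 31)):
        super().__init__()
        branch_channels = out_channels // len(kernel_sizes)
        self.branches = nn.ModuleList([
            nn.Sequential(
                # depthwise temporal convolution at one scale
                nn.Conv1d(in_channels, in_channels, kernel_size=k,
                          padding=k // 2, groups=in_channels, bias=False),
                # pointwise convolution mixes channels
                nn.Conv1d(in_channels, branch_channels, kernel_size=1, bias=False),
                nn.BatchNorm1d(branch_channels),
                nn.ELU(),
            )
            for k in kernel_sizes
        ])

    def forward(self, x):                       # x: (batch, channels, time)
        return torch.cat([b(x) for b in self.branches], dim=1)


class CrossAttentionFusion(nn.Module):
    """Bidirectional cross-attention: EEG features attend to EMG features and vice versa."""
    def __init__(self, dim, num_heads=4):
        super().__init__()
        self.eeg_to_emg = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.emg_to_eeg = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, eeg_feat, emg_feat):      # each: (batch, time, dim)
        eeg_ctx, _ = self.eeg_to_emg(eeg_feat, emg_feat, emg_feat)  # EEG queries EMG
        emg_ctx, _ = self.emg_to_eeg(emg_feat, eeg_feat, eeg_feat)  # EMG queries EEG
        return torch.cat([eeg_ctx, emg_ctx], dim=-1)


class E2FNetSketch(nn.Module):
    """Hypothetical end-to-end pipeline: per-modality multiscale encoders,
    cross-attention fusion, mean-pooling over time, and a linear classifier."""
    def __init__(self, eeg_channels=32, emg_channels=8, feat_dim=64, num_classes=4):
        super().__init__()
        self.eeg_encoder = MultiScaleSeparableConv(eeg_channels, feat_dim)
        self.emg_encoder = MultiScaleSeparableConv(emg_channels, feat_dim)
        self.fusion = CrossAttentionFusion(feat_dim)
        self.classifier = nn.Linear(2 * feat_dim, num_classes)

    def forward(self, eeg, emg):                # eeg: (B, C_eeg, T), emg: (B, C_emg, T)
        eeg_feat = self.eeg_encoder(eeg).transpose(1, 2)   # (B, T, feat_dim)
        emg_feat = self.emg_encoder(emg).transpose(1, 2)
        fused = self.fusion(eeg_feat, emg_feat)             # (B, T, 2*feat_dim)
        return self.classifier(fused.mean(dim=1))           # pool over time, classify


# Example: a batch of two one-second windows sampled at an assumed 256 Hz
model = E2FNetSketch()
logits = model(torch.randn(2, 32, 256), torch.randn(2, 8, 256))
print(logits.shape)  # torch.Size([2, 4])
```

The key design point the abstract emphasizes is that cross-attention lets each modality query the other, so the fused representation carries cross-modal interactions rather than a simple concatenation of independently extracted EEG and EMG features.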
Source journal: IEEE Sensors Journal (Engineering & Technology – Engineering, Electrical & Electronic)
CiteScore: 7.70
Self-citation rate: 14.00%
Articles per year: 2058
Review time: 5.2 months
Journal description: The fields of interest of the IEEE Sensors Journal are the theory, design, fabrication, manufacturing, and applications of devices for sensing and transducing physical, chemical, and biological phenomena, with emphasis on the electronics and physics aspects of sensors and integrated sensor-actuators. IEEE Sensors Journal deals with the following:
- Sensor Phenomenology, Modelling, and Evaluation
- Sensor Materials, Processing, and Fabrication
- Chemical and Gas Sensors
- Microfluidics and Biosensors
- Optical Sensors
- Physical Sensors: Temperature, Mechanical, Magnetic, and others
- Acoustic and Ultrasonic Sensors
- Sensor Packaging
- Sensor Networks
- Sensor Applications
- Sensor Systems: Signals, Processing, and Interfaces
- Actuators and Sensor Power Systems
- Sensor Signal Processing for high precision and stability (amplification, filtering, linearization, modulation/demodulation) and under harsh conditions (EMC, radiation, humidity, temperature); energy consumption/harvesting
- Sensor Data Processing (soft computing with sensor data, e.g., pattern recognition, machine learning, evolutionary computation; sensor data fusion; processing of wave, e.g., electromagnetic and acoustic, and non-wave, e.g., chemical, gravity, particle, thermal, radiative and non-radiative, sensor data; detection, estimation, and classification based on sensor data)
- Sensors in Industrial Practice