Reconstruction of Continuous Hand Grasp Movement from EEG Using Deep Learning
Yuting Tang, Neethu Robinson, Xi Fu, Kavitha P Thomas, Aung Aung Phyo Wai, Cuntai Guan
Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 2024, pp. 1-4
DOI: 10.1109/EMBC53108.2024.10781850
Abstract
Brain-Computer Interface (BCI) is a promising neurotechnology offering non-muscular control of external devices such as neuroprostheses and robotic exoskeletons. A new yet under-explored BCI control paradigm is Motion Trajectory Prediction (MTP). While MTP provides continuous control signals suitable for high-precision tasks, its feasibility and applications are challenged by the low signal-to-noise ratio of brain signals, especially in noninvasive settings. Previous research has predominantly focused on kinematic reconstruction of the upper limbs (e.g., arm reaching) and lower limbs (e.g., gait), whereas finger movements have received much less attention despite their crucial role in daily activities. To address this gap, our study explores the potential of noninvasive electroencephalography (EEG) for reconstructing finger movements, specifically during hand grasping actions. We designed a new experimental paradigm to collect multichannel EEG data from 20 healthy subjects while they performed full, natural hand opening and closing movements. Employing state-of-the-art deep learning algorithms, we constructed continuous decoding models for eight key finger joints. The Convolutional Neural Network with Attention approach achieved an average decoding performance of r=0.63. Furthermore, we proposed a post-hoc metric for hand grasp cycle detection: 83.5% of hand grasps were successfully detected from the reconstructed motion signals, which could potentially serve as a new BCI command. An explainable AI algorithm was also applied to analyze the topographical relevance of the trained features. Our findings demonstrate the feasibility of using EEG to reconstruct hand joint movements and highlight the potential of MTP-BCI in control and rehabilitation applications.
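As a rough illustration of how reconstructed trajectories of this kind might be scored, the sketch below computes a per-joint Pearson correlation (the r reported above) and counts grasp cycles by simple peak detection. This is not the authors' code: the function names, the 100 Hz kinematic sampling rate, and the peak-detection thresholds are assumptions for illustration, and the paper's actual post-hoc cycle-detection metric is not specified in the abstract.

```python
# Minimal sketch (assumptions, not the authors' implementation):
# score reconstructed finger-joint trajectories with Pearson's r and
# count grasp cycles as prominent peaks in a reconstructed joint signal.
import numpy as np
from scipy.signal import find_peaks


def pearson_r_per_joint(true_traj, pred_traj):
    """Pearson correlation for each joint.

    true_traj, pred_traj: arrays of shape (n_samples, n_joints).
    Returns an array of length n_joints.
    """
    rs = []
    for j in range(true_traj.shape[1]):
        rs.append(np.corrcoef(true_traj[:, j], pred_traj[:, j])[0, 1])
    return np.array(rs)


def count_grasp_cycles(joint_signal, fs=100.0, min_cycle_s=1.0):
    """Count grasp cycles as prominent peaks in one joint trajectory.

    A plausible post-hoc detector only; in practice the reconstructed
    signal would likely be low-pass filtered before peak detection.
    """
    peaks, _ = find_peaks(
        joint_signal,
        distance=int(min_cycle_s * fs),          # enforce a minimum cycle length
        prominence=0.5 * np.std(joint_signal),   # ignore small fluctuations
    )
    return len(peaks)


if __name__ == "__main__":
    # Synthetic example: 8 joints, 30 s at an assumed 100 Hz, ~0.5 Hz grasp rhythm.
    rng = np.random.default_rng(0)
    t = np.arange(0, 30, 0.01)
    true_traj = np.stack([np.sin(2 * np.pi * 0.5 * t)] * 8, axis=1)
    pred_traj = true_traj + 0.2 * rng.standard_normal(true_traj.shape)
    print("mean r:", pearson_r_per_joint(true_traj, pred_traj).mean())
    print("detected cycles:", count_grasp_cycles(pred_traj[:, 0]))
```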