Motor imagery EEG decoding based on TS-former for spinal cord injury patients

Impact factor: 3.5 · JCR Q2 (Neurosciences) · CAS Tier 3 (Medicine)
Fangzhou Xu, Yitai Lou, Yunqing Deng, Zhixiao Lun, Pengcheng Zhao, Di Yan, Zhe Han, Zhirui Wu, Chao Feng, Lei Chen, Jiancai Leng
{"title":"Motor imagery EEG decoding based on TS-former for spinal cord injury patients","authors":"Fangzhou Xu ,&nbsp;Yitai Lou ,&nbsp;Yunqing Deng ,&nbsp;Zhixiao Lun ,&nbsp;Pengcheng Zhao ,&nbsp;Di Yan ,&nbsp;Zhe Han ,&nbsp;Zhirui Wu ,&nbsp;Chao Feng ,&nbsp;Lei Chen ,&nbsp;Jiancai Leng","doi":"10.1016/j.brainresbull.2025.111298","DOIUrl":null,"url":null,"abstract":"<div><div>Traditional machine learning methods struggle with efficiency when processing large-scale data, while deep learning approaches, such as convolutional neural networks (CNN) and long short-term memory networks (LSTM), exhibit certain limitations when handling long-duration sequences. The choice of convolutional kernel size needs to be determined after several experiments, and LSTM has difficulty capturing effective information from long-time sequences. In this paper, we propose a transfer learning (TL) method based on Transformer, which constructs a new network architecture for feature extraction and classification of electroencephalogram (EEG) signals in the time-space domain, named TS-former. The frequency and spatial domain information of EEG signals is extracted using the Filter Bank Common Spatial Pattern (FBCSP), and the resulting features are subsequently processed by the Transformer to capture temporal patterns. The input features are processed by the Transformer using a multi-head attention mechanism, and the final classification outputs are generated through a fully connected layer. A classification model is pre-trained using fine-tuning techniques. When performing a new classification task, only some layers of the model are modified to adapt it to the new data and achieve good classification results. The experiments are conducted on a motor imagery (MI) EEG dataset from 16 spinal cord injury (SCI) patients. After training the model using a ten-time ten-fold cross-validation method, the average classification accuracy reached 95.09 %. Our experimental results confirm a new approach to build a brain-computer interface (BCI) system for rehabilitation training of SCI patients.</div></div>","PeriodicalId":9302,"journal":{"name":"Brain Research Bulletin","volume":"224 ","pages":"Article 111298"},"PeriodicalIF":3.5000,"publicationDate":"2025-03-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Brain Research Bulletin","FirstCategoryId":"3","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0361923025001108","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"NEUROSCIENCES","Score":null,"Total":0}
Citations: 0

Abstract

Traditional machine learning methods struggle with efficiency when processing large-scale data, while deep learning approaches such as convolutional neural networks (CNN) and long short-term memory networks (LSTM) exhibit limitations when handling long sequences: the convolutional kernel size must be determined through repeated experiments, and LSTM has difficulty capturing useful information from long temporal sequences. In this paper, we propose a Transformer-based transfer learning (TL) method, named TS-former, which constructs a new network architecture for feature extraction and classification of electroencephalogram (EEG) signals in the time-space domain. Frequency- and spatial-domain information is extracted from the EEG signals using the Filter Bank Common Spatial Pattern (FBCSP), and the resulting features are processed by a Transformer with a multi-head attention mechanism to capture temporal patterns; the final classification outputs are generated through a fully connected layer. A classification model is first pre-trained and then adapted via fine-tuning: for a new classification task, only some layers of the model are modified to fit the new data while still achieving good classification results. Experiments are conducted on a motor imagery (MI) EEG dataset from 16 spinal cord injury (SCI) patients. With ten-time ten-fold cross-validation, the average classification accuracy reaches 95.09 %. These results demonstrate a new approach to building a brain-computer interface (BCI) system for rehabilitation training of SCI patients.
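As a rough illustration of the pipeline described above, the sketch below (PyTorch; all names, dimensions, and hyperparameters are illustrative assumptions, not taken from the paper) feeds pre-computed FBCSP features to a Transformer encoder with multi-head attention and classifies the pooled output with a fully connected layer. It also shows one plausible way to fine-tune only part of the model for a new task, as the abstract describes.

# Minimal sketch, assuming FBCSP features are pre-computed per time window.
# Dimensions, layer counts, and the freezing strategy are illustrative and
# not the authors' exact TS-former configuration.
import torch
import torch.nn as nn


class TSFormerSketch(nn.Module):
    def __init__(self, feat_dim=32, n_heads=4, n_layers=2, n_classes=2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(
            d_model=feat_dim, nhead=n_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.classifier = nn.Linear(feat_dim, n_classes)  # final fully connected layer

    def forward(self, x):
        # x: (batch, time_windows, feat_dim) -- FBCSP features over time
        h = self.encoder(x)        # multi-head self-attention over temporal windows
        h = h.mean(dim=1)          # pool over the temporal dimension
        return self.classifier(h)  # class logits


model = TSFormerSketch()
logits = model(torch.randn(8, 20, 32))  # e.g. 8 trials, 20 windows, 32 FBCSP features

# Transfer to a new task: freeze the encoder and re-train only the classifier
# head (one possible reading of "only some layers of the model are modified").
for p in model.encoder.parameters():
    p.requires_grad = False

The pooled-mean-plus-linear-head classifier is only one design choice; the key point is that the attention layers operate over the sequence of FBCSP feature windows, which is what lets the model capture long-range temporal structure that LSTMs handle poorly.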
Source journal: Brain Research Bulletin (Medicine – Neuroscience)
CiteScore: 6.90
Self-citation rate: 2.60%
Articles per year: 253
Review time: 67 days
Journal description: The Brain Research Bulletin (BRB) aims to publish novel work that advances our knowledge of the molecular and cellular mechanisms that underlie neural network properties associated with behavior, cognition and other brain functions during neurodevelopment and in the adult. Although clinical research is outside the Journal's scope, the BRB also aims to publish translational research that provides insight into the biological mechanisms and processes associated with neurodegeneration, neurological diseases and neuropsychiatric disorders. The Journal is especially interested in research using novel methodologies, such as optogenetics, multielectrode array recordings and live imaging in wild-type and genetically modified animal models, with the goal of advancing our understanding of how neurons, glia and networks function in vivo.