SMANet: A Model Combining SincNet, Multi-Branch Spatial—Temporal CNN, and Attention Mechanism for Motor Imagery BCI

IF 4.8 · CAS Region 2 (Medicine) · JCR Q2 (Engineering, Biomedical)
Danjie Wang;Qingguo Wei
DOI: 10.1109/TNSRE.2025.3560993
Journal: IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 33, pp. 1497-1508
Published: 2025-04-15 (Journal Article)
Full text: https://ieeexplore.ieee.org/document/10965876/
PDF: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10965876
Citations: 0

Abstract

Building a brain-computer interface (BCI) based on motor imagery (MI) requires accurate decoding of MI tasks, which is challenging due to individual differences among subjects and the low signal-to-noise ratio of EEG signals. We propose an end-to-end deep learning model, the Sinc-multibranch-attention network (SMANet), which combines a SincNet, a multibranch spatial-temporal convolutional neural network (MBSTCNN), and an attention mechanism for MI-BCI classification. First, Sinc convolution serves as a band-pass filter bank for data filtering. Second, pointwise convolution integrates feature information across different frequency ranges, enhancing the overall feature representation. Next, the resulting data are fed into the MBSTCNN to learn a deep feature representation. The outputs of the MBSTCNN are then concatenated and passed through an efficient channel attention (ECA) module, which enhances local channel feature extraction and calibrates the feature maps. Finally, the feature maps produced by the ECA module are classified by a fully connected layer. SMANet strengthens discriminative features through a multi-objective optimization scheme that combines cross-entropy loss with center loss. Experiments show that the model attains an average accuracy of 80.21% on the four-class MI dataset (BCI Competition IV 2a), 84.02% on the two-class MI dataset (BCI Competition IV 2b), and 72.70% on the two-class MI dataset (OpenBMI), surpassing current state-of-the-art methods. SMANet can thus effectively decode the spatial-spectral-temporal information of EEG signals, improving the performance of MI-BCIs.
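The key idea behind the SincNet front end is that each band-pass filter is parameterized only by its two cutoff frequencies; the kernel itself is a fixed windowed-sinc shape (the difference of two low-pass sinc filters). A minimal NumPy sketch of that kernel, not the authors' implementation — the filter length, Hamming window, sampling rate, and cutoffs below are illustrative assumptions:

```python
import numpy as np

def sinc_bandpass_kernel(f1, f2, fs, length=65):
    """Windowed-sinc band-pass FIR kernel with cutoffs f1 < f2 (Hz).

    A Sinc convolution layer learns only f1 and f2; the kernel shape
    is this fixed difference of two normalized low-pass sinc filters.
    """
    t = np.arange(length) - (length - 1) / 2  # centered sample indices
    h = (2 * f2 / fs) * np.sinc(2 * f2 * t / fs) \
      - (2 * f1 / fs) * np.sinc(2 * f1 * t / fs)
    return h * np.hamming(length)  # window to suppress sidelobes

# Toy EEG-like signal: 10 Hz (mu-band) + 40 Hz component at fs = 250 Hz
fs = 250
n = np.arange(2 * fs)
x = np.sin(2 * np.pi * 10 * n / fs) + np.sin(2 * np.pi * 40 * n / fs)

h = sinc_bandpass_kernel(8, 13, fs)   # pass the 8-13 Hz mu band
y = np.convolve(x, h, mode="same")

# The 10 Hz component survives; the 40 Hz component is attenuated.
spec = np.abs(np.fft.rfft(y))
freqs = np.fft.rfftfreq(len(y), 1 / fs)
ratio = spec[np.argmin(abs(freqs - 10))] / spec[np.argmin(abs(freqs - 40))]
print(f"10 Hz / 40 Hz magnitude ratio after filtering: {ratio:.1f}")
```

In the actual model a bank of such kernels (one pair of learnable cutoffs per filter) is applied via 1-D convolution along the time axis, and the pointwise convolution that follows mixes the resulting frequency-band channels.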
Source journal metrics:
- CiteScore: 8.60
- Self-citation rate: 8.20%
- Articles per year: 479
- Review time: 6-12 weeks
Journal scope: Rehabilitative and neural aspects of biomedical engineering, including functional electrical stimulation, acoustic dynamics, human performance measurement and analysis, nerve stimulation, electromyography, motor control and stimulation; and hardware and software applications for rehabilitation engineering and assistive devices.