[Cross-session motor imagery-electroencephalography decoding with Riemannian spatial filtering and domain adaptation].

Q4 Medicine
Lincong Pan, Xinwei Sun, Kun Wang, Yupei Cao, Minpeng Xu, Dong Ming
{"title":"[基于黎曼空间滤波和域自适应的跨会话运动图像-脑电图解码]。","authors":"Lincong Pan, Xinwei Sun, Kun Wang, Yupei Cao, Minpeng Xu, Dong Ming","doi":"10.7507/1001-5515.202411035","DOIUrl":null,"url":null,"abstract":"<p><p>Motor imagery (MI) is a mental process that can be recognized by electroencephalography (EEG) without actual movement. It has significant research value and application potential in the field of brain-computer interface (BCI) technology. To address the challenges posed by the non-stationary nature and low signal-to-noise ratio of MI-EEG signals, this study proposed a Riemannian spatial filtering and domain adaptation (RSFDA) method for improving the accuracy and efficiency of cross-session MI-BCI classification tasks. The approach addressed the issue of inconsistent data distribution between source and target domains through a multi-module collaborative framework, which enhanced the generalization capability of cross-session MI-EEG classification models. Comparative experiments were conducted on three public datasets to evaluate RSFDA against eight existing methods in terms of classification accuracy and computational efficiency. The experimental results demonstrated that RSFDA achieved an average classification accuracy of 79.37%, outperforming the state-of-the-art deep learning method Tensor-CSPNet (76.46%) by 2.91% ( <i>P</i> < 0.01). Furthermore, the proposed method showed significantly lower computational costs, requiring only approximately 3 minutes of average training time compared to Tensor-CSPNet's 25 minutes, representing a reduction of 22 minutes. These findings indicate that the RSFDA method demonstrates superior performance in cross-session MI-EEG classification tasks by effectively balancing accuracy and efficiency. However, its applicability in complex transfer learning scenarios remains to be further investigated.</p>","PeriodicalId":39324,"journal":{"name":"生物医学工程学杂志","volume":"42 2","pages":"272-279"},"PeriodicalIF":0.0000,"publicationDate":"2025-04-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12035623/pdf/","citationCount":"0","resultStr":"{\"title\":\"[Cross-session motor imagery-electroencephalography decoding with Riemannian spatial filtering and domain adaptation].\",\"authors\":\"Lincong Pan, Xinwei Sun, Kun Wang, Yupei Cao, Minpeng Xu, Dong Ming\",\"doi\":\"10.7507/1001-5515.202411035\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Motor imagery (MI) is a mental process that can be recognized by electroencephalography (EEG) without actual movement. It has significant research value and application potential in the field of brain-computer interface (BCI) technology. To address the challenges posed by the non-stationary nature and low signal-to-noise ratio of MI-EEG signals, this study proposed a Riemannian spatial filtering and domain adaptation (RSFDA) method for improving the accuracy and efficiency of cross-session MI-BCI classification tasks. The approach addressed the issue of inconsistent data distribution between source and target domains through a multi-module collaborative framework, which enhanced the generalization capability of cross-session MI-EEG classification models. Comparative experiments were conducted on three public datasets to evaluate RSFDA against eight existing methods in terms of classification accuracy and computational efficiency. 
The experimental results demonstrated that RSFDA achieved an average classification accuracy of 79.37%, outperforming the state-of-the-art deep learning method Tensor-CSPNet (76.46%) by 2.91% ( <i>P</i> < 0.01). Furthermore, the proposed method showed significantly lower computational costs, requiring only approximately 3 minutes of average training time compared to Tensor-CSPNet's 25 minutes, representing a reduction of 22 minutes. These findings indicate that the RSFDA method demonstrates superior performance in cross-session MI-EEG classification tasks by effectively balancing accuracy and efficiency. However, its applicability in complex transfer learning scenarios remains to be further investigated.</p>\",\"PeriodicalId\":39324,\"journal\":{\"name\":\"生物医学工程学杂志\",\"volume\":\"42 2\",\"pages\":\"272-279\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2025-04-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12035623/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"生物医学工程学杂志\",\"FirstCategoryId\":\"1087\",\"ListUrlMain\":\"https://doi.org/10.7507/1001-5515.202411035\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"Medicine\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"生物医学工程学杂志","FirstCategoryId":"1087","ListUrlMain":"https://doi.org/10.7507/1001-5515.202411035","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"Medicine","Score":null,"Total":0}
Citations: 0

Abstract

Motor imagery (MI) is a mental process that can be recognized by electroencephalography (EEG) without actual movement. It has significant research value and application potential in the field of brain-computer interface (BCI) technology. To address the challenges posed by the non-stationary nature and low signal-to-noise ratio of MI-EEG signals, this study proposed a Riemannian spatial filtering and domain adaptation (RSFDA) method for improving the accuracy and efficiency of cross-session MI-BCI classification tasks. The approach addressed the issue of inconsistent data distribution between source and target domains through a multi-module collaborative framework, which enhanced the generalization capability of cross-session MI-EEG classification models. Comparative experiments were conducted on three public datasets to evaluate RSFDA against eight existing methods in terms of classification accuracy and computational efficiency. The experimental results demonstrated that RSFDA achieved an average classification accuracy of 79.37%, outperforming the state-of-the-art deep learning method Tensor-CSPNet (76.46%) by 2.91 percentage points (P < 0.01). Furthermore, the proposed method showed significantly lower computational costs, requiring only approximately 3 minutes of average training time compared to Tensor-CSPNet's 25 minutes, representing a reduction of 22 minutes. These findings indicate that the RSFDA method demonstrates superior performance in cross-session MI-EEG classification tasks by effectively balancing accuracy and efficiency. However, its applicability in complex transfer learning scenarios remains to be further investigated.
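The abstract describes RSFDA only at a high level, so the authors' pipeline cannot be reconstructed from it. As a rough illustration of the general recipe it alludes to, the hypothetical Python sketch below estimates a spatial covariance matrix per trial, re-centers each session's covariances to reduce the source/target distribution shift, and classifies in the Riemannian tangent space. The array shapes, the Euclidean re-centering, the regularized covariance estimator, and the logistic-regression classifier are all illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch of a Riemannian cross-session MI-EEG pipeline.
# NOT the authors' RSFDA method; shapes, estimators and classifier are assumed.
import numpy as np
from scipy.linalg import fractional_matrix_power, logm
from sklearn.linear_model import LogisticRegression


def trial_covariances(epochs, reg=1e-6):
    """Regularized sample covariance for each trial.

    epochs: array of shape (n_trials, n_channels, n_samples).
    """
    n_trials, n_channels, _ = epochs.shape
    covs = np.empty((n_trials, n_channels, n_channels))
    for i, x in enumerate(epochs):
        covs[i] = x @ x.T / x.shape[1] + reg * np.eye(n_channels)  # keep matrices positive definite
    return covs


def recenter_session(covs):
    """Whiten a session by its Euclidean mean covariance: C -> M^(-1/2) C M^(-1/2).

    Moving each session's centre of mass to the identity is a common alignment
    trick for cross-session transfer; it stands in here for the
    domain-adaptation step described in the abstract.
    """
    inv_sqrt = fractional_matrix_power(covs.mean(axis=0), -0.5)
    return np.stack([inv_sqrt @ c @ inv_sqrt for c in covs])


def tangent_space(covs):
    """Map SPD matrices to tangent-space feature vectors at the identity (matrix log)."""
    n_channels = covs.shape[1]
    iu = np.triu_indices(n_channels)
    feats = []
    for c in covs:
        s = logm(c).real                                    # symmetric matrix logarithm
        s[np.triu_indices(n_channels, k=1)] *= np.sqrt(2)   # weight off-diagonal entries
        feats.append(s[iu])
    return np.array(feats)


# Toy usage: random data standing in for two recording sessions of a 2-class MI task.
rng = np.random.default_rng(0)
X_src = rng.standard_normal((40, 8, 250))   # source session: 40 trials, 8 channels
y_src = rng.integers(0, 2, size=40)
X_tgt = rng.standard_normal((40, 8, 250))   # later (target) session

C_src = recenter_session(trial_covariances(X_src))
C_tgt = recenter_session(trial_covariances(X_tgt))  # target aligned with its own mean

clf = LogisticRegression(max_iter=1000)
clf.fit(tangent_space(C_src), y_src)
y_pred = clf.predict(tangent_space(C_tgt))  # cross-session predictions
```

In practice, toolboxes such as pyriemann provide equivalent building blocks (covariance estimation, tangent-space mapping), and the re-centering could use a Riemannian rather than Euclidean mean; the abstract does not specify which choices RSFDA actually makes.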

Source journal: 生物医学工程学杂志 (Journal of Biomedical Engineering), Medicine (all)
CiteScore: 0.80
Self-citation rate: 0.00%
Articles published: 4868