Motor imagery decoding using source optimized transfer learning based on multi-loss fusion CNN

Impact factor: 3.1 · JCR Q2 (Neurosciences) · CAS Zone 3 (Engineering & Technology)
Jun Ma, Banghua Yang, Fenqi Rong, Shouwei Gao, Wen Wang
DOI: 10.1007/s11571-024-10100-5
Journal: Cognitive Neurodynamics
Published: 2024-04-10
Citations: 0

Abstract

Transfer learning is increasingly used to decode multi-class motor imagery (MI) tasks. Previous transfer learning approaches ignored the optimizability of the source model, which weakened adaptability to the target domain and limited performance. This paper first proposes the multi-loss fusion convolutional neural network (MF-CNN) to build an optimizable source model. We then propose a novel source optimized transfer learning (SOTL) method, which optimizes the source model so that it better matches the target domain's features, thereby improving the target model's performance. We transfer a model trained on 16 healthy subjects to 16 stroke patients. The average classification accuracy reaches 51.2 ± 0.17% on four types of unilateral upper-limb motor imagery tasks, significantly higher than the accuracy of deep learning alone (p < 0.001) and of standard transfer learning (p < 0.05). These results show that an MI model built from healthy subjects' data can classify stroke patients' data with good results, providing empirical support for the study of transfer learning and for modeling stroke rehabilitation training.
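The multi-loss idea described above can be illustrated with a minimal sketch: the training objective fuses a classification loss with an auxiliary feature-alignment term that pulls the source model toward the target domain. The function names, the alignment term, and the weighting coefficient `alpha` are illustrative assumptions for exposition, not the paper's exact MF-CNN/SOTL formulation.

```python
import math

def cross_entropy(probs, label):
    """Classification loss for one sample: -log p(true class)."""
    return -math.log(probs[label])

def feature_alignment(src_feat, tgt_feat):
    """Auxiliary loss: squared distance between mean source and target features.
    A stand-in for whatever domain-discrepancy measure the source model uses."""
    mu_s = sum(src_feat) / len(src_feat)
    mu_t = sum(tgt_feat) / len(tgt_feat)
    return (mu_s - mu_t) ** 2

def fused_loss(probs, label, src_feat, tgt_feat, alpha=0.5):
    """Multi-loss fusion: L = L_cls + alpha * L_align.
    Minimizing L_align alongside L_cls keeps the learned features
    transferable to the target domain while fitting the source labels."""
    return cross_entropy(probs, label) + alpha * feature_alignment(src_feat, tgt_feat)

# Example: a confident correct 4-class prediction with mildly mismatched features.
loss = fused_loss([0.7, 0.1, 0.1, 0.1], 0, [1.0, 2.0], [1.5, 2.5])
```

In a real network both terms would be backpropagated jointly, so the feature extractor is optimized for classification and domain alignment at once; `alpha` trades off the two objectives.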


Journal

Cognitive Neurodynamics (Medicine / Neuroscience)
CiteScore: 6.90
Self-citation rate: 18.90%
Articles per year: 140
Review time: 12 months
Journal description: Cognitive Neurodynamics provides a unique forum of communication and cooperation for scientists and engineers working in the field of cognitive neurodynamics, intelligent science and applications, bridging the gap between theory and application, without any preference for purely theoretical, experimental or computational models. The emphasis is on publishing original models of cognitive neurodynamics, novel computational theories and experimental results. In particular, intelligent science inspired by cognitive neuroscience and neurodynamics is also very welcome. The scope of Cognitive Neurodynamics covers cognitive neuroscience, neural computation based on dynamics, computer science, intelligent science, as well as their interdisciplinary applications in the natural and engineering sciences. Papers that are appropriate for non-specialist readers are encouraged.

1. There is no page limit for manuscripts submitted to Cognitive Neurodynamics. Research papers should clearly represent an important advance of especially broad interest to researchers and technologists in neuroscience, biophysics, BCI, neural computing and intelligent robotics.
2. Cognitive Neurodynamics also welcomes brief communications: short papers reporting results that are of genuinely broad interest but that for one reason or another do not make a sufficiently complete story to justify a full article. Brief communications should consist of approximately four manuscript pages.
3. Cognitive Neurodynamics publishes review articles in which a specific field is reviewed through an exhaustive literature survey. There are no restrictions on the number of pages. Review articles are usually invited, but submitted reviews will also be considered.