Temporal attention fusion network with custom loss function for EEG-fNIRS classification.

Chayut Bunterngchit, Jiaxing Wang, Jianqiang Su, Yihan Wang, Shiqi Liu, Zeng-Guang Hou
{"title":"采用自定义损失函数的时态注意力融合网络,用于脑电图-近红外成像分类。","authors":"Chayut Bunterngchit, Jiaxing Wang, Jianqiang Su, Yihan Wang, Shiqi Liu, Zeng-Guang Hou","doi":"10.1088/1741-2552/ad8e86","DOIUrl":null,"url":null,"abstract":"<p><p><i>Objective.</i>Methods that can detect brain activities accurately are crucial owing to the increasing prevalence of neurological disorders. In this context, a combination of electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) offers a powerful approach to understanding normal and pathological brain functions, thereby overcoming the limitations of each modality, such as susceptibility to artifacts of EEG and limited temporal resolution of fNIRS. However, challenges such as class imbalance and inter-class variability within multisubject data hinder their full potential.<i>Approach.</i>To address this issue, we propose a novel temporal attention fusion network (TAFN) with a custom loss function. The TAFN model incorporates attention mechanisms to its long short-term memory and temporal convolutional layers to accurately capture spatial and temporal dependencies in the EEG-fNIRS data. The custom loss function combines class weights and asymmetric loss terms to ensure the precise classification of cognitive and motor intentions, along with addressing class imbalance issues.<i>Main results.</i>Rigorous testing demonstrated the exceptional cross-subject accuracy of the TAFN, exceeding 99% for cognitive tasks and 97% for motor imagery (MI) tasks. Additionally, the ability of the model to detect subtle differences in epilepsy was analyzed using scalp topography in MI tasks.<i>Significance.</i>This study presents a technique that outperforms traditional methods for detecting high-precision brain activity with subtle differences in the associated patterns. This makes it a promising tool for applications such as epilepsy and seizure detection, in which discerning subtle pattern differences is of paramount importance.</p>","PeriodicalId":94096,"journal":{"name":"Journal of neural engineering","volume":" ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-11-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Temporal attention fusion network with custom loss function for EEG-fNIRS classification.\",\"authors\":\"Chayut Bunterngchit, Jiaxing Wang, Jianqiang Su, Yihan Wang, Shiqi Liu, Zeng-Guang Hou\",\"doi\":\"10.1088/1741-2552/ad8e86\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p><i>Objective.</i>Methods that can detect brain activities accurately are crucial owing to the increasing prevalence of neurological disorders. In this context, a combination of electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) offers a powerful approach to understanding normal and pathological brain functions, thereby overcoming the limitations of each modality, such as susceptibility to artifacts of EEG and limited temporal resolution of fNIRS. However, challenges such as class imbalance and inter-class variability within multisubject data hinder their full potential.<i>Approach.</i>To address this issue, we propose a novel temporal attention fusion network (TAFN) with a custom loss function. The TAFN model incorporates attention mechanisms to its long short-term memory and temporal convolutional layers to accurately capture spatial and temporal dependencies in the EEG-fNIRS data. 
The custom loss function combines class weights and asymmetric loss terms to ensure the precise classification of cognitive and motor intentions, along with addressing class imbalance issues.<i>Main results.</i>Rigorous testing demonstrated the exceptional cross-subject accuracy of the TAFN, exceeding 99% for cognitive tasks and 97% for motor imagery (MI) tasks. Additionally, the ability of the model to detect subtle differences in epilepsy was analyzed using scalp topography in MI tasks.<i>Significance.</i>This study presents a technique that outperforms traditional methods for detecting high-precision brain activity with subtle differences in the associated patterns. This makes it a promising tool for applications such as epilepsy and seizure detection, in which discerning subtle pattern differences is of paramount importance.</p>\",\"PeriodicalId\":94096,\"journal\":{\"name\":\"Journal of neural engineering\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-11-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of neural engineering\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1088/1741-2552/ad8e86\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of neural engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1088/1741-2552/ad8e86","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract


Objective. Methods that can detect brain activity accurately are crucial owing to the increasing prevalence of neurological disorders. In this context, combining electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) offers a powerful approach to understanding normal and pathological brain function, overcoming the limitations of each modality, such as the susceptibility of EEG to artifacts and the limited temporal resolution of fNIRS. However, challenges such as class imbalance and inter-class variability within multi-subject data hinder their full potential.

Approach. To address these issues, we propose a novel temporal attention fusion network (TAFN) with a custom loss function. The TAFN incorporates attention mechanisms into its long short-term memory and temporal convolutional layers to accurately capture spatial and temporal dependencies in the EEG-fNIRS data. The custom loss function combines class weights with asymmetric loss terms to ensure precise classification of cognitive and motor intentions while addressing class imbalance.

Main results. Rigorous testing demonstrated exceptional cross-subject accuracy for the TAFN, exceeding 99% on cognitive tasks and 97% on motor imagery (MI) tasks. Additionally, the ability of the model to detect subtle differences in epilepsy was analyzed using scalp topography in MI tasks.

Significance. This study presents a technique that outperforms traditional methods at detecting brain activity with high precision when the associated patterns differ only subtly. This makes it a promising tool for applications such as epilepsy and seizure detection, in which discerning subtle pattern differences is of paramount importance.
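The abstract does not specify the architecture in detail, but the components it names (attention applied to LSTM and temporal convolutional layers, fused for classification) can be illustrated with a minimal PyTorch sketch. Everything below, including layer sizes, channel counts, the attention pooling, and the concatenation-based fusion, is a hypothetical illustration and not the authors' published TAFN implementation.

```python
# Hypothetical sketch of an attention-augmented LSTM + temporal-convolution fusion model.
# Layer sizes, channel counts, and the fusion scheme are illustrative assumptions only.
import torch
import torch.nn as nn

class AttentionPool(nn.Module):
    """Collapse a (batch, time, features) sequence with learned attention weights."""
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(dim, 1)

    def forward(self, x):                            # x: (B, T, D)
        w = torch.softmax(self.score(x), dim=1)      # (B, T, 1) weights over time
        return (w * x).sum(dim=1)                    # (B, D) weighted temporal summary

class FusionSketch(nn.Module):
    def __init__(self, eeg_ch=30, fnirs_ch=36, hidden=64, n_classes=2):
        super().__init__()
        in_ch = eeg_ch + fnirs_ch                    # assume channel-wise concatenated EEG-fNIRS input
        self.lstm = nn.LSTM(in_ch, hidden, batch_first=True)
        self.tcn = nn.Sequential(                    # simple temporal-convolution stand-in
            nn.Conv1d(in_ch, hidden, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        self.pool_lstm = AttentionPool(hidden)
        self.pool_tcn = AttentionPool(hidden)
        self.classifier = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                            # x: (B, T, channels)
        h_lstm, _ = self.lstm(x)                     # (B, T, hidden)
        h_tcn = self.tcn(x.transpose(1, 2)).transpose(1, 2)  # (B, T, hidden)
        fused = torch.cat([self.pool_lstm(h_lstm), self.pool_tcn(h_tcn)], dim=1)
        return self.classifier(fused)                # class logits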
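```

Similarly, the custom loss is described only as combining class weights with asymmetric loss terms. One plausible reading, sketched below purely for illustration, is a per-class-weighted asymmetric focal-style loss in which easy negatives are down-weighted more aggressively than positives (gamma_neg > gamma_pos); the function name, hyperparameter values, and exact formulation are assumptions rather than the paper's definition.

```python
# Hypothetical sketch of a class-weighted asymmetric loss in PyTorch.
# This is one plausible reading of "class weights + asymmetric loss terms",
# not the loss published in the paper.
import torch
import torch.nn.functional as F

def weighted_asymmetric_loss(logits, targets, class_weights,
                             gamma_pos=1.0, gamma_neg=4.0, eps=1e-8):
    """logits: (B, C); targets: (B,) integer labels; class_weights: (C,) tensor."""
    probs = torch.softmax(logits, dim=1)                         # (B, C)
    one_hot = F.one_hot(targets, num_classes=logits.size(1)).float()

    # Positive term: focus harder on poorly predicted true classes.
    pos = one_hot * (1.0 - probs).pow(gamma_pos) * torch.log(probs.clamp(min=eps))
    # Negative term: down-weight easy negatives more aggressively (gamma_neg > gamma_pos).
    neg = (1.0 - one_hot) * probs.pow(gamma_neg) * torch.log((1.0 - probs).clamp(min=eps))

    loss = -(pos + neg) * class_weights.unsqueeze(0)             # per-class weighting
    return loss.sum(dim=1).mean()
```

In practice, class_weights could be set to inverse class frequencies estimated from the training split, which would counter the class imbalance the abstract highlights.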
