Integrating spatial and temporal features for enhanced artifact removal in multi-channel EEG recordings.

IF 3.7 · JCR Q2 (Engineering, Biomedical) · JCR Region 3 (Medicine)
Jin Yin, Aiping Liu, LanLan Wang, Ruobing Qian, Xun Chen
{"title":"Integrating spatial and temporal features for enhanced artifact removal in multi-channel EEG recordings.","authors":"Jin Yin,Aiping Liu,LanLan Wang,Ruobing Qian,Xun Chen","doi":"10.1088/1741-2552/ad788d","DOIUrl":null,"url":null,"abstract":"OBJECTIVE\r\nVarious artifacts in electroencephalography (EEG) are a big hurdle to prevent brain-computer interfaces from real-life usage. Recently, deep learning-based EEG denoising methods have shown excellent performance. However, existing deep network designs inadequately leverage inter-channel relationships in processing multichannel EEG signals. Typically, most methods process multi-channel signals in a channel-by-channel way. Considering the correlations among EEG channels during the same brain activity, this paper proposes utilizing channel relationships to enhance denoising performance.\r\n\r\nAPPROACH\r\nWe explicitly model the inter-channel relationships using the self attention mechanism, hypothesizing that these correlations can support and improve the denoising process. Specifically, we introduce a novel denoising network, named Spatial-Temporal Fusion Network (STFNet), which integrates stacked multi-dimension feature extractor to explicitly capture both temporal dependencies and spatial relationships.\r\n\r\nMAIN RESULTS\r\nThe proposed network exhibits superior denoising performance, with a 24.27% reduction in relative root mean squared error compared to other methods on a public benchmark. STFNet proves effective in cross-dataset denoising and downstream classification tasks, improving accuracy by 1.40%, while also offering fast processing on CPU.\r\n\r\nSIGNIFICANCE\r\nThe experimental results demonstrate the importance of integrating spatial and temporal characteristics. 
The computational efficiency of STFNet makes it suitable for real-time applications and a potential tool for deployment in realistic environments.","PeriodicalId":16753,"journal":{"name":"Journal of neural engineering","volume":"23 1","pages":""},"PeriodicalIF":3.7000,"publicationDate":"2024-09-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of neural engineering","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1088/1741-2552/ad788d","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, BIOMEDICAL","Score":null,"Total":0}
Citations: 0

Abstract

OBJECTIVE
Artifacts of various kinds in electroencephalography (EEG) are a major obstacle preventing brain-computer interfaces from real-life use. Recently, deep learning-based EEG denoising methods have shown excellent performance. However, existing deep network designs inadequately leverage inter-channel relationships when processing multi-channel EEG signals: most methods process multi-channel signals channel by channel. Given the correlations among EEG channels during the same brain activity, this paper proposes exploiting channel relationships to enhance denoising performance.

APPROACH
We explicitly model inter-channel relationships using the self-attention mechanism, hypothesizing that these correlations can support and improve the denoising process. Specifically, we introduce a novel denoising network, the Spatial-Temporal Fusion Network (STFNet), which integrates stacked multi-dimensional feature extractors to explicitly capture both temporal dependencies and spatial relationships.

MAIN RESULTS
The proposed network exhibits superior denoising performance, with a 24.27% reduction in relative root mean squared error compared with other methods on a public benchmark. STFNet also proves effective in cross-dataset denoising and downstream classification, improving accuracy by 1.40%, while offering fast processing on CPU.

SIGNIFICANCE
The experimental results demonstrate the importance of integrating spatial and temporal characteristics. The computational efficiency of STFNet makes it suitable for real-time applications and a potential tool for deployment in realistic environments.
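The core idea in the approach section, treating each EEG channel as a token so that self-attention weights express inter-channel correlation, can be sketched as follows. This is a minimal illustrative example, not the authors' STFNet: the projection matrices are random placeholders standing in for learned weights, and the function name and shapes are assumptions for demonstration only.

```python
import numpy as np

def channel_self_attention(eeg, d_k=16, seed=0):
    """Illustrative spatial self-attention over EEG channels.

    eeg: array of shape (n_channels, n_samples). Each channel is one
    token, so the (n_channels, n_channels) attention map expresses
    inter-channel affinity. In a trained denoising network the Q/K/V
    projections would be learned; here they are random placeholders.
    """
    rng = np.random.default_rng(seed)
    n_ch, n_samp = eeg.shape
    # Random query/key/value projections (placeholders for learned weights).
    w_q = rng.standard_normal((n_samp, d_k)) / np.sqrt(n_samp)
    w_k = rng.standard_normal((n_samp, d_k)) / np.sqrt(n_samp)
    w_v = rng.standard_normal((n_samp, n_samp)) / np.sqrt(n_samp)

    q, k, v = eeg @ w_q, eeg @ w_k, eeg @ w_v
    scores = q @ k.T / np.sqrt(d_k)                # (n_ch, n_ch) channel affinities
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)  # softmax over channels
    # Each output channel is a correlation-weighted mix of all channels.
    return weights @ v, weights

# Toy usage: 4 channels, 128 samples of synthetic EEG.
x = np.random.default_rng(1).standard_normal((4, 128))
out, attn = channel_self_attention(x)
```

Because every output channel is reconstructed from a weighted combination of all channels, an artifact confined to one channel can in principle be suppressed using information from its correlated neighbors, which is the intuition the paper's hypothesis rests on.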
Source journal: Journal of Neural Engineering (Engineering, Biomedical)
CiteScore: 7.80
Self-citation rate: 12.50%
Articles per year: 319
Review time: 4.2 months
Journal description: The goal of Journal of Neural Engineering (JNE) is to act as a forum for the interdisciplinary field of neural engineering where neuroscientists, neurobiologists and engineers can publish their work in one periodical that bridges the gap between neuroscience and engineering. The journal publishes articles in the field of neural engineering at the molecular, cellular and systems levels. The scope of the journal encompasses experimental, computational, theoretical, clinical and applied aspects of: Innovative neurotechnology; Brain-machine (computer) interface; Neural interfacing; Bioelectronic medicines; Neuromodulation; Neural prostheses; Neural control; Neuro-rehabilitation; Neurorobotics; Optical neural engineering; Neural circuits: artificial & biological; Neuromorphic engineering; Neural tissue regeneration; Neural signal processing; Theoretical and computational neuroscience; Systems neuroscience; Translational neuroscience; Neuroimaging.