BGTransform: a neurophysiologically informed EEG data augmentation framework.

Impact Factor: 3.8
Jin Yue, Xiaolin Xiao, Hao Zhang, Minpeng Xu, Dong Ming
Journal of neural engineering · DOI: 10.1088/1741-2552/ae0c3a · Published 2025-09-26 · Journal Article
Citations: 0

Abstract


Objective: Deep learning has emerged as a powerful approach for decoding electroencephalography (EEG)-based brain-computer interface (BCI) signals. However, its effectiveness is often limited by the scarcity and variability of available training data. Existing data augmentation methods often introduce signal distortions or lack physiological validity. This study proposes a novel augmentation strategy designed to improve generalization while preserving the underlying neurophysiological structure of EEG signals.

Approach: We propose Background EEG Transform (BGTransform), a principled data augmentation framework that leverages the neurophysiological dissociation between task-related activity and ongoing background EEG. In contrast to existing methods, BGTransform generates new trials by selectively perturbing the background EEG component while preserving the task-related signal, thus enabling controlled variability without compromising class-discriminative features. We applied BGTransform to three publicly available EEG-BCI datasets spanning steady-state visual evoked potential (SSVEP) and P300 paradigms. The effectiveness of BGTransform is evaluated using several widely adopted neural decoding models under three training regimes: (1) without augmentation (baseline model), (2) with conventional augmentation methods, and (3) with BGTransform.

Main Results: Across all datasets and model architectures, BGTransform consistently outperformed both baseline models and conventional augmentation techniques. Compared to models trained without BGTransform, it achieved average classification accuracy improvements of 2.45%-15.52%, 4.36%-17.15%, and 7.55%-10.47% across the three datasets, respectively. In addition, BGTransform demonstrated greater robustness across subjects and tasks, maintaining stable performance under varying recording conditions.

Significance: BGTransform provides a principled and effective approach to augmenting EEG data, informed by neurophysiological insight. By preserving task-related components and introducing controlled variability, the method addresses the challenge of data sparsity in EEG-BCI training. These findings support the utility of BGTransform for improving the accuracy, robustness, and generalizability of deep learning models in neural engineering applications.
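The abstract does not specify how BGTransform separates the task-related component from the background EEG. A minimal sketch of one plausible interpretation, assuming the class-average waveform serves as the task-related estimate and per-trial residuals serve as the background: new trials are synthesized by reassigning background residuals among trials of the same class, which perturbs the background while leaving the class-discriminative template intact. The function name and decomposition are illustrative assumptions, not the authors' published algorithm.

```python
import numpy as np

def bgtransform_sketch(trials, labels, rng=None):
    """Illustrative background-EEG augmentation (NOT the paper's exact method).

    trials: array of shape (n_trials, n_channels, n_samples)
    labels: array of shape (n_trials,) with integer class ids

    Assumption: the task-related component of each class is estimated by the
    class-average waveform; each trial's residual is treated as background EEG.
    Augmented trials pair the class template with a background residual drawn
    from a different trial of the same class.
    """
    rng = np.random.default_rng(rng)
    trials = np.asarray(trials, dtype=float)
    labels = np.asarray(labels)
    augmented, aug_labels = [], []
    for c in np.unique(labels):
        idx = np.flatnonzero(labels == c)
        template = trials[idx].mean(axis=0)      # task-related estimate
        residuals = trials[idx] - template       # background estimates
        perm = rng.permutation(len(idx))         # reassign backgrounds within class
        augmented.append(template + residuals[perm])
        aug_labels.append(np.full(len(idx), c))
    return np.concatenate(augmented), np.concatenate(aug_labels)
```

Because the residuals of each class sum to zero, this scheme leaves every class-average waveform unchanged while varying individual trials, which matches the abstract's stated goal of controlled variability without compromising class-discriminative features.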
