Joint disentangled representation and domain adversarial training for EEG-based cross-session biometric recognition in single-task protocols.

Impact Factor: 3.1 · CAS Tier 3 (Engineering & Technology) · JCR Q2 (Neurosciences)
Cognitive Neurodynamics · Pub Date: 2025-12-01 · Epub Date: 2025-01-23 · DOI: 10.1007/s11571-024-10214-w
Honggang Liu, Xuanyu Jin, Dongjun Liu, Wanzeng Kong, Jiajia Tang, Yong Peng
{"title":"Joint disentangled representation and domain adversarial training for EEG-based cross-session biometric recognition in single-task protocols.","authors":"Honggang Liu, Xuanyu Jin, Dongjun Liu, Wanzeng Kong, Jiajia Tang, Yong Peng","doi":"10.1007/s11571-024-10214-w","DOIUrl":null,"url":null,"abstract":"<p><p>The increasing adoption of wearable technologies highlights the potential of electroencephalogram (EEG) signals for biometric recognition. However, the intrinsic variability in cross-session EEG data presents substantial challenges in maintaining model stability and reliability. Moreover, the diversity within single-task protocols complicates achieving consistent and generalized model performance. To address these issues, we propose the Joint Disentangled Representation with Domain Adversarial Training (JDR-DAT) framework for EEG-based cross-session biometric recognition within single-task protocols. The JDR-DAT framework disentangles identity-specific features through mutual information estimation and incorporates domain adversarial training to enhance longitudinal robustness. Extensive experiments on longitudinal EEG data from two publicly available single-task protocol datasets-RSVP-based (Rapid Serial Visual Presentation) and MI-based (Motor Imagery)-demonstrate the efficacy of the JDR-DAT framework, with the proposed method achieving average accuracies of 85.83% and 96.72%, respectively.</p>","PeriodicalId":10500,"journal":{"name":"Cognitive Neurodynamics","volume":"19 1","pages":"31"},"PeriodicalIF":3.1000,"publicationDate":"2025-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11757832/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Cognitive Neurodynamics","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1007/s11571-024-10214-w","RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2025/1/23 0:00:00","PubModel":"Epub","JCR":"Q2","JCRName":"NEUROSCIENCES","Score":null,"Total":0}
引用次数: 0

Abstract

The increasing adoption of wearable technologies highlights the potential of electroencephalogram (EEG) signals for biometric recognition. However, the intrinsic variability of cross-session EEG data presents substantial challenges to maintaining model stability and reliability. Moreover, the diversity within single-task protocols complicates achieving consistent and generalized model performance. To address these issues, we propose the Joint Disentangled Representation with Domain Adversarial Training (JDR-DAT) framework for EEG-based cross-session biometric recognition within single-task protocols. The JDR-DAT framework disentangles identity-specific features through mutual information estimation and incorporates domain adversarial training to enhance longitudinal robustness. Extensive experiments on longitudinal EEG data from two publicly available single-task protocol datasets, RSVP-based (Rapid Serial Visual Presentation) and MI-based (Motor Imagery), demonstrate the efficacy of the JDR-DAT framework, with the proposed method achieving average accuracies of 85.83% and 96.72%, respectively.
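To make the two mechanisms named in the abstract concrete, below is a minimal PyTorch sketch, not the authors' implementation: the encoder architecture, head sizes, feature dimensionality, and loss weights are all assumptions, and a simple batch decorrelation penalty stands in for the paper's learned mutual-information estimator. The gradient reversal layer is the standard device for domain adversarial training.

```python
# A minimal, hypothetical sketch of (1) splitting the latent code into
# identity and session parts and (2) domain adversarial training via a
# gradient reversal layer (GRL). Dimensions and losses are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GradientReversal(torch.autograd.Function):
    """Identity in the forward pass; negates and scales gradients in
    backward, so the encoder is trained to fool the session discriminator."""
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None

def decorrelation_penalty(a, b):
    """Stand-in for the paper's mutual-information estimator: penalize the
    batch cross-correlation between the identity and session branches."""
    a = (a - a.mean(0)) / (a.std(0) + 1e-6)
    b = (b - b.mean(0)) / (b.std(0) + 1e-6)
    corr = (a.T @ b) / a.shape[0]
    return (corr ** 2).mean()

class DisentangledEEGNet(nn.Module):
    """Hypothetical encoder that splits flattened EEG features into an
    identity code z_id and a session (domain) code z_dom."""
    def __init__(self, n_feats=310, d_latent=64, n_subjects=10, n_sessions=3):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_feats, 256), nn.ReLU(),
            nn.Linear(256, 2 * d_latent))
        self.id_head = nn.Linear(d_latent, n_subjects)    # who is this?
        self.dom_head = nn.Linear(d_latent, n_sessions)   # which session?
        self.d_latent = d_latent

    def forward(self, x, lambd=1.0):
        z = self.encoder(x)
        z_id, z_dom = z[:, :self.d_latent], z[:, self.d_latent:]
        id_logits = self.id_head(z_id)
        # GRL: the head learns to predict the session from z_id, while the
        # reversed gradient pushes the encoder to make z_id session-invariant.
        dom_logits = self.dom_head(GradientReversal.apply(z_id, lambd))
        return id_logits, dom_logits, z_id, z_dom

# One illustrative training step on random stand-in data.
model = DisentangledEEGNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(32, 310)              # batch of flattened EEG feature vectors
y_id = torch.randint(0, 10, (32,))    # subject (identity) labels
y_dom = torch.randint(0, 3, (32,))    # recording-session labels
id_logits, dom_logits, z_id, z_dom = model(x)
loss = (F.cross_entropy(id_logits, y_id)
        + F.cross_entropy(dom_logits, y_dom)
        + 0.1 * decorrelation_penalty(z_id, z_dom))
opt.zero_grad(); loss.backward(); opt.step()
```

In this setup the session discriminator and the encoder play a minimax game: the discriminator tries to recover the recording session from the identity code, while the reversed gradient trains the encoder to strip session information out of it, which is what cross-session (longitudinal) robustness requires.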

Source journal: Cognitive Neurodynamics (Medicine - Neuroscience)
CiteScore: 6.90 · Self-citation rate: 18.90% · Articles per year: 140 · Review time: 12 months
Journal description: Cognitive Neurodynamics provides a unique forum of communication and cooperation for scientists and engineers working in the field of cognitive neurodynamics, intelligent science and applications, bridging the gap between theory and application, without any preference for purely theoretical, experimental or computational models. The emphasis is on publishing original models of cognitive neurodynamics, novel computational theories and experimental results. In particular, intelligent science inspired by cognitive neuroscience and neurodynamics is also very welcome. The scope of Cognitive Neurodynamics covers cognitive neuroscience, neural computation based on dynamics, computer science, intelligent science as well as their interdisciplinary applications in the natural and engineering sciences. Papers that are appropriate for non-specialist readers are encouraged.
1. There is no page limit for manuscripts submitted to Cognitive Neurodynamics. Research papers should clearly represent an important advance of especially broad interest to researchers and technologists in neuroscience, biophysics, BCI, neural computation and intelligent robotics.
2. Cognitive Neurodynamics also welcomes brief communications: short papers reporting results that are of genuinely broad interest but that for one reason or another do not make a sufficiently complete story to justify a full article. Brief communications should consist of approximately four manuscript pages.
3. Cognitive Neurodynamics publishes review articles in which a specific field is reviewed through an exhaustive literature survey. There are no restrictions on the number of pages. Review articles are usually invited, but submitted reviews will also be considered.