{"title":"TEDNet: Cascaded CNN-transformer with dual attentions for taste EEG decoding","authors":"Xueli Wang, Guoce Lv","doi":"10.1016/j.jneumeth.2025.110594","DOIUrl":null,"url":null,"abstract":"<div><h3>Background</h3><div>Traditional taste evaluation methods suffer from subjective biases and limited sensor capabilities, while existing Electroencephalogram (EEG) approaches struggle to decode complex neural patterns evoked by sour, sweet, bitter, and salty stimuli due to noise sensitivity and inadequate multi-scale feature integration.</div></div><div><h3>New method</h3><div>To address this, we propose Taste EEG Decoding Network (TEDNet), a novel deep learning architecture integrating: 1) a Temporal Spatial Convolution Module (TSCM) capturing electrode-wise dependencies, 2) a Temporal Spatial Attention Module (TSAM) adaptively reweighting critical features, and 3) a Local Global Fusion Module (LGFM) combines the local features of taste EEG with the global ones.</div></div><div><h3>Results</h3><div>Evaluated on a well-controlled dataset containing 2400 EEG samples from 30 subjects, the accuracy of TEDNet is 98.92 %, the F1-score is 98.75 %, and the Kappa coefficient is 98.49 %.</div></div><div><h3>Comparison with existing methods</h3><div>While maintaining computational efficiency, TEDNet has surpassed the existing advanced convolution and self-attention methods.</div></div><div><h3>Conclusions</h3><div>This framework establishes a robust solution for objective taste perception decoding, advancing sensory evaluation in food science.</div></div>","PeriodicalId":16415,"journal":{"name":"Journal of Neuroscience Methods","volume":"424 ","pages":"Article 110594"},"PeriodicalIF":2.3000,"publicationDate":"2025-09-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Neuroscience Methods","FirstCategoryId":"3","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0165027025002389","RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"BIOCHEMICAL RESEARCH METHODS","Score":null,"Total":0}
Citations: 0
Abstract
Background
Traditional taste evaluation methods suffer from subjective biases and limited sensor capabilities, while existing electroencephalogram (EEG) approaches struggle to decode the complex neural patterns evoked by sour, sweet, bitter, and salty stimuli due to noise sensitivity and inadequate multi-scale feature integration.
New method
To address this, we propose the Taste EEG Decoding Network (TEDNet), a novel deep learning architecture integrating: 1) a Temporal Spatial Convolution Module (TSCM) capturing electrode-wise dependencies, 2) a Temporal Spatial Attention Module (TSAM) adaptively reweighting critical features, and 3) a Local Global Fusion Module (LGFM) combining the local features of taste EEG with the global ones.
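Since the abstract reports no implementation details, the following is a minimal PyTorch sketch of how the three named modules could be cascaded. Every hyperparameter here (electrode count, kernel sizes, filter and head counts, the four-class output for sour/sweet/bitter/salty) is an illustrative assumption, not the authors' configuration.

```python
# Illustrative sketch only: layer sizes and module internals are assumptions,
# as the paper's exact architecture is not given in this abstract.
import torch
import torch.nn as nn


class TSCM(nn.Module):
    """Temporal Spatial Convolution Module: temporal filtering per electrode,
    then a spatial convolution across all electrodes (EEGNet-style split)."""
    def __init__(self, n_electrodes=32, n_filters=16):
        super().__init__()
        self.temporal = nn.Conv2d(1, n_filters, kernel_size=(1, 25),
                                  padding=(0, 12), bias=False)   # along time
        self.spatial = nn.Conv2d(n_filters, n_filters,
                                 kernel_size=(n_electrodes, 1),
                                 bias=False)                     # across electrodes
        self.bn, self.act = nn.BatchNorm2d(n_filters), nn.ELU()
        self.pool = nn.AvgPool2d((1, 4))

    def forward(self, x):             # x: (batch, 1, electrodes, time)
        x = self.spatial(self.temporal(x))   # -> (batch, filters, 1, time)
        return self.pool(self.act(self.bn(x)))


class TSAM(nn.Module):
    """Temporal Spatial Attention Module: gating that reweights feature maps
    along the filter axis and the time axis (squeeze-and-excitation style)."""
    def __init__(self, n_filters=16):
        super().__init__()
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(n_filters, n_filters), nn.Sigmoid())
        self.temporal_gate = nn.Sequential(
            nn.Conv2d(n_filters, 1, kernel_size=1), nn.Sigmoid())

    def forward(self, x):             # x: (batch, filters, 1, time)
        c = self.channel_gate(x).view(x.size(0), -1, 1, 1)
        t = self.temporal_gate(x)     # (batch, 1, 1, time)
        return x * c * t


class LGFM(nn.Module):
    """Local Global Fusion Module: a transformer encoder adds global context
    to the local CNN features, and the two streams are fused by addition."""
    def __init__(self, n_filters=16, n_heads=4):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=n_filters, nhead=n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, x):             # x: (batch, filters, 1, time)
        tokens = x.squeeze(2).transpose(1, 2)   # (batch, time, filters)
        fused = tokens + self.encoder(tokens)   # local + global features
        return fused.mean(dim=1)                # pooled representation


class TEDNetSketch(nn.Module):
    def __init__(self, n_electrodes=32, n_filters=16, n_classes=4):
        super().__init__()
        self.tscm = TSCM(n_electrodes, n_filters)
        self.tsam = TSAM(n_filters)
        self.lgfm = LGFM(n_filters)
        self.head = nn.Linear(n_filters, n_classes)

    def forward(self, x):
        return self.head(self.lgfm(self.tsam(self.tscm(x))))


logits = TEDNetSketch()(torch.randn(8, 1, 32, 512))  # 8 trials, 32 electrodes
print(logits.shape)                                  # torch.Size([8, 4])
```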
Results
Evaluated on a well-controlled dataset of 2400 EEG samples from 30 subjects, TEDNet achieves an accuracy of 98.92 %, an F1-score of 98.75 %, and a Kappa coefficient of 98.49 %.
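For reference, the three reported metrics can be computed with scikit-learn as below; y_true and y_pred are hypothetical placeholders rather than the paper's data, and macro averaging for the F1-score is an assumption.

```python
# Sketch of the evaluation metrics; labels are hypothetical stand-ins
# for the four taste classes, not the paper's dataset.
from sklearn.metrics import accuracy_score, f1_score, cohen_kappa_score

y_true = [0, 1, 2, 3, 0, 1, 2, 3]   # 0=sour, 1=sweet, 2=bitter, 3=salty
y_pred = [0, 1, 2, 3, 0, 1, 2, 1]

print(f"Accuracy: {accuracy_score(y_true, y_pred):.4f}")
print(f"F1-score: {f1_score(y_true, y_pred, average='macro'):.4f}")
print(f"Kappa:    {cohen_kappa_score(y_true, y_pred):.4f}")
```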
Comparison with existing methods
While maintaining computational efficiency, TEDNet surpasses existing state-of-the-art convolution- and self-attention-based methods.
Conclusions
This framework establishes a robust solution for objective taste perception decoding, advancing sensory evaluation in food science.
Journal overview
The Journal of Neuroscience Methods publishes papers that describe new methods that are specifically for neuroscience research conducted in invertebrates, vertebrates or in man. Major methodological improvements or important refinements of established neuroscience methods are also considered for publication. The Journal's scope includes all aspects of contemporary neuroscience research, including anatomical, behavioural, biochemical, cellular, computational, molecular, invasive and non-invasive imaging, optogenetic, and physiological research investigations.