SemSTNet: Medical EEG Semantic Metric Learning with Class Prototypes Generated by a Pretrained Language Model

IF 4.5 · CAS Zone 2 (Medicine) · JCR Q2, ENGINEERING, BIOMEDICAL
Quanlin Chen, Chunjin Ye, Rui Xiao, Jiahui Pan, Jingcong Li
{"title":"SemSTNet:基于预训练语言模型生成类原型的医学脑电语义度量学习。","authors":"Quanlin Chen, Chunjin Ye, Rui Xiao, Jiahui Pan, Jingcong Li","doi":"10.1109/TBME.2025.3620754","DOIUrl":null,"url":null,"abstract":"<p><p>Electroencephalography (EEG) feature learning is crucial for brain-machine interfaces and medical diagnostics. Existing deep learning models for classification often overlook the intrinsic semantic relationships between different EEG classes and rely on overly complex models with a large number of parameters. To address these challenges, we propose SemSTNet, a novel and lightweight framework for EEG analysis. Firstly, we designed an e ficient, lightweight convolutional architecture that decouples spatial and temporal feature extraction. Then we propose a framework which introduces a novel semantic metric learning paradigm that uses class prototypes generated by a pretrained language model to better capture inter-class relationships and enhance intra-class compactness. These prototypes are extracted and stored offline, requiring no additional inference from the language model during training or deployment. This design significantly reduces model complexity, resulting in a model with only 23K parameters-over 100 times fewer than common Transformer-based models. Exten sive experiments demonstrate that SemSTNet outperforms state of-the-art approaches on tasks such as epilepsy classification and sleep staging, highlighting its effectiveness and efficiency. Our work demonstrates that integrating semantic knowledge with a purpose-built lightweight architecture provides a highly effective and efficient solution.</p>","PeriodicalId":13245,"journal":{"name":"IEEE Transactions on Biomedical Engineering","volume":"PP ","pages":""},"PeriodicalIF":4.5000,"publicationDate":"2025-10-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"SemSTNet: Medical EEG Semantic Metric Learning with Class Prototypes Generated by Pretrained Language Model.\",\"authors\":\"Quanlin Chen, Chunjin Ye, Rui Xiao, Jiahui Pan, Jingcong Li\",\"doi\":\"10.1109/TBME.2025.3620754\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Electroencephalography (EEG) feature learning is crucial for brain-machine interfaces and medical diagnostics. Existing deep learning models for classification often overlook the intrinsic semantic relationships between different EEG classes and rely on overly complex models with a large number of parameters. To address these challenges, we propose SemSTNet, a novel and lightweight framework for EEG analysis. Firstly, we designed an e ficient, lightweight convolutional architecture that decouples spatial and temporal feature extraction. Then we propose a framework which introduces a novel semantic metric learning paradigm that uses class prototypes generated by a pretrained language model to better capture inter-class relationships and enhance intra-class compactness. These prototypes are extracted and stored offline, requiring no additional inference from the language model during training or deployment. This design significantly reduces model complexity, resulting in a model with only 23K parameters-over 100 times fewer than common Transformer-based models. Exten sive experiments demonstrate that SemSTNet outperforms state of-the-art approaches on tasks such as epilepsy classification and sleep staging, highlighting its effectiveness and efficiency. 
Our work demonstrates that integrating semantic knowledge with a purpose-built lightweight architecture provides a highly effective and efficient solution.</p>\",\"PeriodicalId\":13245,\"journal\":{\"name\":\"IEEE Transactions on Biomedical Engineering\",\"volume\":\"PP \",\"pages\":\"\"},\"PeriodicalIF\":4.5000,\"publicationDate\":\"2025-10-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Biomedical Engineering\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.1109/TBME.2025.3620754\",\"RegionNum\":2,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"ENGINEERING, BIOMEDICAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Biomedical Engineering","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1109/TBME.2025.3620754","RegionNum":2,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, BIOMEDICAL","Score":null,"Total":0}
Citations: 0

Abstract

Electroencephalography (EEG) feature learning is crucial for brain-machine interfaces and medical diagnostics. Existing deep learning models for classification often overlook the intrinsic semantic relationships between different EEG classes and rely on overly complex models with a large number of parameters. To address these challenges, we propose SemSTNet, a novel and lightweight framework for EEG analysis. First, we design an efficient, lightweight convolutional architecture that decouples spatial and temporal feature extraction. We then introduce a novel semantic metric learning paradigm that uses class prototypes generated by a pretrained language model to better capture inter-class relationships and enhance intra-class compactness. These prototypes are extracted and stored offline, requiring no additional inference from the language model during training or deployment. This design significantly reduces model complexity, resulting in a model with only 23K parameters, over 100 times fewer than common Transformer-based models. Extensive experiments demonstrate that SemSTNet outperforms state-of-the-art approaches on tasks such as epilepsy classification and sleep staging, highlighting its effectiveness and efficiency. Our work demonstrates that integrating semantic knowledge with a purpose-built lightweight architecture provides a highly effective and efficient solution.
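
Below is a minimal, hypothetical sketch of the general idea the abstract describes: class prototypes are produced once, offline, by a pretrained text encoder, and a small convolutional network with decoupled spatial and temporal stages embeds EEG epochs into the same space, trained with a cosine-similarity loss against the fixed prototypes. All layer sizes, the choice of sentence encoder, the class-description texts, and the loss form are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (assumptions, not the paper's implementation):
# 1) Encode a short textual description of each EEG class ONCE with a
#    pretrained sentence encoder and save the vectors; the language model
#    is never needed again during training or deployment.
# 2) A tiny CNN with decoupled spatial/temporal convolutions maps an EEG
#    epoch into the same embedding space.
# 3) Training pulls each EEG embedding toward its own class prototype and
#    away from the others via a cosine-similarity cross-entropy loss.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinySpatioTemporalEncoder(nn.Module):
    """Toy EEG encoder with separate spatial and temporal convolution stages."""

    def __init__(self, n_channels: int, embed_dim: int = 384):
        super().__init__()
        # Spatial stage: mixes all electrodes at each time point.
        self.spatial = nn.Conv2d(1, 16, kernel_size=(n_channels, 1), bias=False)
        # Temporal stage: depthwise convolution along the time axis only.
        self.temporal = nn.Conv2d(16, 16, kernel_size=(1, 25), groups=16,
                                  padding=(0, 12), bias=False)
        self.bn = nn.BatchNorm2d(16)
        self.pool = nn.AdaptiveAvgPool2d((1, 8))
        # embed_dim must match the prototype dimension (384 for MiniLM below).
        self.proj = nn.Linear(16 * 8, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_channels, n_samples)
        x = x.unsqueeze(1)                        # (batch, 1, C, T)
        x = self.spatial(x)                       # (batch, 16, 1, T)
        x = F.elu(self.bn(self.temporal(x)))      # (batch, 16, 1, T)
        x = self.pool(x).flatten(1)               # (batch, 128)
        return F.normalize(self.proj(x), dim=-1)  # unit-norm embedding


def prototype_loss(eeg_emb, labels, prototypes, temperature=0.1):
    """Cross-entropy over cosine similarities to the fixed class prototypes."""
    logits = eeg_emb @ prototypes.t() / temperature   # (batch, n_classes)
    return F.cross_entropy(logits, labels)


# Offline prototype extraction (run once; hypothetical class descriptions):
#   from sentence_transformers import SentenceTransformer
#   lm = SentenceTransformer("all-MiniLM-L6-v2")          # 384-dim output
#   texts = ["EEG during wakefulness", "EEG during N2 sleep",
#            "EEG showing epileptic seizure activity"]
#   protos = F.normalize(torch.tensor(lm.encode(texts)), dim=-1)
#   torch.save(protos, "class_prototypes.pt")
```

In such a setup the geometry of the prototype vectors, rather than arbitrary one-hot targets, encodes how semantically related the classes are, which is the intuition behind the inter-class relationship and intra-class compactness benefits the abstract claims.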

Source journal
IEEE Transactions on Biomedical Engineering (Engineering & Technology - Biomedical Engineering)
CiteScore: 9.40
Self-citation rate: 4.30%
Annual publication volume: 880
Review time: 2.5 months
Journal description: IEEE Transactions on Biomedical Engineering contains basic and applied papers dealing with biomedical engineering. Papers range from engineering development in methods and techniques with biomedical applications to experimental and clinical investigations with engineering contributions.