A novel multimodal self-supervised framework for ECG arrhythmia classification.

Impact Factor: 6.3 · JCR Q1 (Biology) · CAS Tier 2 (Medicine)
Jianqiang Hu, Cheng Li, Jinde Cao, Bo Kou
Journal: Computers in Biology and Medicine, vol. 198 Pt A, p. 111137
DOI: 10.1016/j.compbiomed.2025.111137
Published: 2025-10-06 (Journal Article)
Citations: 0

Abstract


The electrocardiogram (ECG) has emerged as a primary tool in clinical practice for identifying cardiovascular diseases, owing to its low cost, simplicity, and non-invasiveness. Given the high cost of acquiring a substantial amount of ECG signals that require annotation by medical professionals, advanced self-supervised learning (SSL) techniques can effectively leverage abundant unlabeled data for learning, mitigating the performance impact of insufficient ECG classification labels. Contrastive learning has been successful as a self-supervised pre-training approach in image and time-series domains. Inspired by this success, a novel pre-training technique, i.e., a simple multimodal self-supervised framework for ECG arrhythmia classification, is proposed in this paper, utilizing multi-modal data derived from ECG signals to enhance model initialization. The expectation is that the time-based and frequency-based representations of the same example should be brought as close together as possible. Pre-training is achieved through self-supervision by constructing a time-domain contrastive learning loss and a time-frequency loss, effectively learning features of ECG signals. The proposed method is evaluated on datasets containing both multi-lead and single-lead ECG data. Experimental results demonstrate that, by applying the pre-training method followed by fine-tuning on downstream tasks, the proposed algorithm outperforms standard contrastive learning paradigms in both ACC (accuracy) and AUC, and even outperforms supervised learning.
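The time-frequency alignment idea described in the abstract can be sketched with an NT-Xent-style contrastive loss that pulls together the time-domain and frequency-domain embeddings of the same ECG segment. This is a minimal illustrative sketch, not the paper's method: the stand-in encoder `toy_embed`, the temperature value, and the use of an FFT magnitude spectrum as the frequency view are all assumptions.

```python
import numpy as np

def nt_xent(z_a, z_b, tau=0.5):
    """NT-Xent-style contrastive loss: row i of z_a and row i of z_b form a
    positive pair; every other row in the batch serves as a negative."""
    # L2-normalize so dot products are cosine similarities
    z_a = z_a / np.linalg.norm(z_a, axis=1, keepdims=True)
    z_b = z_b / np.linalg.norm(z_b, axis=1, keepdims=True)
    sim = z_a @ z_b.T / tau                         # (N, N) similarity matrix
    # Cross-entropy with the diagonal (matching pairs) as targets
    logits = sim - sim.max(axis=1, keepdims=True)   # numeric stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -float(np.mean(np.diag(log_prob)))

def toy_embed(x, dim=16, seed=0):
    """Hypothetical stand-in encoder: in the paper this would be a learned
    network; here we just apply a fixed random linear projection."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((x.shape[1], dim))
    return x @ W

# A batch of ECG segments in two views: raw time domain and FFT magnitude
rng = np.random.default_rng(1)
ecg = rng.standard_normal((8, 256))            # 8 segments, 256 samples each
freq = np.abs(np.fft.rfft(ecg, axis=1))        # (8, 129) magnitude spectrum
h_time = toy_embed(ecg, seed=0)
h_freq = toy_embed(freq, seed=2)
loss_tf = nt_xent(h_time, h_freq)              # time-frequency alignment loss
```

Minimizing `loss_tf` over encoder parameters (here frozen for illustration) would drive the two modality-specific representations of each segment together while pushing apart those of different segments, which is the stated goal of the time-frequency loss.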

Source journal
Computers in Biology and Medicine (Engineering: Biomedical)
CiteScore: 11.70
Self-citation rate: 10.40%
Annual articles: 1086
Review time: 74 days
Journal description: Computers in Biology and Medicine is an international forum for sharing groundbreaking advancements in the use of computers in bioscience and medicine. This journal serves as a medium for communicating essential research, instruction, ideas, and information regarding the rapidly evolving field of computer applications in these domains. By encouraging the exchange of knowledge, we aim to facilitate progress and innovation in the utilization of computers in biology and medicine.