Contrastive pretraining improves deep learning classification of endocardial electrograms in a preclinical model

IF 2.5 Q2 CARDIAC & CARDIOVASCULAR SYSTEMS
Bram Hunt BS , Eugene Kwan PhD , Jake Bergquist PhD , James Brundage MD , Benjamin Orkild BS , Jiawei Dong PhD , Eric Paccione MS , Kyoichiro Yazaki MD , Rob S. MacLeod PhD , Derek J. Dosdall PhD , Tolga Tasdizen PhD , Ravi Ranjan MD, PhD
Heart Rhythm O2, Volume 6, Issue 4, April 2025, Pages 473–480. DOI: 10.1016/j.hroo.2025.01.008


Background

Rotors and focal ectopies, or “drivers,” are hypothesized mechanisms of persistent atrial fibrillation (AF). Machine learning algorithms have been used to identify these drivers, but the limited size of current driver data sets constrains their performance.

Objective

We proposed that pretraining using unsupervised learning on a substantial data set of unlabeled electrograms could enhance classifier accuracy when applied to a smaller driver data set.

Methods

We used a SimCLR-based framework to pretrain a residual neural network on 113,000 unlabeled 64-electrode measurements from a canine model of AF. The network was then fine-tuned to identify drivers from intracardiac electrograms. Various augmentations, including cropping, Gaussian blurring, and rotation, were applied during pretraining to improve the robustness of the learned representations.
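The SimCLR framework the Methods describe trains the network by pulling embeddings of two augmented views of the same measurement together while pushing apart views of different measurements, using the NT-Xent (normalized temperature-scaled cross-entropy) loss. The sketch below is a toy pure-Python illustration of that loss, not the authors' implementation; the temperature value and the interleaved pairing layout are assumptions for the example.

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def nt_xent_loss(embeddings, temperature=0.5):
    """NT-Xent loss over 2N embeddings laid out as [v1, v1', v2, v2', ...],
    where consecutive pairs are two augmented views of the same sample."""
    n = len(embeddings)
    losses = []
    for i in range(n):
        j = i + 1 if i % 2 == 0 else i - 1  # index of the positive partner
        pos = cosine(embeddings[i], embeddings[j]) / temperature
        # All other embeddings in the batch act as negatives.
        denom = sum(math.exp(cosine(embeddings[i], embeddings[k]) / temperature)
                    for k in range(n) if k != i)
        losses.append(-math.log(math.exp(pos) / denom))
    return sum(losses) / n
```

When the two views of each sample embed identically and differ from other samples, the loss is low; when pairs are mismatched, it grows, which is the gradient signal that drives representation learning during pretraining.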

Results

Pretraining significantly improved driver detection accuracy compared with a non-pretrained network (80.8% vs 62.5%). The pretrained network also demonstrated greater resilience to reductions in training data set size, maintaining higher accuracy even with a 30% reduction in data. Gradient-weighted Class Activation Mapping analysis revealed that the network’s attention aligned well with manually annotated driver regions, suggesting that the network learned meaningful features for driver detection.
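Gradient-weighted Class Activation Mapping (Grad-CAM), used above to check where the network attends, weights each convolutional feature map by the spatial mean of the class-score gradients flowing into it, sums the weighted maps, and applies a ReLU. A minimal sketch on plain nested lists (standing in for real feature-map and gradient tensors; this is a generic illustration of the technique, not the authors' pipeline):

```python
def grad_cam(feature_maps, gradients):
    """Grad-CAM heatmap: ReLU of the sum of feature maps, each weighted by
    the spatial mean of its gradient (one 2-D map per channel)."""
    # Channel weight alpha_k = global average of that channel's gradient.
    weights = [sum(sum(row) for row in g) / (len(g) * len(g[0]))
               for g in gradients]
    h, w = len(feature_maps[0]), len(feature_maps[0][0])
    cam = [[0.0] * w for _ in range(h)]
    for fmap, alpha in zip(feature_maps, weights):
        for i in range(h):
            for j in range(w):
                cam[i][j] += alpha * fmap[i][j]
    # ReLU keeps only regions that positively support the predicted class.
    return [[max(0.0, v) for v in row] for row in cam]
```

Regions with high heatmap values are those the network used as evidence for its prediction; comparing them against manually annotated driver regions is how attention alignment can be assessed.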

Conclusion

This study demonstrates that contrastive pretraining can enhance the accuracy of driver detection algorithms in AF. The findings support the broader application of transfer learning to other electrogram-based tasks, potentially improving outcomes in clinical electrophysiology.