DistAL: A Domain-Shift Active Learning Framework With Transferable Feature Learning for Lesion Detection

Fan Bai;Ran Wei;Xiaoyu Bai;Dakai Jin;Xianghua Ye;Le Lu;Ke Yan;Max Q.-H. Meng
{"title":"DistAL: A Domain-Shift Active Learning Framework With Transferable Feature Learning for Lesion Detection","authors":"Fan Bai;Ran Wei;Xiaoyu Bai;Dakai Jin;Xianghua Ye;Le Lu;Ke Yan;Max Q.-H. Meng","doi":"10.1109/TMI.2025.3558861","DOIUrl":null,"url":null,"abstract":"Deep learning has demonstrated exceptional performance in medical image analysis, but its effectiveness degrades significantly when applied to different medical centers due to domain shifts. Lesion detection, a critical task in medical imaging, is particularly impacted by this challenge due to the diversity and complexity of lesions, which can arise from different organs, diseases, imaging devices, and other factors. While collecting data and labels from target domains is a feasible solution, annotating medical images is often tedious, expensive, and requires professionals. To address this problem, we combine active learning with domain-invariant feature learning. We propose a Domain-shift Active Learning (DistAL) framework, which includes a transferable feature learning algorithm and a hybrid sample selection strategy. Feature learning incorporates contrastive-consistency training to learn discriminative and domain-invariant features. The sample selection strategy is called RUDY, which jointly considers Representativeness, Uncertainty, and DiversitY. Its goal is to select samples from the unlabeled target domain for cost-effective annotation. It first selects representative samples to deal with domain shift, as well as uncertain ones to improve class separability, and then leverages K-means++ initialization to remove redundant candidates to achieve diversity. We evaluate our method for the task of lesion detection. By selecting only 1.7% samples from the target domain to annotate, DistAL achieves comparable performance to the method trained with all target labels. 
It outperforms other AL methods in five experiments on eight datasets collected from different hospitals, using different imaging protocols, annotation conventions, and etiologies.","PeriodicalId":94033,"journal":{"name":"IEEE transactions on medical imaging","volume":"44 7","pages":"3038-3050"},"PeriodicalIF":0.0000,"publicationDate":"2025-04-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on medical imaging","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10964759/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Deep learning has demonstrated exceptional performance in medical image analysis, but its effectiveness degrades significantly when applied across different medical centers due to domain shift. Lesion detection, a critical task in medical imaging, is particularly affected by this challenge because of the diversity and complexity of lesions, which can arise from different organs, diseases, imaging devices, and other factors. While collecting data and labels from target domains is a feasible solution, annotating medical images is often tedious, expensive, and requires professionals. To address this problem, we combine active learning with domain-invariant feature learning. We propose a Domain-shift Active Learning (DistAL) framework, which includes a transferable feature learning algorithm and a hybrid sample selection strategy. Feature learning incorporates contrastive-consistency training to learn discriminative and domain-invariant features. The sample selection strategy is called RUDY, which jointly considers Representativeness, Uncertainty, and DiversitY. Its goal is to select samples from the unlabeled target domain for cost-effective annotation. It first selects representative samples to deal with domain shift, as well as uncertain ones to improve class separability, and then leverages K-means++ initialization to remove redundant candidates and achieve diversity. We evaluate our method on the task of lesion detection. By selecting only 1.7% of the samples from the target domain to annotate, DistAL achieves performance comparable to the method trained with all target labels. It outperforms other AL methods in five experiments on eight datasets collected from different hospitals, using different imaging protocols, annotation conventions, and etiologies.
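The abstract describes RUDY as a two-stage procedure: score unlabeled target-domain samples by representativeness and uncertainty, then prune redundant candidates with K-means++ initialization so the annotated batch is diverse. The sketch below illustrates that idea under assumptions not stated in the abstract: the function name `rudy_select`, the combination of the two scores by simple addition, and the candidate-pool size are all hypothetical; the paper's actual scoring and weighting may differ.

```python
import numpy as np

def rudy_select(features, uncertainty, representativeness,
                n_candidates, n_select, rng=None):
    """Sketch of a RUDY-style active-learning selection round.

    1. Rank unlabeled samples by a combined representativeness +
       uncertainty score and keep the top `n_candidates`.
    2. Run K-means++ seeding over the candidates' feature vectors so
       the final `n_select` picks are mutually diverse.
    """
    rng = np.random.default_rng(rng)
    score = representativeness + uncertainty           # combined acquisition score (assumed additive)
    candidates = np.argsort(-score)[:n_candidates]     # top-scoring candidate pool
    X = features[candidates]

    # K-means++ initialization: after a random first pick, each next pick
    # is drawn with probability proportional to its squared distance from
    # the nearest already-chosen point, favoring far-apart (diverse) samples.
    chosen = [int(rng.integers(len(candidates)))]
    for _ in range(n_select - 1):
        d2 = np.min(((X[:, None] - X[chosen]) ** 2).sum(-1), axis=1)
        chosen.append(int(rng.choice(len(candidates), p=d2 / d2.sum())))
    return candidates[np.array(chosen)]
```

Because already-chosen points have zero distance to themselves, their sampling probability is zero, so the returned indices are distinct; the selected samples would then be sent for annotation and added to the labeled target set.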