Be your own doctor: Temperature scaling self-knowledge distillation for medical image classification

Impact Factor 5.5 · CAS Region 2 (Computer Science) · JCR Q1, COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Wenjie Liu, Lei Zhang, Xianliang Zhang, Xinyang Zhou, Xin Wei
{"title":"做自己的医生:温度标度自我知识蒸馏用于医学图像分类","authors":"Wenjie Liu,&nbsp;Lei Zhang,&nbsp;Xianliang Zhang,&nbsp;Xinyang Zhou,&nbsp;Xin Wei","doi":"10.1016/j.neucom.2025.130115","DOIUrl":null,"url":null,"abstract":"<div><div>Self-knowledge distillation (self-KD), which uses the student network as the teacher model, allows the model to learn knowledge by itself. It has been widely studied in various medical image tasks for constructing lightweight models to alleviate the limitations of computing resources. However, existing self-KD methods use a single temperature for distillation, ignoring the effect of temperature on different classes. In this paper, we investigate the effects of target class temperature and non-target class temperature on the performance of self-KD. Based on the above study, a temperature scaling self-knowledge distillation (TSS-KD) model is proposed, which can better balance the target class knowledge and non-target class knowledge. By adjusting the temperature scaling of different classes, the model can learn better representations by distilling the well-proportioned features. To make the network focus more on the local lesions of medical images, a regional gamma augmentation (RGA) method is proposed, which provides stronger perturbations to the same sample to generate more differentiated features. By self-regularizing the consistency of these features, the model can learn more local knowledge. To evaluate the effectiveness of the proposed method, extensive experiments are conducted on nine medical image classification tasks of eight public datasets. Experimental results show that the proposed method outperforms state-of-the-art self-KD models and has strong generality. The code is available at <span><span>https://github.com/JeaneyLau/TSS-KD</span><svg><path></path></svg></span>.</div></div>","PeriodicalId":19268,"journal":{"name":"Neurocomputing","volume":"638 ","pages":"Article 130115"},"PeriodicalIF":5.5000,"publicationDate":"2025-04-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Be your own doctor: Temperature scaling self-knowledge distillation for medical image classification\",\"authors\":\"Wenjie Liu,&nbsp;Lei Zhang,&nbsp;Xianliang Zhang,&nbsp;Xinyang Zhou,&nbsp;Xin Wei\",\"doi\":\"10.1016/j.neucom.2025.130115\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Self-knowledge distillation (self-KD), which uses the student network as the teacher model, allows the model to learn knowledge by itself. It has been widely studied in various medical image tasks for constructing lightweight models to alleviate the limitations of computing resources. However, existing self-KD methods use a single temperature for distillation, ignoring the effect of temperature on different classes. In this paper, we investigate the effects of target class temperature and non-target class temperature on the performance of self-KD. Based on the above study, a temperature scaling self-knowledge distillation (TSS-KD) model is proposed, which can better balance the target class knowledge and non-target class knowledge. By adjusting the temperature scaling of different classes, the model can learn better representations by distilling the well-proportioned features. To make the network focus more on the local lesions of medical images, a regional gamma augmentation (RGA) method is proposed, which provides stronger perturbations to the same sample to generate more differentiated features. 
By self-regularizing the consistency of these features, the model can learn more local knowledge. To evaluate the effectiveness of the proposed method, extensive experiments are conducted on nine medical image classification tasks of eight public datasets. Experimental results show that the proposed method outperforms state-of-the-art self-KD models and has strong generality. The code is available at <span><span>https://github.com/JeaneyLau/TSS-KD</span><svg><path></path></svg></span>.</div></div>\",\"PeriodicalId\":19268,\"journal\":{\"name\":\"Neurocomputing\",\"volume\":\"638 \",\"pages\":\"Article 130115\"},\"PeriodicalIF\":5.5000,\"publicationDate\":\"2025-04-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neurocomputing\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0925231225007878\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neurocomputing","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0925231225007878","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Self-knowledge distillation (self-KD), which uses the student network itself as the teacher model, allows a model to learn from its own knowledge. It has been widely studied across medical image tasks as a way to build lightweight models that ease computing-resource constraints. However, existing self-KD methods distill with a single temperature, ignoring the effect of temperature on different classes. In this paper, we investigate how the target-class temperature and the non-target-class temperature affect self-KD performance. Based on this study, a temperature scaling self-knowledge distillation (TSS-KD) model is proposed that better balances target-class and non-target-class knowledge. By adjusting the temperature scaling of different classes, the model learns better representations by distilling well-proportioned features. To make the network focus more on local lesions in medical images, a regional gamma augmentation (RGA) method is proposed that applies stronger perturbations to the same sample to generate more differentiated features. By self-regularizing the consistency of these features, the model learns more local knowledge. To evaluate the effectiveness of the proposed method, extensive experiments are conducted on nine medical image classification tasks across eight public datasets. Experimental results show that the proposed method outperforms state-of-the-art self-KD models and generalizes well. The code is available at https://github.com/JeaneyLau/TSS-KD.
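The class-wise temperature idea can be made concrete with a short sketch. The following is a minimal, hypothetical PyTorch rendering of a distillation loss that scales the target-class logit and the non-target-class logits with separate temperatures; the function name `tss_kd_loss`, the default temperature values, and the gradient-rescaling factor are all assumptions, not the authors' implementation (see the linked repository for that).

```python
# Hypothetical sketch: KL distillation with per-class temperatures.
# tau_target / tau_nontarget and the rescaling factor are illustrative.
import torch
import torch.nn.functional as F

def tss_kd_loss(student_logits, teacher_logits, labels,
                tau_target=4.0, tau_nontarget=2.0):
    """Temperature-scaled KL loss where the logit of the ground-truth
    (target) class is divided by tau_target and all other logits by
    tau_nontarget before the softmax (assumed formulation)."""
    # Per-sample temperature vector: tau_target at the label position,
    # tau_nontarget everywhere else.
    tau = torch.full_like(student_logits, tau_nontarget)
    tau.scatter_(1, labels.unsqueeze(1), tau_target)

    log_p_student = F.log_softmax(student_logits / tau, dim=1)
    p_teacher = F.softmax(teacher_logits.detach() / tau, dim=1)

    # Classic KD multiplies by tau**2; with two temperatures that factor
    # is ambiguous, so the mean of the two is used here as a placeholder.
    scale = ((tau_target + tau_nontarget) / 2) ** 2
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * scale
```

In a self-KD setting, `teacher_logits` would come from the same network (for example, from a differently augmented view of the sample) rather than from a separate pretrained teacher.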
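Similarly, one plausible reading of regional gamma augmentation is a gamma correction applied only to a random image region, so that two views of the same sample differ in a localized way. The sketch below is a guess at such an operator; the region size, gamma range, and rectangular region shape are assumptions rather than the paper's specification.

```python
# Hypothetical sketch of regional gamma augmentation (RGA): apply a random
# gamma correction to a random rectangular patch of the image.
import torch

def regional_gamma_augment(img, gamma_range=(0.5, 2.0), region_frac=0.5):
    """img: (C, H, W) float tensor with values in [0, 1]."""
    _, h, w = img.shape
    rh, rw = int(h * region_frac), int(w * region_frac)
    top = torch.randint(0, h - rh + 1, (1,)).item()
    left = torch.randint(0, w - rw + 1, (1,)).item()
    gamma = torch.empty(1).uniform_(*gamma_range).item()

    out = img.clone()
    # Gamma correction I' = I ** gamma, restricted to the chosen region,
    # perturbs local intensity while leaving the rest of the image intact.
    region = out[:, top:top + rh, left:left + rw]
    out[:, top:top + rh, left:left + rw] = region.clamp(min=0) ** gamma
    return out
```

Enforcing consistency between the network's predictions on two such independently augmented views is one standard way to realize the self-regularization the abstract describes.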
Source journal
Neurocomputing (Engineering & Technology, Computer Science: Artificial Intelligence)
CiteScore: 13.10
Self-citation rate: 10.00%
Articles per year: 1382
Review time: 70 days
Journal description: Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Neurocomputing theory, practice and applications are the essential topics being covered.