MED-TEX: Transfer and Explain Knowledge with Less Data from Pretrained Medical Imaging Models

Thanh Nguyen-Duc, He Zhao, Jianfei Cai, Dinh Q. Phung
{"title":"MED-TEX: Transfer and Explain Knowledge with Less Data from Pretrained Medical Imaging Models","authors":"Thanh Nguyen-Duc, He Zhao, Jianfei Cai, Dinh Q. Phung","doi":"10.1109/ISBI52829.2022.9761709","DOIUrl":null,"url":null,"abstract":"Deep learning methods usually require a large amount of training data and lack interpretability. In this paper, we pro-pose a novel knowledge distillation and model interpretation framework for medical image classification that jointly solves the above two issues. Specifically, to address the data-hungry issue, a small student model is learned with less data by distilling knowledge from a cumbersome pretrained teacher model. To interpret the teacher model and assist the learning of the student, an explainer module is introduced to highlight the regions of an input that are important for the predictions of the teacher model. Furthermore, the joint framework is trained by a principled way derived from the information-theoretic perspective. Our framework outperforms on the knowledge distillation and model interpretation tasks com-pared to state-of-the-art methods on a fundus dataset.","PeriodicalId":6827,"journal":{"name":"2022 IEEE 19th International Symposium on Biomedical Imaging (ISBI)","volume":"53 1","pages":"1-4"},"PeriodicalIF":0.0000,"publicationDate":"2022-03-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE 19th International Symposium on Biomedical Imaging (ISBI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISBI52829.2022.9761709","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Deep learning methods usually require a large amount of training data and lack interpretability. In this paper, we propose a novel knowledge distillation and model interpretation framework for medical image classification that jointly addresses both issues. Specifically, to address the data-hungry issue, a small student model is learned with less data by distilling knowledge from a cumbersome pretrained teacher model. To interpret the teacher model and assist the learning of the student, an explainer module is introduced to highlight the regions of an input that are important for the teacher's predictions. Furthermore, the joint framework is trained in a principled way derived from an information-theoretic perspective. Our framework outperforms state-of-the-art methods on both knowledge distillation and model interpretation tasks on a fundus dataset.
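The abstract describes three jointly trained components: a frozen pretrained teacher, a small student learned from limited data, and an explainer that highlights the input regions driving the teacher's predictions. The sketch below is a minimal PyTorch illustration of that setup, assuming a standard soft-label distillation term, a hard-label cross-entropy term, and a sparsity prior on the explainer mask; the module architectures, loss weights, and the combined objective are illustrative assumptions and do not reproduce the paper's exact information-theoretic formulation.

```python
# Minimal sketch of a teacher-student-explainer training step.
# All architectures, loss weights, and the sparsity prior are assumptions,
# not the published MED-TEX objective.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SmallStudent(nn.Module):
    """Compact CNN classifier intended to be trained with limited data."""
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))


class Explainer(nn.Module):
    """Predicts a soft saliency mask over regions important to the teacher."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        # Mask values in [0, 1], same spatial size as the input image.
        return self.net(x)


def joint_loss(teacher, student, explainer, x, y,
               T: float = 4.0, alpha: float = 0.5, beta: float = 0.1):
    """One hypothetical joint step: distillation + hard labels + mask sparsity."""
    with torch.no_grad():
        t_logits = teacher(x)            # pretrained teacher stays frozen
    mask = explainer(x)                  # highlight regions important to the teacher
    s_logits = student(mask * x)         # student learns from the highlighted input
    kd = F.kl_div(F.log_softmax(s_logits / T, dim=1),
                  F.softmax(t_logits / T, dim=1),
                  reduction="batchmean") * T * T
    ce = F.cross_entropy(s_logits, y)
    sparsity = mask.mean()               # encourage compact explanations
    return alpha * kd + (1 - alpha) * ce + beta * sparsity
```

As a usage note, `teacher` can be any frozen pretrained classifier with the same output dimensionality as the student; the returned loss is backpropagated through both the student and the explainer so that the explanation and the distilled student are learned jointly.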