Multi-Part Knowledge Distillation for the Efficient Classification of Colorectal Cancer Histology Images

Shankey Garg, Pradeep Singh
DOI: 10.1109/IBSSC56953.2022.10037360
Published in: 2022 IEEE Bombay Section Signature Conference (IBSSC), 2022-12-08
Citations: 0

Abstract

Colorectal cancer is the most common cancer after breast cancer in women, and the third most common in men after lung and prostate cancer. The disease ranks third in incidence and second in mortality, so early diagnosis is essential for choosing the correct line of treatment. Knowledge-distillation-based models boost the performance of small neural networks and perform efficiently on a variety of image classification tasks. In this work, a novel knowledge-distillation-based technique is developed to efficiently classify colorectal cancer histology images. Unlike traditional distillation, our method performs distillation in parts: instead of supervising the student with the converged knowledge of the teacher, the proposed method fetches the teacher's knowledge at regular intervals and provides it to the student model during the student's training process. Through this multi-part distillation technique, the student can effectively learn the teacher's intermediate representational knowledge rather than only its abstract final knowledge, and hence the overall performance of the model is boosted. The proposed model achieved 92.10% accuracy.
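The abstract's idea of distilling "in parts" can be sketched as follows: teacher snapshots are captured at regular intervals during teacher training, and at each stage of student training the student is supervised by the stage's snapshot rather than only by the converged teacher. This is a minimal numpy sketch under stated assumptions; the snapshot schedule (`snapshot_for_step`), the logit values, and the temperature/weighting hyperparameters (`T`, `alpha`) are illustrative placeholders, not the paper's actual configuration.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete distributions."""
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def distillation_loss(student_logits, teacher_logits, label, T=4.0, alpha=0.5):
    """Standard KD loss: soft KL term against the teacher snapshot
    plus hard cross-entropy against the ground-truth class index."""
    soft = kl_divergence(softmax(teacher_logits, T), softmax(student_logits, T)) * T * T
    hard = -float(np.log(softmax(student_logits)[label] + 1e-12))
    return alpha * soft + (1 - alpha) * hard

# Hypothetical teacher snapshots captured at regular training intervals,
# from an early, less confident teacher to a converged one.
teacher_snapshots = [
    np.array([1.0, 0.5, 0.2]),   # early-interval teacher logits
    np.array([2.0, 0.3, 0.1]),   # mid-training teacher logits
    np.array([3.5, 0.2, 0.05]),  # converged teacher logits
]

def snapshot_for_step(step, total_steps, snapshots):
    """Pick which teacher snapshot supervises the student at this step,
    advancing through the snapshots as student training progresses."""
    idx = min(len(snapshots) - 1, step * len(snapshots) // total_steps)
    return snapshots[idx]
```

A training loop would then call `snapshot_for_step(step, total_steps, teacher_snapshots)` each step and feed the returned logits into `distillation_loss`, so early student updates match the teacher's intermediate representations before moving toward its final, more peaked predictions.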