Classification of Histopathologic Images of Breast Cancer by Multi-teacher Small-sample Knowledge Distillation

Leiqi Wang, Hui-juan Lu
{"title":"Classification of Histopathologic Images of Breast Cancer by Multi-teacher Small-sample Knowledge Distillation","authors":"Leiqi Wang, Hui-juan Lu","doi":"10.1109/ICAICE54393.2021.00127","DOIUrl":null,"url":null,"abstract":"Model fusion can effectively improve the effect of model prediction, but it will bring about an increase in time. In this paper, the dual-stage progressive knowledge distillation is improved in combination with multi-teacher knowledge distillation technology. A simple and effective multi-teacher's Softtarget integration method is proposed in multi-teacher network knowledge distillation. Improve the guiding role of excellent models in knowledge distillation. Dual-stage progressive knowledge distillation is a method for small sample knowledge distillation. A progressive network grafting method is used to realize knowledge distillation in a small sample environment. In the first step, the student blocks are grafted one by one onto the teacher network and intertwined with other teacher blocks for training, and the training process only updates the parameters of the grafted blocks. In the second step, the trained student blocks are grafted onto the teacher network in turn, so that the learned student blocks adapt to each other and finally replace the teacher network to obtain a lighter network structure. Using Softtarget acquired by this method in Dual-stage progressive knowledge distillation instead of Hardtarget training, excellent results were obtained on BreakHis data sets.","PeriodicalId":388444,"journal":{"name":"2021 2nd International Conference on Artificial Intelligence and Computer Engineering (ICAICE)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 2nd International Conference on Artificial Intelligence and Computer Engineering (ICAICE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICAICE54393.2021.00127","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

Model fusion can effectively improve prediction performance, but it increases inference time. In this paper, dual-stage progressive knowledge distillation is improved by combining it with multi-teacher knowledge distillation. A simple and effective method for integrating the soft targets of multiple teacher networks is proposed, which strengthens the guiding role of the best-performing models during distillation. Dual-stage progressive knowledge distillation is a method for small-sample knowledge distillation: it uses progressive network grafting to realize distillation when training data are scarce. In the first step, the student blocks are grafted one by one onto the teacher network and trained together with the remaining teacher blocks, with only the parameters of the grafted block being updated. In the second step, the trained student blocks are grafted onto the teacher network in turn so that they adapt to one another, finally replacing the teacher network entirely and yielding a lighter network structure. Training the student with the soft targets produced by this integration method, instead of hard targets, achieves excellent results on the BreakHis dataset.
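
The soft-target fusion can be sketched in a few lines. The following is a minimal PyTorch sketch, not the authors' implementation: the abstract says only that better-performing teachers should play a larger guiding role, so weighting each teacher by a quality score such as its validation accuracy is an assumption made here for illustration.

    import torch
    import torch.nn.functional as F

    def fuse_soft_targets(teacher_logits, teacher_weights, temperature=4.0):
        """Fuse per-teacher softened distributions into one soft target.

        teacher_logits:  list of (batch, num_classes) tensors
        teacher_weights: 1-D tensor of non-negative quality scores
                         (e.g. validation accuracy -- an assumption)
        """
        w = teacher_weights / teacher_weights.sum()          # normalize weights
        soft = torch.stack(
            [F.softmax(l / temperature, dim=1) for l in teacher_logits]
        )                                                    # (T, batch, classes)
        return (w.view(-1, 1, 1) * soft).sum(dim=0)          # (batch, classes)

    def distillation_loss(student_logits, fused_target, temperature=4.0):
        """KL divergence between the student's softened prediction and the
        fused multi-teacher soft target, with the usual T^2 scaling."""
        log_p = F.log_softmax(student_logits / temperature, dim=1)
        return F.kl_div(log_p, fused_target, reduction="batchmean") * temperature**2

The two-step grafting procedure can likewise be sketched, assuming the teacher and student networks are organized as aligned lists of blocks. The helper train_hybrid stands for a user-supplied training loop that updates only the parameters with requires_grad enabled; it is hypothetical, not a function from the paper.

    import copy
    import torch.nn as nn

    def graft_stage_one(teacher_blocks, student_blocks, train_hybrid):
        """Step 1: train each student block inside a frozen teacher.

        For block i the hybrid is [T_0 .. T_{i-1}, S_i, T_{i+1} .. T_{n-1}];
        only S_i's parameters are updated."""
        for i, s_block in enumerate(student_blocks):
            hybrid = [copy.deepcopy(b) for b in teacher_blocks]
            hybrid[i] = s_block
            net = nn.Sequential(*hybrid)
            for p in net.parameters():
                p.requires_grad_(False)                      # freeze everything
            for p in s_block.parameters():
                p.requires_grad_(True)                       # except the graft
            train_hybrid(net)

    def graft_stage_two(teacher_blocks, student_blocks, train_hybrid):
        """Step 2: grow a prefix of trained student blocks so they adapt to
        each other; after the last step the teacher is fully replaced."""
        for i in range(len(student_blocks)):
            hybrid = student_blocks[: i + 1] + [
                copy.deepcopy(b) for b in teacher_blocks[i + 1 :]
            ]
            net = nn.Sequential(*hybrid)
            for p in net.parameters():
                p.requires_grad_(False)
            for b in student_blocks[: i + 1]:
                for p in b.parameters():
                    p.requires_grad_(True)                   # train the prefix
            train_hybrid(net)
        return nn.Sequential(*student_blocks)                # the light student

In this reading, the fused soft target from fuse_soft_targets would replace the hard labels inside train_hybrid, which is what allows the grafting stages to be trained with only a small number of BreakHis samples.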