[A joint distillation model for the tumor segmentation using breast ultrasound images].

Hongjiang Guo, Youyou Ding, Hao Dang, Tongtong Liu, Xuekun Song, Ge Zhang, Shuo Yao, Daisen Hou, Zongwang Lyu
{"title":"[A joint distillation model for the tumor segmentation using breast ultrasound images].","authors":"Hongjiang Guo, Youyou Ding, Hao Dang, Tongtong Liu, Xuekun Song, Ge Zhang, Shuo Yao, Daisen Hou, Zongwang Lyu","doi":"10.7507/1001-5515.202311054","DOIUrl":null,"url":null,"abstract":"<p><p>The accurate segmentation of breast ultrasound images is an important precondition for the lesion determination. The existing segmentation approaches embrace massive parameters, sluggish inference speed, and huge memory consumption. To tackle this problem, we propose T <sup>2</sup>KD Attention U-Net (dual-Teacher Knowledge Distillation Attention U-Net), a lightweight semantic segmentation method combined double-path joint distillation in breast ultrasound images. Primarily, we designed two teacher models to learn the fine-grained features from each class of images according to different feature representation and semantic information of benign and malignant breast lesions. Then we leveraged the joint distillation to train a lightweight student model. Finally, we constructed a novel weight balance loss to focus on the semantic feature of small objection, solving the unbalance problem of tumor and background. Specifically, the extensive experiments conducted on Dataset BUSI and Dataset B demonstrated that the T <sup>2</sup>KD Attention U-Net outperformed various knowledge distillation counterparts. Concretely, the accuracy, recall, precision, Dice, and mIoU of proposed method were 95.26%, 86.23%, 85.09%, 83.59%and 77.78% on Dataset BUSI, respectively. And these performance indexes were 97.95%, 92.80%, 88.33%, 88.40% and 82.42% on Dataset B, respectively. Compared with other models, the performance of this model was significantly improved. Meanwhile, compared with the teacher model, the number, size, and complexity of student model were significantly reduced (2.2×10 <sup>6</sup> <i>vs</i>. 106.1×10 <sup>6</sup>, 8.4 MB <i>vs</i>. 414 MB, 16.59 GFLOPs <i>vs</i>. 205.98 GFLOPs, respectively). Indeedy, the proposed model guarantees the performances while greatly decreasing the amount of computation, which provides a new method for the deployment of clinical medical scenarios.</p>","PeriodicalId":39324,"journal":{"name":"生物医学工程学杂志","volume":"42 1","pages":"148-155"},"PeriodicalIF":0.0000,"publicationDate":"2025-02-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11955334/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"生物医学工程学杂志","FirstCategoryId":"1087","ListUrlMain":"https://doi.org/10.7507/1001-5515.202311054","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"Medicine","Score":null,"Total":0}
引用次数: 0

Abstract

Accurate segmentation of breast ultrasound images is an important precondition for lesion assessment. Existing segmentation approaches suffer from large parameter counts, slow inference, and high memory consumption. To tackle this problem, we propose T²KD Attention U-Net (dual-Teacher Knowledge Distillation Attention U-Net), a lightweight semantic segmentation method for breast ultrasound images that combines dual-path joint distillation. First, we designed two teacher models that learn fine-grained features from each class of images, exploiting the different feature representations and semantic information of benign and malignant breast lesions. We then used joint distillation to train a lightweight student model. Finally, we constructed a novel weight-balance loss that focuses on the semantic features of small objects, addressing the class imbalance between tumor and background. Extensive experiments on Dataset BUSI and Dataset B demonstrated that T²KD Attention U-Net outperformed various knowledge distillation counterparts. Concretely, the accuracy, recall, precision, Dice, and mIoU of the proposed method were 95.26%, 86.23%, 85.09%, 83.59%, and 77.78% on Dataset BUSI, respectively, and 97.95%, 92.80%, 88.33%, 88.40%, and 82.42% on Dataset B. Compared with other models, the performance of this model was significantly improved. Meanwhile, compared with the teacher model, the parameter count, size, and complexity of the student model were significantly reduced (2.2×10⁶ vs. 106.1×10⁶ parameters, 8.4 MB vs. 414 MB, and 16.59 GFLOPs vs. 205.98 GFLOPs, respectively). In short, the proposed model preserves performance while greatly reducing computation, providing a new approach for deployment in clinical scenarios.
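
The abstract does not spell out the loss formulation, so the PyTorch sketch below only illustrates the general shape of class-conditional dual-teacher distillation combined with a frequency-based weight-balance term. Every function name, the temperature `T`, and the mixing weight `alpha` are hypothetical choices for illustration, not the authors' implementation.

```python
# A minimal sketch of dual-teacher joint distillation for binary breast-
# ultrasound segmentation. All names and hyperparameters are hypothetical;
# the abstract does not specify the actual loss, temperature, or weighting.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence between temperature-softened teacher and student
    per-pixel class distributions. Logits have shape [B, C, H, W]."""
    s = F.log_softmax(student_logits / T, dim=1)
    t = F.softmax(teacher_logits / T, dim=1)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(s, t, reduction="batchmean") * (T * T)

def weight_balance_ce(student_logits, target, eps=1e-6):
    """Cross-entropy with class weights inversely proportional to class
    frequency in the batch, so small tumor regions are not swamped by
    background. target: [B, H, W] long tensor in {0: background, 1: tumor}."""
    freq = torch.stack([(target == c).float().mean() for c in range(2)])
    weights = 1.0 / (freq + eps)
    weights = weights / weights.sum()
    return F.cross_entropy(student_logits, target, weight=weights)

def joint_loss(student_logits, benign_t_logits, malignant_t_logits,
               target, is_malignant, alpha=0.5):
    """Mix ground-truth supervision with distillation from whichever
    teacher matches each image's lesion class.
    is_malignant: [B] boolean mask selecting the malignant-lesion teacher."""
    sel = is_malignant.view(-1, 1, 1, 1).float()
    teacher_logits = sel * malignant_t_logits + (1 - sel) * benign_t_logits
    hard = weight_balance_ce(student_logits, target)
    soft = distillation_loss(student_logits, teacher_logits)
    return alpha * hard + (1 - alpha) * soft
```

For the reported evaluation metrics, a similarly hedged sketch of Dice and mIoU on binary masks (again not the authors' evaluation code):

```python
def dice_score(pred, target, eps=1e-6):
    """Dice coefficient for binary masks pred, target of shape [B, H, W]."""
    inter = (pred * target).sum()
    return (2 * inter + eps) / (pred.sum() + target.sum() + eps)

def miou(pred, target, eps=1e-6):
    """Mean IoU over the two classes (background and tumor)."""
    ious = []
    for c in (0, 1):
        p, t = (pred == c), (target == c)
        inter = (p & t).sum().float()
        union = (p | t).sum().float()
        ious.append((inter + eps) / (union + eps))
    return torch.stack(ious).mean()
```

One design point worth noting: routing each image to the teacher of its own lesion class is what lets each teacher specialize in benign or malignant appearance, while the single student absorbs both behaviors through the shared distillation term.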
