Automatic detection of breast lesions in automated 3D breast ultrasound with cross-organ transfer learning

Lingyun BAO, Zhengrui HUANG, Zehui LIN, Yue SUN, Hui CHEN, You LI, Zhang LI, Xiaochen YUAN, Lin XU, Tao TAN

Virtual Reality Intelligent Hardware · June 2024 · DOI: 10.1016/j.vrih.2024.02.001 · JCR: Q1 (Computer Science) · Citations: 0
Open-access PDF: https://www.sciencedirect.com/science/article/pii/S209657962400007X

Abstract

Background

Deep convolutional neural networks have garnered considerable attention in numerous machine learning applications, particularly in visual recognition tasks such as image and video analysis, and there is growing interest in applying this technology to medical image analysis. Automated three-dimensional breast ultrasound is a vital tool for detecting breast cancer, and computer-assisted diagnosis software developed with deep learning can effectively assist radiologists in diagnosis. However, network models are prone to overfitting during training owing to challenges such as insufficient training data. This study attempts to solve the problems caused by small datasets and to improve model detection performance.

Methods

We propose a deep learning-based breast cancer detection framework that combines a transfer learning method based on cross-organ cancer detection with a contrastive learning method based on the Breast Imaging Reporting and Data System (BI-RADS).
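The abstract does not give implementation details, but a BI-RADS-based contrastive objective is commonly realized as a supervised contrastive loss in which lesions sharing a BI-RADS category are treated as positive pairs. A minimal NumPy sketch under that assumption (the function name, temperature value, and label encoding are illustrative, not taken from the paper):

```python
import numpy as np

def birads_supcon_loss(embeddings, birads_labels, temperature=0.1):
    """Supervised contrastive loss with BI-RADS categories as labels:
    embeddings of lesions in the same BI-RADS category are pulled
    together and those in different categories pushed apart.
    A hypothetical reading of BI-RADS-based contrastive learning."""
    # L2-normalize so the dot product is cosine similarity
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature
    labels = np.asarray(birads_labels)
    n, loss, counted = len(labels), 0.0, 0
    for i in range(n):
        logits = np.delete(sim[i], i)                 # drop self-similarity
        pos_mask = np.delete(labels == labels[i], i)  # same BI-RADS category
        if not pos_mask.any():
            continue                                  # anchor has no positives
        log_denom = np.log(np.exp(logits).sum())
        loss += -(logits[pos_mask] - log_denom).mean()
        counted += 1
    return loss / max(counted, 1)
```

With unit-norm embeddings, a same-category pair that coincides in feature space has the maximum possible logit, so clustering lesions by BI-RADS category drives the loss toward zero.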

Results

When using cross-organ transfer learning and BI-RADS-based contrastive learning, the average sensitivity of the model increased by up to 16.05%.

Conclusion

Our experiments demonstrate that parameters and experience from cross-organ cancer detection can be mutually referenced, and that the contrastive learning method based on BI-RADS improves the detection performance of the model.
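The cross-organ parameter reuse described above is typically implemented by copying shape-compatible weights from a detector trained on another organ into the breast-lesion detector before fine-tuning. A minimal sketch using plain NumPy weight dictionaries (the layer names and the choice of which layers to skip are hypothetical; the paper does not specify its architecture or naming):

```python
import numpy as np

def transfer_cross_organ_weights(source_state, target_state,
                                 skip_prefixes=("head.",)):
    """Copy every parameter whose name and shape match from the
    source-organ detector into the target (breast) detector, leaving
    task-specific layers (e.g. the detection head) to train from
    scratch. Returns the names of the parameters actually copied."""
    copied = []
    for name, weight in source_state.items():
        if any(name.startswith(p) for p in skip_prefixes):
            continue                            # keep task-specific layers
        if name in target_state and target_state[name].shape == weight.shape:
            target_state[name] = weight.copy()  # reuse cross-organ parameters
            copied.append(name)
    return copied
```

Fine-tuning then proceeds on the breast dataset, optionally with a lower learning rate for the copied backbone layers than for the freshly initialized head.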

Source journal: Virtual Reality Intelligent Hardware (Computer Science: Computer Graphics and Computer-Aided Design)
CiteScore: 6.40 · Self-citation rate: 0.00% · Articles per year: 35 · Review time: 12 weeks