UltraNet: Unleashing the Power of Simplicity for Accurate Medical Image Segmentation.

Impact Factor: 3.9 · JCR Q1 (Mathematical & Computational Biology) · CAS Region 2 (Biology)
Ziyi Han, Yuanyuan Zhang, Lin Liu, Yulin Zhang
Journal: Interdisciplinary Sciences: Computational Life Sciences
DOI: 10.1007/s12539-024-00682-3
Published: 2024-12-27 · Citations: 0

Abstract

The development of point-of-care diagnosis for accurate and rapid medical image segmentation has become increasingly urgent in recent years. Although some pioneering work has applied complex modules to improve segmentation performance, the resulting models are often heavy, which is impractical for the modern clinical setting of point-of-care diagnosis. To address these challenges, we propose UltraNet, a state-of-the-art lightweight model that achieves competitive performance in segmenting multiple parts of medical images with the fewest parameters and the lowest computational complexity. To extract sufficient feature information while replacing cumbersome modules, the Shallow Focus Float Block (ShalFoFo) and the Dual-stream Synergy Feature Extraction (DuSem) are proposed at the shallow and deep levels, respectively. ShalFoFo is designed to capture finer-grained features containing more pixels, while DuSem extracts distinct deep semantic features from two different perspectives. Used jointly, they enhance the accuracy and stability of UltraNet's segmentation results. To evaluate performance, UltraNet's generalization ability was assessed on five datasets with different tasks. Compared to UNet, UltraNet reduces parameters and computational complexity by factors of 46 and 26, respectively. Experimental results demonstrate that UltraNet achieves a state-of-the-art balance among parameter count, computational complexity, and segmentation performance. Code is available at https://github.com/Ziii1/UltraNet.
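The dual-stream idea described in the abstract (two complementary views of the same feature map, fused to enrich deep semantics) can be illustrated with a minimal NumPy sketch. This is a conceptual illustration only, not the paper's DuSem implementation: the two stream operations chosen here (a local 3×3 mean filter and a global-context scaling) and all function names are assumptions; the actual modules are defined in the authors' repository.

```python
import numpy as np

def local_stream(x: np.ndarray) -> np.ndarray:
    """Local view: 3x3 mean filter over an HxW feature map (zero-padded)."""
    h, w = x.shape
    padded = np.pad(x, 1)  # zero-pad one pixel on every side
    out = np.zeros_like(x)
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + 3, j:j + 3].mean()
    return out

def global_stream(x: np.ndarray) -> np.ndarray:
    """Global view: reweight the map by its global average (context gating)."""
    return x * x.mean()

def dual_stream_fuse(x: np.ndarray) -> np.ndarray:
    """Stack the two views along a new channel axis as a stand-in for fusion."""
    return np.stack([local_stream(x), global_stream(x)], axis=0)

feat = np.arange(16, dtype=np.float64).reshape(4, 4)
fused = dual_stream_fuse(feat)
print(fused.shape)  # (2, 4, 4): two complementary views of one 4x4 map
```

In a real network both streams would be learned convolutions and the fusion would feed subsequent layers; the point here is only the structure: one input, two perspectives, one fused output.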

Source journal: Interdisciplinary Sciences: Computational Life Sciences (Mathematical & Computational Biology)
CiteScore: 8.60 · Self-citation rate: 4.20% · Articles per year: 55
Journal description: Interdisciplinary Sciences: Computational Life Sciences aims to cover the most recent and outstanding developments in interdisciplinary areas of science, with a special focus on computational life sciences, an area enjoying rapid development at the forefront of scientific research and technology. The journal publishes original papers of significant general interest covering recent research and developments. Articles are published rapidly by taking full advantage of internet technology for online submission and peer review of manuscripts, and then by publishing OnlineFirst through SpringerLink even before the issue is built or sent to the printer. The editorial board consists of many leading scientists with international reputations, among others Luc Montagnier (UNESCO, France), Dennis Salahub (University of Calgary, Canada), and Weitao Yang (Duke University, USA). Prof. Dongqing Wei at Shanghai Jiao Tong University is the editor-in-chief; he has made important contributions to bioinformatics and computational physics and is best known for his ground-breaking work on the theory of ferroelectric liquids. With the help of a team of associate editors and the editorial board, an international journal with a sound reputation shall be created.