MINIMALLY USER-GUIDED 3D MICRO-ULTRASOUND PROSTATE SEGMENTATION

Alex Ling Yu Hung, Kai Zhao, Kaifeng Pang, Qi Miao, Zhaozhi Wang, Wayne Brisbane, Demetri Terzopoulos, Kyunghyun Sung

Proceedings. IEEE International Symposium on Biomedical Imaging, vol. 2025. Published 2025-04-01 (Epub 2025-05-12).
DOI: https://doi.org/10.1109/isbi60581.2025.10981266
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12104093/pdf/
Citations: 0
Abstract
Micro-ultrasound is an emerging imaging tool that complements MRI in detecting prostate cancer by offering high-resolution imaging at lower cost. However, reliable annotations for micro-ultrasound data remain challenging due to the limited availability of experts and a steep learning curve. To address this clear clinical need, we propose a click-based, user-guided volumetric micro-ultrasound prostate segmentation model that requires minimal user intervention and training data. Our model predicts the segmentation of the entire prostate volume after users place a few points on the two boundary image slices of the prostate. Experiments show that the model needs only a small amount of training data to achieve strong segmentation performance, with each of its components contributing to the overall improvement. We demonstrate that the user's level of expertise scarcely affects performance, making prostate segmentation practically feasible for general users.
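The abstract does not specify how the user's boundary-slice clicks are encoded for the network. A common convention for click-prompted segmentation is to rasterize each click as a Gaussian heatmap and stack it with the image as an extra input channel. The sketch below illustrates that idea for a 3D volume, with the prompt channel nonzero only on the two boundary slices; the function names, the Gaussian encoding, and the assumption that the boundary slices are the first and last slices of the volume are all illustrative, not taken from the paper.

```python
import numpy as np

def click_heatmap(shape, points, sigma=3.0):
    """Rasterize user clicks as a Gaussian heatmap on one 2D slice."""
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    hm = np.zeros(shape, dtype=np.float32)
    for (y, x) in points:
        g = np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2.0 * sigma ** 2))
        hm = np.maximum(hm, g)  # overlapping clicks keep the stronger response
    return hm

def build_prompted_input(volume, base_clicks, apex_clicks, sigma=3.0):
    """Stack a (D, H, W) volume with a click-prompt channel that is
    nonzero only on the two boundary slices (here: first and last).
    Returns a (2, D, H, W) array ready for a two-channel 3D network."""
    d, h, w = volume.shape
    prompt = np.zeros((d, h, w), dtype=np.float32)
    prompt[0] = click_heatmap((h, w), base_clicks, sigma)
    prompt[-1] = click_heatmap((h, w), apex_clicks, sigma)
    return np.stack([volume.astype(np.float32), prompt], axis=0)
```

A segmentation backbone would then consume the two-channel tensor, letting the few boundary clicks condition the full-volume prediction.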