Jiwei Shan, Zixin Zhang, Hao Li, Cheng-Tai Hsieh, Yirui Li, Wenhua Wu, Hesheng Wang
{"title":"UW-DNeRF:利用不确定性引导的深度监督和局部信息整合进行可变形软组织重建","authors":"Jiwei Shan;Zixin Zhang;Hao Li;Cheng-Tai Hsieh;Yirui Li;Wenhua Wu;Hesheng Wang","doi":"10.1109/TMI.2025.3550269","DOIUrl":null,"url":null,"abstract":"Reconstructing deformable soft tissues from endoscopic videos is a critical yet challenging task. Leveraging depth priors, deformable implicit neural representations have seen significant advancements in this field. However, depth priors from pre-trained depth estimation models are often coarse, and inaccurate depth supervision can severely impair the performance of these neural networks. Moreover, existing methods overlook local similarities in input sequences, which restricts their effectiveness in capturing local details and tissue deformations. In this paper, we introduce UW-DNeRF, a novel approach utilizing neural radiance fields for high-quality reconstruction of deformable tissues. We propose an uncertainty-guided depth supervision strategy to mitigate the impact of inaccurate depth information. This strategy relaxes hard depth constraints and unlocks the potential of implicit neural representations. In addition, we design a local window-based information sharing scheme. This scheme employs local window and keyframe deformation networks to construct deformations with local awareness and enhances the model’s ability to capture fine details. We demonstrate the superiority of our method over state-of-the-art approaches on synthetic and in vivo endoscopic datasets. Code is available at: <uri>https://github.com/IRMVLab/UW-DNeRF</uri>.","PeriodicalId":94033,"journal":{"name":"IEEE transactions on medical imaging","volume":"44 7","pages":"2808-2818"},"PeriodicalIF":0.0000,"publicationDate":"2025-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"UW-DNeRF: Deformable Soft Tissue Reconstruction With Uncertainty-Guided Depth Supervision and Local Information Integration\",\"authors\":\"Jiwei Shan;Zixin Zhang;Hao Li;Cheng-Tai Hsieh;Yirui Li;Wenhua Wu;Hesheng Wang\",\"doi\":\"10.1109/TMI.2025.3550269\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Reconstructing deformable soft tissues from endoscopic videos is a critical yet challenging task. Leveraging depth priors, deformable implicit neural representations have seen significant advancements in this field. However, depth priors from pre-trained depth estimation models are often coarse, and inaccurate depth supervision can severely impair the performance of these neural networks. Moreover, existing methods overlook local similarities in input sequences, which restricts their effectiveness in capturing local details and tissue deformations. In this paper, we introduce UW-DNeRF, a novel approach utilizing neural radiance fields for high-quality reconstruction of deformable tissues. We propose an uncertainty-guided depth supervision strategy to mitigate the impact of inaccurate depth information. This strategy relaxes hard depth constraints and unlocks the potential of implicit neural representations. In addition, we design a local window-based information sharing scheme. This scheme employs local window and keyframe deformation networks to construct deformations with local awareness and enhances the model’s ability to capture fine details. We demonstrate the superiority of our method over state-of-the-art approaches on synthetic and in vivo endoscopic datasets. 
Code is available at: <uri>https://github.com/IRMVLab/UW-DNeRF</uri>.\",\"PeriodicalId\":94033,\"journal\":{\"name\":\"IEEE transactions on medical imaging\",\"volume\":\"44 7\",\"pages\":\"2808-2818\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2025-03-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE transactions on medical imaging\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10922782/\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on medical imaging","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10922782/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
UW-DNeRF: Deformable Soft Tissue Reconstruction With Uncertainty-Guided Depth Supervision and Local Information Integration
Reconstructing deformable soft tissues from endoscopic videos is a critical yet challenging task. Leveraging depth priors, deformable implicit neural representations have seen significant advancements in this field. However, depth priors from pre-trained depth estimation models are often coarse, and inaccurate depth supervision can severely impair the performance of these neural networks. Moreover, existing methods overlook local similarities in input sequences, which restricts their effectiveness in capturing local details and tissue deformations. In this paper, we introduce UW-DNeRF, a novel approach utilizing neural radiance fields for high-quality reconstruction of deformable tissues. We propose an uncertainty-guided depth supervision strategy to mitigate the impact of inaccurate depth information. This strategy relaxes hard depth constraints and unlocks the potential of implicit neural representations. In addition, we design a local window-based information sharing scheme. This scheme employs local window and keyframe deformation networks to construct deformations with local awareness and enhances the model’s ability to capture fine details. We demonstrate the superiority of our method over state-of-the-art approaches on synthetic and in vivo endoscopic datasets. Code is available at: https://github.com/IRMVLab/UW-DNeRF.
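To make the idea of "uncertainty-guided depth supervision" concrete, below is a minimal, hypothetical sketch of an uncertainty-weighted depth loss. The abstract does not give the paper's exact formulation, so this is only an illustrative assumption: a per-ray uncertainty predicted by the network down-weights rays whose coarse depth prior is unreliable, relaxing the hard depth constraint while a log penalty keeps the uncertainty from growing without bound. The function name, tensor shapes, and the Laplacian negative log-likelihood form are all assumptions, not the authors' implementation.

```python
# Hypothetical sketch of uncertainty-guided depth supervision (not the authors'
# exact method): rays with a large predicted uncertainty receive weaker
# supervision from the coarse depth prior.
import torch


def uncertainty_guided_depth_loss(pred_depth: torch.Tensor,
                                  prior_depth: torch.Tensor,
                                  sigma: torch.Tensor,
                                  eps: float = 1e-6) -> torch.Tensor:
    """Laplacian NLL-style depth loss (illustrative only).

    pred_depth:  depth rendered from the radiance field, shape (N_rays,)
    prior_depth: coarse depth prior from a pre-trained estimator, shape (N_rays,)
    sigma:       predicted per-ray uncertainty (> 0), shape (N_rays,)
    """
    sigma = sigma.clamp_min(eps)
    residual = (pred_depth - prior_depth).abs()
    # Large sigma -> the residual on that ray is down-weighted, but the
    # log(sigma) term discourages inflating sigma everywhere.
    return (residual / sigma + torch.log(sigma)).mean()


if __name__ == "__main__":
    # Random tensors standing in for one training batch of rays.
    n_rays = 1024
    pred = torch.rand(n_rays) * 0.5 + 0.5
    prior = pred + 0.05 * torch.randn(n_rays)                    # noisy coarse prior
    sigma = torch.nn.functional.softplus(torch.randn(n_rays))    # predicted uncertainty
    print(uncertainty_guided_depth_loss(pred, prior, sigma))
```

For the actual loss used in UW-DNeRF, the deformation networks, and the local window-based keyframe scheme, refer to the released code linked above.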