Real-time and universal network for volumetric imaging from microscale to macroscale at high resolution

Light: Science & Applications · IF 20.6 · Q1 (Optics)
Bingzhi Lin, Feng Xing, Liwei Su, Kekuan Wang, Yulan Liu, Diming Zhang, Xusan Yang, Huijun Tan, Zhijing Zhu, Depeng Wang
{"title":"Real-time and universal network for volumetric imaging from microscale to macroscale at high resolution","authors":"Bingzhi Lin, Feng Xing, Liwei Su, Kekuan Wang, Yulan Liu, Diming Zhang, Xusan Yang, Huijun Tan, Zhijing Zhu, Depeng Wang","doi":"10.1038/s41377-025-01842-w","DOIUrl":null,"url":null,"abstract":"<p>Light-field imaging has wide applications in various domains, including microscale life science imaging, mesoscale neuroimaging, and macroscale fluid dynamics imaging. The development of deep learning-based reconstruction methods has greatly facilitated high-resolution light-field image processing, however, current deep learning-based light-field reconstruction methods have predominantly concentrated on the microscale. Considering the multiscale imaging capacity of light-field technique, a network that can work over variant scales of light-field image reconstruction will significantly benefit the development of volumetric imaging. Unfortunately, to our knowledge, no one has reported a universal high-resolution light-field image reconstruction algorithm that is compatible with microscale, mesoscale, and macroscale. To fill this gap, we present a real-time and universal network (RTU-Net) to reconstruct high-resolution light-field images at any scale. RTU-Net, as the first network that works over multiscale light-field image reconstruction, employs an adaptive loss function based on generative adversarial theory and consequently exhibits strong generalization capability. We comprehensively assessed the performance of RTU-Net through the reconstruction of multiscale light-field images, including microscale tubulin and mitochondrion dataset, mesoscale synthetic mouse neuro dataset, and macroscale light-field particle imaging velocimetry dataset. The results indicated that RTU-Net has achieved real-time and high-resolution light-field image reconstruction for volume sizes ranging from 300 μm × 300 μm × 12 μm to 25 mm × 25 mm × 25 mm, and demonstrated higher resolution when compared with recently reported light-field reconstruction networks. The high-resolution, strong robustness, high efficiency, and especially the general applicability of RTU-Net will significantly deepen our insight into high-resolution and volumetric imaging.</p>","PeriodicalId":18069,"journal":{"name":"Light-Science & Applications","volume":"22 1","pages":""},"PeriodicalIF":20.6000,"publicationDate":"2025-04-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Light-Science & Applications","FirstCategoryId":"1089","ListUrlMain":"https://doi.org/10.1038/s41377-025-01842-w","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"OPTICS","Score":null,"Total":0}
引用次数: 0

Abstract

Light-field imaging has wide applications in various domains, including microscale life science imaging, mesoscale neuroimaging, and macroscale fluid dynamics imaging. The development of deep learning-based reconstruction methods has greatly facilitated high-resolution light-field image processing; however, current deep learning-based light-field reconstruction methods have concentrated predominantly on the microscale. Given the multiscale imaging capacity of the light-field technique, a network that can reconstruct light-field images across different scales would significantly benefit the development of volumetric imaging. Unfortunately, to our knowledge, no universal high-resolution light-field image reconstruction algorithm compatible with the microscale, mesoscale, and macroscale has been reported. To fill this gap, we present a real-time and universal network (RTU-Net) that reconstructs high-resolution light-field images at any scale. As the first network designed for multiscale light-field image reconstruction, RTU-Net employs an adaptive loss function based on generative adversarial theory and consequently exhibits strong generalization capability. We comprehensively assessed the performance of RTU-Net by reconstructing multiscale light-field images, including a microscale tubulin and mitochondrion dataset, a mesoscale synthetic mouse neuron dataset, and a macroscale light-field particle image velocimetry dataset. The results indicate that RTU-Net achieves real-time, high-resolution light-field image reconstruction for volume sizes ranging from 300 μm × 300 μm × 12 μm to 25 mm × 25 mm × 25 mm, and delivers higher resolution than recently reported light-field reconstruction networks. The high resolution, strong robustness, high efficiency, and especially the general applicability of RTU-Net will significantly deepen our insight into high-resolution volumetric imaging.
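The abstract attributes RTU-Net's generalization to an adaptive loss function grounded in generative adversarial theory but does not spell out its form here. The sketch below is a rough illustration only: it assumes a generator objective that combines a pixel-wise reconstruction term with a non-saturating adversarial term, where the adversarial weight is adapted from the relative gradient magnitudes of the two terms. The function name, inputs, and weighting rule are hypothetical and are not taken from the paper.

```python
import torch
import torch.nn.functional as F

def adaptive_adversarial_loss(pred_volume, target_volume, disc_logits_fake, last_layer_weight):
    """Hypothetical sketch of an adaptive, GAN-based reconstruction loss.

    Combines a pixel-wise fidelity term with a non-saturating adversarial
    term; the adversarial weight is adapted from the ratio of the two terms'
    gradients w.r.t. the generator's last layer, so neither term dominates.
    """
    # Pixel-wise fidelity between the reconstructed and reference volumes.
    recon_loss = F.l1_loss(pred_volume, target_volume)

    # Non-saturating generator loss: push discriminator logits on fakes upward.
    adv_loss = F.softplus(-disc_logits_fake).mean()

    # Adapt the adversarial weight from gradient magnitudes (assumed rule).
    recon_grad = torch.autograd.grad(recon_loss, last_layer_weight, retain_graph=True)[0]
    adv_grad = torch.autograd.grad(adv_loss, last_layer_weight, retain_graph=True)[0]
    adaptive_weight = (recon_grad.norm() / (adv_grad.norm() + 1e-6)).clamp(0.0, 1e4).detach()

    return recon_loss + adaptive_weight * adv_loss
```

Adapting the weight from gradient magnitudes keeps the adversarial term from overwhelming the fidelity term early in training; this is one common realization of an adaptive GAN loss and may differ from the formulation actually used in RTU-Net.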

