DPF-Net: Physical imaging model embedded data-driven underwater image enhancement

Impact Factor 12.2 · CAS Tier 1 (Earth Science) · JCR Q1 (Geography, Physical)
Han Mei, Kunqian Li, Shuaixin Liu, Chengzhi Ma, Qianli Jiang
{"title":"DPF-Net:嵌入数据驱动的水下图像增强物理成像模型","authors":"Han Mei, Kunqian Li, Shuaixin Liu, Chengzhi Ma, Qianli Jiang","doi":"10.1016/j.isprsjprs.2025.07.031","DOIUrl":null,"url":null,"abstract":"Due to the complex interplay of light absorption and scattering in the underwater environment, underwater images experience significant degradation. This research presents a two-stage underwater image enhancement network called the Data-Driven and Physical Parameters Fusion Network (DPF-Net), which harnesses the robustness of physical imaging models alongside the generality and efficiency of data-driven methods. We train the Degraded Parameters Estimation Module (DPEM) on synthetic datasets with preset physical parameters as ground truth. This approach learns more authentic underwater imaging model, contrasting with prior works that directly fit raw-to-reference image mappings through the imaging equation. This module is subsequently trained in conjunction with an enhancement network, where the estimated physical parameters are integrated into a data-driven model within the embedding space. During model training, in addition to traditional reference-based losses, we use a degradation consistency loss to ensure physical consistency. Furthermore, we propose a new weak reference loss term that leverages the color distribution of the entire training set, thereby alleviating the reliance of our model on the quality of individual reference images. Our proposed DPF-Net demonstrates superior performance compared to other benchmark methods across multiple test sets, achieving state-of-the-art results. The source code and pre-trained models are available on the project home page: <ce:inter-ref xlink:href=\"https://github.com/OUCVisionGroup/DPF-Net\" xlink:type=\"simple\">https://github.com/OUCVisionGroup/DPF-Net</ce:inter-ref>.","PeriodicalId":50269,"journal":{"name":"ISPRS Journal of Photogrammetry and Remote Sensing","volume":"17 1","pages":""},"PeriodicalIF":12.2000,"publicationDate":"2025-08-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"DPF-Net: Physical imaging model embedded data-driven underwater image enhancement\",\"authors\":\"Han Mei, Kunqian Li, Shuaixin Liu, Chengzhi Ma, Qianli Jiang\",\"doi\":\"10.1016/j.isprsjprs.2025.07.031\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Due to the complex interplay of light absorption and scattering in the underwater environment, underwater images experience significant degradation. This research presents a two-stage underwater image enhancement network called the Data-Driven and Physical Parameters Fusion Network (DPF-Net), which harnesses the robustness of physical imaging models alongside the generality and efficiency of data-driven methods. We train the Degraded Parameters Estimation Module (DPEM) on synthetic datasets with preset physical parameters as ground truth. This approach learns more authentic underwater imaging model, contrasting with prior works that directly fit raw-to-reference image mappings through the imaging equation. This module is subsequently trained in conjunction with an enhancement network, where the estimated physical parameters are integrated into a data-driven model within the embedding space. During model training, in addition to traditional reference-based losses, we use a degradation consistency loss to ensure physical consistency. 
Furthermore, we propose a new weak reference loss term that leverages the color distribution of the entire training set, thereby alleviating the reliance of our model on the quality of individual reference images. Our proposed DPF-Net demonstrates superior performance compared to other benchmark methods across multiple test sets, achieving state-of-the-art results. The source code and pre-trained models are available on the project home page: <ce:inter-ref xlink:href=\\\"https://github.com/OUCVisionGroup/DPF-Net\\\" xlink:type=\\\"simple\\\">https://github.com/OUCVisionGroup/DPF-Net</ce:inter-ref>.\",\"PeriodicalId\":50269,\"journal\":{\"name\":\"ISPRS Journal of Photogrammetry and Remote Sensing\",\"volume\":\"17 1\",\"pages\":\"\"},\"PeriodicalIF\":12.2000,\"publicationDate\":\"2025-08-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ISPRS Journal of Photogrammetry and Remote Sensing\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.1016/j.isprsjprs.2025.07.031\",\"RegionNum\":1,\"RegionCategory\":\"地球科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"GEOGRAPHY, PHYSICAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ISPRS Journal of Photogrammetry and Remote Sensing","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1016/j.isprsjprs.2025.07.031","RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"GEOGRAPHY, PHYSICAL","Score":null,"Total":0}
Citations: 0

Abstract

Due to the complex interplay of light absorption and scattering in the underwater environment, underwater images experience significant degradation. This research presents a two-stage underwater image enhancement network called the Data-Driven and Physical Parameters Fusion Network (DPF-Net), which harnesses the robustness of physical imaging models alongside the generality and efficiency of data-driven methods. We train the Degraded Parameters Estimation Module (DPEM) on synthetic datasets with preset physical parameters as ground truth. This approach learns a more authentic underwater imaging model, in contrast with prior works that directly fit raw-to-reference image mappings through the imaging equation. This module is subsequently trained jointly with an enhancement network, where the estimated physical parameters are integrated into a data-driven model within the embedding space. During model training, in addition to traditional reference-based losses, we use a degradation consistency loss to ensure physical consistency. Furthermore, we propose a new weak reference loss term that leverages the color distribution of the entire training set, thereby alleviating the reliance of our model on the quality of individual reference images. Our proposed DPF-Net demonstrates superior performance compared to other benchmark methods across multiple test sets, achieving state-of-the-art results. The source code and pre-trained models are available on the project home page: https://github.com/OUCVisionGroup/DPF-Net.
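The abstract does not spell out the imaging equation, but underwater enhancement work of this kind typically builds on the simplified revised image formation model. A minimal sketch follows, under the assumption that DPF-Net's "preset physical parameters" (veiling light and transmission/attenuation) follow this common convention:

```latex
% Simplified underwater image formation model, per color channel c:
%   I_c : observed degraded image,    J_c : scene radiance (clean image),
%   B_c : veiling light (backscatter), t_c : transmission map,
%   beta_c : attenuation coefficient,  d(x) : scene depth at pixel x.
\[
  I_c(x) = J_c(x)\, t_c(x) + B_c \bigl(1 - t_c(x)\bigr),
  \qquad
  t_c(x) = e^{-\beta_c\, d(x)}
\]
```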
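As a rough illustration of how a degradation consistency loss can enforce physical consistency, here is a minimal PyTorch sketch: re-degrade the enhanced image with the estimated physical parameters and penalize any mismatch with the raw input. All function names and tensor shapes are hypothetical, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def degradation_consistency_loss(raw: torch.Tensor,
                                 enhanced: torch.Tensor,
                                 backscatter: torch.Tensor,
                                 transmission: torch.Tensor) -> torch.Tensor:
    """Hypothetical sketch of a degradation consistency loss.

    raw, enhanced : (B, 3, H, W) images in [0, 1]
    backscatter   : (B, 3, 1, 1) estimated veiling light B_c
    transmission  : (B, 3, H, W) estimated transmission map t_c(x)
    """
    # Re-apply the imaging model to the enhanced image using the
    # estimated parameters; the result should match the raw input.
    re_degraded = enhanced * transmission + backscatter * (1.0 - transmission)
    return F.l1_loss(re_degraded, raw)
```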
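The weak reference loss is described only as leveraging "the color distribution of the entire training set." One plausible reading, sketched below with hypothetical names, is to match per-channel color statistics of the output against statistics pooled over all reference images, rather than against a single (possibly imperfect) reference:

```python
import torch
import torch.nn.functional as F

def weak_reference_loss(enhanced: torch.Tensor,
                        ref_mean: torch.Tensor,
                        ref_std: torch.Tensor) -> torch.Tensor:
    """Hypothetical sketch of a dataset-level weak reference loss.

    enhanced : (B, 3, H, W) enhanced images in [0, 1]
    ref_mean : (3,) per-channel mean over all reference images
    ref_std  : (3,) per-channel std over all reference images
    """
    # Compare the batch's per-channel color statistics with
    # statistics precomputed over the whole training set.
    mean = enhanced.mean(dim=(2, 3))  # (B, 3)
    std = enhanced.std(dim=(2, 3))    # (B, 3)
    return (F.l1_loss(mean, ref_mean.expand_as(mean))
            + F.l1_loss(std, ref_std.expand_as(std)))
```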
Source journal
ISPRS Journal of Photogrammetry and Remote Sensing (Engineering & Technology: Imaging Science & Photographic Technology)
CiteScore: 21.00
Self-citation rate: 6.30%
Annual publications: 273
Review time: 40 days
About the journal: The ISPRS Journal of Photogrammetry and Remote Sensing (P&RS) is the official journal of the International Society for Photogrammetry and Remote Sensing (ISPRS). It serves as a platform for scientists and professionals worldwide working in photogrammetry, remote sensing, spatial information systems, computer vision, and related fields, facilitating communication and dissemination of advances in these disciplines while also serving as a comprehensive source of reference and archive.

P&RS publishes high-quality, peer-reviewed research papers that are preferably original and previously unpublished, covering scientific/research, technological development, or application/practical aspects. The journal also welcomes papers based on presentations at ISPRS meetings, provided they constitute significant contributions to the above fields.

In particular, P&RS encourages submissions that are of broad scientific interest, showcase innovative applications (especially in emerging fields), have an interdisciplinary focus, address topics that have received limited attention in P&RS or related journals, or explore new directions in scientific or professional realms. Theoretical papers should preferably include practical applications, while papers on systems and applications should include a theoretical background.