Han Mei, Kunqian Li, Shuaixin Liu, Chengzhi Ma, Qianli Jiang
{"title":"DPF-Net:嵌入数据驱动的水下图像增强物理成像模型","authors":"Han Mei, Kunqian Li, Shuaixin Liu, Chengzhi Ma, Qianli Jiang","doi":"10.1016/j.isprsjprs.2025.07.031","DOIUrl":null,"url":null,"abstract":"Due to the complex interplay of light absorption and scattering in the underwater environment, underwater images experience significant degradation. This research presents a two-stage underwater image enhancement network called the Data-Driven and Physical Parameters Fusion Network (DPF-Net), which harnesses the robustness of physical imaging models alongside the generality and efficiency of data-driven methods. We train the Degraded Parameters Estimation Module (DPEM) on synthetic datasets with preset physical parameters as ground truth. This approach learns more authentic underwater imaging model, contrasting with prior works that directly fit raw-to-reference image mappings through the imaging equation. This module is subsequently trained in conjunction with an enhancement network, where the estimated physical parameters are integrated into a data-driven model within the embedding space. During model training, in addition to traditional reference-based losses, we use a degradation consistency loss to ensure physical consistency. Furthermore, we propose a new weak reference loss term that leverages the color distribution of the entire training set, thereby alleviating the reliance of our model on the quality of individual reference images. Our proposed DPF-Net demonstrates superior performance compared to other benchmark methods across multiple test sets, achieving state-of-the-art results. The source code and pre-trained models are available on the project home page: <ce:inter-ref xlink:href=\"https://github.com/OUCVisionGroup/DPF-Net\" xlink:type=\"simple\">https://github.com/OUCVisionGroup/DPF-Net</ce:inter-ref>.","PeriodicalId":50269,"journal":{"name":"ISPRS Journal of Photogrammetry and Remote Sensing","volume":"17 1","pages":""},"PeriodicalIF":12.2000,"publicationDate":"2025-08-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"DPF-Net: Physical imaging model embedded data-driven underwater image enhancement\",\"authors\":\"Han Mei, Kunqian Li, Shuaixin Liu, Chengzhi Ma, Qianli Jiang\",\"doi\":\"10.1016/j.isprsjprs.2025.07.031\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Due to the complex interplay of light absorption and scattering in the underwater environment, underwater images experience significant degradation. This research presents a two-stage underwater image enhancement network called the Data-Driven and Physical Parameters Fusion Network (DPF-Net), which harnesses the robustness of physical imaging models alongside the generality and efficiency of data-driven methods. We train the Degraded Parameters Estimation Module (DPEM) on synthetic datasets with preset physical parameters as ground truth. This approach learns more authentic underwater imaging model, contrasting with prior works that directly fit raw-to-reference image mappings through the imaging equation. This module is subsequently trained in conjunction with an enhancement network, where the estimated physical parameters are integrated into a data-driven model within the embedding space. During model training, in addition to traditional reference-based losses, we use a degradation consistency loss to ensure physical consistency. 
Furthermore, we propose a new weak reference loss term that leverages the color distribution of the entire training set, thereby alleviating the reliance of our model on the quality of individual reference images. Our proposed DPF-Net demonstrates superior performance compared to other benchmark methods across multiple test sets, achieving state-of-the-art results. The source code and pre-trained models are available on the project home page: <ce:inter-ref xlink:href=\\\"https://github.com/OUCVisionGroup/DPF-Net\\\" xlink:type=\\\"simple\\\">https://github.com/OUCVisionGroup/DPF-Net</ce:inter-ref>.\",\"PeriodicalId\":50269,\"journal\":{\"name\":\"ISPRS Journal of Photogrammetry and Remote Sensing\",\"volume\":\"17 1\",\"pages\":\"\"},\"PeriodicalIF\":12.2000,\"publicationDate\":\"2025-08-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ISPRS Journal of Photogrammetry and Remote Sensing\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.1016/j.isprsjprs.2025.07.031\",\"RegionNum\":1,\"RegionCategory\":\"地球科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"GEOGRAPHY, PHYSICAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ISPRS Journal of Photogrammetry and Remote Sensing","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1016/j.isprsjprs.2025.07.031","RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"GEOGRAPHY, PHYSICAL","Score":null,"Total":0}
DPF-Net: Physical imaging model embedded data-driven underwater image enhancement
Due to the complex interplay of light absorption and scattering in the underwater environment, underwater images suffer significant degradation. This research presents a two-stage underwater image enhancement network called the Data-Driven and Physical Parameters Fusion Network (DPF-Net), which combines the robustness of physical imaging models with the generality and efficiency of data-driven methods. We train the Degraded Parameters Estimation Module (DPEM) on synthetic datasets with preset physical parameters as ground truth. This approach learns a more authentic underwater imaging model, in contrast to prior works that directly fit raw-to-reference image mappings through the imaging equation. The module is subsequently trained jointly with an enhancement network, where the estimated physical parameters are integrated into the data-driven model within the embedding space. During model training, in addition to traditional reference-based losses, we use a degradation consistency loss to ensure physical consistency. Furthermore, we propose a new weak reference loss term that leverages the color distribution of the entire training set, thereby reducing our model's reliance on the quality of individual reference images. Our proposed DPF-Net outperforms other benchmark methods across multiple test sets, achieving state-of-the-art results. The source code and pre-trained models are available on the project home page: https://github.com/OUCVisionGroup/DPF-Net.
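The abstract names an imaging equation, a degradation consistency loss, and a weak reference loss without defining them on this page. The sketch below is a hypothetical illustration only: it assumes the standard underwater image formation model I = J*T + B*(1 - T) (raw image I, clear scene radiance J, per-channel transmission T, background light B) and simple L1 loss forms. The function names, tensor shapes, and the statistics-matching reading of the weak reference loss are all assumptions, not the authors' released implementation; see the GitHub repository above for the actual definitions.

import torch
import torch.nn.functional as F

def redegrade(J, T, B):
    """Re-apply the assumed imaging equation to an enhanced image J (N,3,H,W),
    using estimated transmission T (N,3,H,W) and background light B (N,3,1,1)."""
    return J * T + B * (1.0 - T)

def degradation_consistency_loss(raw, enhanced, T, B):
    """Penalize the gap between the observed raw image and the enhanced image
    pushed back through the imaging model (the L1 form is an assumption)."""
    return F.l1_loss(redegrade(enhanced, T, B), raw)

def weak_reference_loss(enhanced, set_mean, set_std):
    """One plausible reading of the weak reference idea: match the channel-wise
    color statistics of the output batch to statistics precomputed over the
    whole training set (set_mean/set_std, shape (3,)), instead of trusting
    any single reference image."""
    mu = enhanced.mean(dim=(0, 2, 3))    # per-channel mean over the batch
    sigma = enhanced.std(dim=(0, 2, 3))  # per-channel std over the batch
    return F.l1_loss(mu, set_mean) + F.l1_loss(sigma, set_std)

# Shape-only smoke test with random tensors (values are meaningless).
raw = torch.rand(2, 3, 64, 64)
enhanced = torch.rand(2, 3, 64, 64)
T = torch.rand(2, 3, 64, 64).clamp(0.1, 1.0)
B = torch.rand(2, 3, 1, 1)
print(degradation_consistency_loss(raw, enhanced, T, B).item())
print(weak_reference_loss(enhanced, torch.tensor([0.4, 0.5, 0.5]),
                          torch.tensor([0.2, 0.2, 0.2])).item())

Under this reading, re-degrading the enhanced output through the estimated parameters ties the enhancement network back to the physics, while the set-level color statistics act as a soft prior that no single, possibly imperfect, reference image has to supply.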
Journal introduction:
The ISPRS Journal of Photogrammetry and Remote Sensing (P&RS) is the official journal of the International Society for Photogrammetry and Remote Sensing (ISPRS). It serves as a platform for scientists and professionals worldwide working in the disciplines that use photogrammetry, remote sensing, spatial information systems, computer vision, and related fields. The journal aims to facilitate the communication and dissemination of advancements in these disciplines, while also serving as a comprehensive reference and archive.
P&RS endeavors to publish high-quality, peer-reviewed research papers, preferably original and not previously published. These papers may cover scientific/research, technological-development, or application/practical aspects. The journal also welcomes papers based on presentations from ISPRS meetings, provided they constitute significant contributions to the aforementioned fields.
In particular, P&RS encourages the submission of papers that are of broad scientific interest, showcase innovative applications (especially in emerging fields), have an interdisciplinary focus, discuss topics that have received limited attention in P&RS or related journals, or explore new directions in scientific or professional realms. It is preferred that theoretical papers include practical applications, while papers focusing on systems and applications should include a theoretical background.