Semi-Supervised Underwater Image Enhancement Network Boosted by Depth Map Consistency

IF 3.8 · Zone 2 (Engineering & Technology) · Q1 ENGINEERING, CIVIL
Fengqi Xiao;Jiahui Liu;Yifan Huang;En Cheng;Fei Yuan
{"title":"Semi-Supervised Underwater Image Enhancement Network Boosted by Depth Map Consistency","authors":"Fengqi Xiao;Jiahui Liu;Yifan Huang;En Cheng;Fei Yuan","doi":"10.1109/JOE.2024.3487350","DOIUrl":null,"url":null,"abstract":"Underwater optical images are a crucial information source for autonomous underwater vehicles during underwater development and exploration. When these vehicles are in operation, they need to capture high-quality images for information extraction and analysis and require depth map information from the underwater scene to maintain the vehicle's posture balance, obstacle avoidance, and navigation. However, the absorption and scattering of light in water result in low-quality underwater images, significantly affecting the execution of these tasks. In response to these challenges, this article proposes a physically guided, semi-supervised dual-loop network for underwater image enhancement. This network is designed to accomplish high-quality underwater image enhancement and depth map estimation simultaneously. First, the revised underwater image formation model is employed to guide a two-stage network in decomposing and reconstructing underwater images. The depth map consistency of the scene and piecewise cycle consistency loss are utilized to ensure the reliability of the image transformation process. In another loop, a self-augmentation module based on inherent optical properties is introduced to enhance the robustness of the decomposition network. A multimodal discriminator is incorporated to form piecewise adversarial loss to improve the visual quality of the images. Through extensive experimental evaluation and analysis, the proposed method not only demonstrates outstanding performance in underwater image enhancement and depth map estimation but also reveals the relationships between various physical quantities during the degradation process of underwater images, enhancing the physical interpretability of the neural network.","PeriodicalId":13191,"journal":{"name":"IEEE Journal of Oceanic Engineering","volume":"50 2","pages":"795-816"},"PeriodicalIF":3.8000,"publicationDate":"2025-02-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Journal of Oceanic Engineering","FirstCategoryId":"5","ListUrlMain":"https://ieeexplore.ieee.org/document/10879147/","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, CIVIL","Score":null,"Total":0}
Citations: 0

Abstract

Underwater optical images are a crucial information source for autonomous underwater vehicles during underwater development and exploration. When these vehicles are in operation, they need to capture high-quality images for information extraction and analysis, and they require depth map information from the underwater scene for posture balance, obstacle avoidance, and navigation. However, the absorption and scattering of light in water result in low-quality underwater images, significantly affecting the execution of these tasks. In response to these challenges, this article proposes a physically guided, semi-supervised dual-loop network for underwater image enhancement. This network is designed to accomplish high-quality underwater image enhancement and depth map estimation simultaneously. First, the revised underwater image formation model is employed to guide a two-stage network in decomposing and reconstructing underwater images. The depth map consistency of the scene and piecewise cycle consistency loss are utilized to ensure the reliability of the image transformation process. In another loop, a self-augmentation module based on inherent optical properties is introduced to enhance the robustness of the decomposition network. A multimodal discriminator is incorporated to form a piecewise adversarial loss that improves the visual quality of the images. Through extensive experimental evaluation and analysis, the proposed method not only demonstrates outstanding performance in underwater image enhancement and depth map estimation but also reveals the relationships between various physical quantities during the degradation process of underwater images, enhancing the physical interpretability of the neural network.
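The abstract names the revised underwater image formation model as the physical prior guiding the decomposition and reconstruction stages, but does not spell it out. The sketch below illustrates the general form of that model (separate per-channel coefficients for direct attenuation and backscatter, as in the commonly used revised formulation), assuming illustrative NumPy code with hypothetical parameter names rather than the paper's actual implementation.

```python
import numpy as np

def reconstruct_underwater_image(clear_rgb, depth, beta_d, beta_b, veiling_light):
    """Recompose a degraded underwater image from a clear image and a depth map.

    General form of the revised underwater image formation model:
        I_c = J_c * exp(-beta_d_c * z) + B_c * (1 - exp(-beta_b_c * z))
    with separate per-channel coefficients for direct attenuation (beta_d)
    and backscatter (beta_b). All names and values here are illustrative.
    """
    z = depth[..., None]                                        # (H, W, 1) range in metres
    direct = clear_rgb * np.exp(-beta_d * z)                    # attenuated scene radiance
    backscatter = veiling_light * (1.0 - np.exp(-beta_b * z))   # veiling-light term
    return np.clip(direct + backscatter, 0.0, 1.0)

# Toy usage: a synthetic 4x4 scene with plausible coefficients for ocean water.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clear = rng.random((4, 4, 3))                 # enhanced / clear image J
    depth = rng.uniform(1.0, 10.0, size=(4, 4))   # depth map z in metres
    beta_d = np.array([0.45, 0.12, 0.10])         # red attenuates fastest under water
    beta_b = np.array([0.35, 0.15, 0.12])
    b_inf = np.array([0.05, 0.35, 0.45])          # bluish-green veiling light
    degraded = reconstruct_underwater_image(clear, depth, beta_d, beta_b, b_inf)
    print(degraded.shape, degraded.min(), degraded.max())
```

In a dual-loop design of the kind described above, a formation model in this form lets the network re-synthesize the degraded input from its estimated clear image, depth map, and water parameters, which is what makes the depth map consistency and piecewise cycle consistency losses mentioned in the abstract possible.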
Source journal
IEEE Journal of Oceanic Engineering (Engineering & Technology; Engineering: Oceanic)
CiteScore: 9.60
Self-citation rate: 12.20%
Articles per year: 86
Review time: 12 months
Journal description: The IEEE Journal of Oceanic Engineering (ISSN 0364-9059) is the online-only quarterly publication of the IEEE Oceanic Engineering Society (IEEE OES). The scope of the Journal is the field of interest of the IEEE OES, which encompasses all aspects of science, engineering, and technology that address research, development, and operations pertaining to all bodies of water. This includes the creation of new capabilities and technologies from concept design through prototypes, testing, and operational systems to sense, explore, understand, develop, use, and responsibly manage natural resources.