ULD-CycleGAN: An Underwater Light Field and Depth Map-Optimized CycleGAN for Underwater Image Enhancement

IF 3.8, CAS Tier 2 (Engineering & Technology), JCR Q1 (ENGINEERING, CIVIL)
Gangping Zhang;Chaofeng Li;Jiajia Yan;Yuhui Zheng
{"title":"ULD-CycleGAN:用于水下图像增强的水下光场和深度图优化 CycleGAN","authors":"Gangping Zhang;Chaofeng Li;Jiajia Yan;Yuhui Zheng","doi":"10.1109/JOE.2024.3428624","DOIUrl":null,"url":null,"abstract":"Underwater imagery frequently exhibits a multitude of degradation phenomena, including chromatic aberrations, optical blurring, and diminished contrast, thereby exacerbating the complexity of underwater endeavors. Among the existing underwater image enhancement (UIE) methods, cycle-consistent generative adversarial network (CycleGAN)-based methods rely on unpaired data sets. Based on CycleGAN, we propose an underwater light field and depth map-optimized CycleGAN (ULD-CycleGAN) for UIE. First, an underwater light field and depth maps are obtained via multiscale Gaussian filtering and the Depth-Net network. Then, they are fed into an enhanced image generator with a dual encoding subnetwork (namely, light-subnet and depth-subnet) for independent encoding. Furthermore, a depth fusion module is designed to enhance the underwater modeling information interaction between these two subnetworks and improve the underwater modeling capabilities of the image enhancement generator. Moreover, a frequency-domain loss is proposed to augment the visual aesthetics of the generated images. Extensive experimental evaluations show that our proposed methodology achieves commendable results in terms of color correction, complex scenes, and luminance, surpassing the state-of-the-art UIE methods in comprehensive qualitative and quantitative assessments. Furthermore, underwater object detection experiments are conducted to further elucidate the efficacy of our ULD-CycleGAN.","PeriodicalId":13191,"journal":{"name":"IEEE Journal of Oceanic Engineering","volume":"49 4","pages":"1275-1288"},"PeriodicalIF":3.8000,"publicationDate":"2024-08-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"ULD-CycleGAN: An Underwater Light Field and Depth Map-Optimized CycleGAN for Underwater Image Enhancement\",\"authors\":\"Gangping Zhang;Chaofeng Li;Jiajia Yan;Yuhui Zheng\",\"doi\":\"10.1109/JOE.2024.3428624\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Underwater imagery frequently exhibits a multitude of degradation phenomena, including chromatic aberrations, optical blurring, and diminished contrast, thereby exacerbating the complexity of underwater endeavors. Among the existing underwater image enhancement (UIE) methods, cycle-consistent generative adversarial network (CycleGAN)-based methods rely on unpaired data sets. Based on CycleGAN, we propose an underwater light field and depth map-optimized CycleGAN (ULD-CycleGAN) for UIE. First, an underwater light field and depth maps are obtained via multiscale Gaussian filtering and the Depth-Net network. Then, they are fed into an enhanced image generator with a dual encoding subnetwork (namely, light-subnet and depth-subnet) for independent encoding. Furthermore, a depth fusion module is designed to enhance the underwater modeling information interaction between these two subnetworks and improve the underwater modeling capabilities of the image enhancement generator. Moreover, a frequency-domain loss is proposed to augment the visual aesthetics of the generated images. 
Extensive experimental evaluations show that our proposed methodology achieves commendable results in terms of color correction, complex scenes, and luminance, surpassing the state-of-the-art UIE methods in comprehensive qualitative and quantitative assessments. Furthermore, underwater object detection experiments are conducted to further elucidate the efficacy of our ULD-CycleGAN.\",\"PeriodicalId\":13191,\"journal\":{\"name\":\"IEEE Journal of Oceanic Engineering\",\"volume\":\"49 4\",\"pages\":\"1275-1288\"},\"PeriodicalIF\":3.8000,\"publicationDate\":\"2024-08-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Journal of Oceanic Engineering\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10647108/\",\"RegionNum\":2,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, CIVIL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Journal of Oceanic Engineering","FirstCategoryId":"5","ListUrlMain":"https://ieeexplore.ieee.org/document/10647108/","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, CIVIL","Score":null,"Total":0}
Citations: 0

Abstract

Underwater imagery frequently exhibits a multitude of degradation phenomena, including chromatic aberrations, optical blurring, and diminished contrast, thereby exacerbating the complexity of underwater endeavors. Among the existing underwater image enhancement (UIE) methods, cycle-consistent generative adversarial network (CycleGAN)-based methods rely on unpaired data sets. Based on CycleGAN, we propose an underwater light field and depth map-optimized CycleGAN (ULD-CycleGAN) for UIE. First, an underwater light field and depth maps are obtained via multiscale Gaussian filtering and the Depth-Net network. Then, they are fed into an enhanced image generator with a dual encoding subnetwork (namely, light-subnet and depth-subnet) for independent encoding. Furthermore, a depth fusion module is designed to enhance the underwater modeling information interaction between these two subnetworks and improve the underwater modeling capabilities of the image enhancement generator. Moreover, a frequency-domain loss is proposed to augment the visual aesthetics of the generated images. Extensive experimental evaluations show that our proposed methodology achieves commendable results in terms of color correction, complex scenes, and luminance, surpassing the state-of-the-art UIE methods in comprehensive qualitative and quantitative assessments. Furthermore, underwater object detection experiments are conducted to further elucidate the efficacy of our ULD-CycleGAN.
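The paper's code is not reproduced here, but the first step the abstract describes, estimating an underwater light field via multiscale Gaussian filtering, reads like Retinex-style illumination estimation. The sketch below is a minimal illustration under that reading; the function name, the scale set (15, 80, 250), and the equal-weight averaging are illustrative assumptions, not the authors' settings.

```python
# Minimal sketch of a multiscale Gaussian "light field" (illumination)
# estimate, assuming a Retinex-style reading of the abstract. The sigma
# values and equal-weight averaging are assumptions for illustration.
import numpy as np
from scipy.ndimage import gaussian_filter

def light_field_estimate(img: np.ndarray, sigmas=(15, 80, 250)) -> np.ndarray:
    """img: float32 RGB array in [0, 1], shape (H, W, 3).
    Returns a smooth illumination map of the same shape."""
    blurred = [
        np.stack([gaussian_filter(img[..., c], sigma=s) for c in range(3)],
                 axis=-1)
        for s in sigmas
    ]
    # Average the per-scale blurs: small sigmas keep local lighting
    # structure, large sigmas capture the global underwater color cast.
    return np.mean(blurred, axis=0)
```

In the pipeline the abstract outlines, such a map, together with a Depth-Net depth map, would then be encoded independently by the light-subnet and depth-subnet of the enhanced image generator.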
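The abstract also proposes a frequency-domain loss without giving its formula. A common formulation, shown here purely as a hedged sketch, penalizes the distance between 2-D FFT spectra; in unpaired CycleGAN training such a term is typically applied between an input image and its cycle reconstruction, since paired ground truth is unavailable.

```python
# Hedged sketch of a generic frequency-domain loss: L1 distance between
# the 2-D FFT spectra of two image batches. The exact form used by
# ULD-CycleGAN is not specified in the abstract; this is one common choice.
import torch

def frequency_domain_loss(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    """x, y: (N, C, H, W) image batches, e.g. an input image and its
    cycle reconstruction in unpaired training."""
    x_fft = torch.fft.fft2(x, norm="ortho")  # complex spectrum per channel
    y_fft = torch.fft.fft2(y, norm="ortho")
    # Compare real and imaginary parts so both amplitude and phase
    # mismatches contribute to the penalty.
    return (torch.view_as_real(x_fft) - torch.view_as_real(y_fft)).abs().mean()
```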
Source journal
IEEE Journal of Oceanic Engineering
Category: Engineering & Technology - Engineering, Ocean
CiteScore: 9.60
Self-citation rate: 12.20%
Annual articles: 86
Review time: 12 months
Journal description: The IEEE Journal of Oceanic Engineering (ISSN 0364-9059) is the online-only quarterly publication of the IEEE Oceanic Engineering Society (IEEE OES). The scope of the Journal is the field of interest of the IEEE OES, which encompasses all aspects of science, engineering, and technology that address research, development, and operations pertaining to all bodies of water. This includes the creation of new capabilities and technologies from concept design through prototypes, testing, and operational systems to sense, explore, understand, develop, use, and responsibly manage natural resources.