DCUGAN: Dual Contrastive Learning GAN for Unsupervised Underwater Image Enhancement

IF 1.6 · JCR Q3 (Engineering, Electrical & Electronic) · CAS tier 4 (Computer Science)
Baodi Liu;Jing Tian;Zhenlong Wang;Weifeng Liu;Xinan Yuan;Wei Li
DOI: 10.23919/cje.2023.00.257
Chinese Journal of Electronics, vol. 34, no. 3, pp. 906-916, March 2025
https://ieeexplore.ieee.org/document/11060021/
Citations: 0

Abstract

Most existing deep learning-based underwater image enhancement methods rely heavily on synthetic paired underwater images, which limits their practicality and generalization. Unsupervised underwater image enhancement methods can be trained on unpaired data, overcoming the reliance on paired data. However, existing unsupervised methods suffer from poor color correction, artifacts, and blurry details in the generated images. Therefore, this paper proposes a dual generative adversarial network (GAN) with contrastive learning constraints to achieve unsupervised underwater image enhancement. First, we construct a dual GAN network for image transformation. Second, we use patch-based contrastive learning to maximize the mutual information between inputs and outputs, eliminating the reliance on paired data. Third, we use an image gradient difference loss to mitigate artifacts in the generated images. Finally, to address the problem of blurry details, we incorporate channel attention in the generator network so that it focuses on more important content, improving the quality of the generated images. Extensive experiments demonstrate that the results enhanced by our method show a clear improvement in visual quality.
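The patch-based mutual-information constraint described in the abstract is commonly realized as an InfoNCE (PatchNCE-style) loss: features of a patch in the output should match the feature of the same spatial location in the input (the positive), while all other patches serve as negatives. The sketch below is a minimal NumPy illustration of this general idea, not the authors' implementation; the feature matrices, their pairing, and the temperature `tau=0.07` are assumptions for illustration.

```python
import numpy as np

def patch_nce_loss(feat_q, feat_k, tau=0.07):
    """InfoNCE loss over patch features.

    feat_q: (N, D) features of N patches from the generated image.
    feat_k: (N, D) features of the same N spatial locations in the
            input image. Row i of feat_q is the positive for row i
            of feat_k; every other row acts as a negative.
    """
    # L2-normalize so dot products are cosine similarities.
    q = feat_q / np.linalg.norm(feat_q, axis=1, keepdims=True)
    k = feat_k / np.linalg.norm(feat_k, axis=1, keepdims=True)
    logits = q @ k.T / tau  # (N, N) similarity matrix

    # Cross-entropy with the matching patch (the diagonal) as label,
    # computed via a numerically stable log-softmax.
    m = logits.max(axis=1, keepdims=True)
    log_prob = logits - m - np.log(
        np.exp(logits - m).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))
```

Minimizing this loss pulls each output patch toward its own input location and pushes it away from all other locations, which is what removes the need for paired ground-truth images: the supervision signal comes from the input image itself.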
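The image gradient difference loss mentioned in the abstract penalizes mismatches between the spatial gradients of two images, which discourages the smeared edges and checkerboard artifacts GANs often produce. The following is a minimal NumPy sketch of the standard GDL formulation, not the paper's code; the choice of reference image and the exponent `alpha` are assumptions.

```python
import numpy as np

def gradient_difference_loss(pred, target, alpha=1.0):
    """Gradient difference loss (GDL) between two images.

    pred, target: (H, W) or (H, W, C) float arrays.
    alpha: exponent applied to the per-pixel gradient mismatch.
    """
    # Horizontal and vertical finite-difference gradient magnitudes.
    pred_dx = np.abs(pred[:, 1:] - pred[:, :-1])
    pred_dy = np.abs(pred[1:, :] - pred[:-1, :])
    tgt_dx = np.abs(target[:, 1:] - target[:, :-1])
    tgt_dy = np.abs(target[1:, :] - target[:-1, :])

    # Mean mismatch of gradient magnitudes along both axes.
    return (np.abs(pred_dx - tgt_dx) ** alpha).mean() + \
           (np.abs(pred_dy - tgt_dy) ** alpha).mean()
```

Because the loss compares edge maps rather than raw intensities, a generator can still recolor the scene freely while being pushed to reproduce the input's edge structure, which is why such terms are effective against artifacts in unpaired image-to-image translation.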
Source journal: Chinese Journal of Electronics (Engineering, Electrical & Electronic)
CiteScore: 3.70
Self-citation rate: 16.70%
Articles per year: 342
Review time: 12.0 months
About the journal: CJE focuses on the emerging fields of electronics, publishing innovative and transformative research papers. Most of the papers published in CJE come from universities and research institutes presenting their innovative research results. Both theoretical and practical contributions are encouraged, and original research papers reporting novel solutions to hot topics in electronics are strongly recommended.