DCUGAN: Dual Contrastive Learning GAN for Unsupervised Underwater Image Enhancement
Baodi Liu, Jing Tian, Zhenlong Wang, Weifeng Liu, Xinan Yuan, Wei Li
Chinese Journal of Electronics, vol. 34, no. 3, pp. 906-916, March 2025. DOI: 10.23919/cje.2023.00.257
https://ieeexplore.ieee.org/document/11060021/
Citations: 0
Abstract
Most existing deep learning-based underwater image enhancement methods rely heavily on synthetic paired underwater images, which limits their practicality and generalization. Unsupervised underwater image enhancement methods can be trained on unpaired data, overcoming the reliance on paired data. However, existing unsupervised methods suffer from poor color correction, artifacts, and blurry details in the generated images. Therefore, this paper proposes a dual generative adversarial network (GAN) with contrastive learning constraints to achieve unsupervised underwater image enhancement. First, we construct a dual GAN for image transformation. Second, we utilize patch-based contrastive learning to maximize the mutual information between inputs and outputs, eliminating the reliance on paired data. Third, we use an image gradient difference loss to mitigate artifacts in the generated images. Finally, to address the problem of blurry details, we incorporate channel attention in the generator network so that it focuses on more important content, improving the quality of the generated images. Extensive experiments demonstrate that the enhanced results of our method improve visual quality.
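The gradient difference loss mentioned in the abstract penalizes mismatches between the spatial gradients of the generated image and a reference image, which discourages blurred edges and high-frequency artifacts. The paper does not publish its implementation here; the following is a minimal NumPy sketch of a generic gradient difference loss of this kind, using simple finite differences (the function name and the `alpha` exponent are illustrative assumptions, not the authors' code):

```python
import numpy as np

def gradient_difference_loss(pred, target, alpha=1.0):
    """Illustrative gradient difference loss (GDL).

    pred, target: arrays of shape (H, W) or (H, W, C).
    Compares horizontal and vertical finite-difference gradients
    of the two images and returns the mean L1 gap, raised to alpha.
    """
    # finite differences along width (axis=1) and height (axis=0)
    dx_pred = np.abs(np.diff(pred, axis=1))
    dx_tgt = np.abs(np.diff(target, axis=1))
    dy_pred = np.abs(np.diff(pred, axis=0))
    dy_tgt = np.abs(np.diff(target, axis=0))

    # penalize the gap between gradient magnitudes
    loss = (np.abs(dx_pred - dx_tgt) ** alpha).mean() \
         + (np.abs(dy_pred - dy_tgt) ** alpha).mean()
    return float(loss)
```

In a GAN training loop this term would typically be added to the adversarial and contrastive losses with a weighting coefficient; identical images yield a loss of zero, while an output with spurious edges absent from the reference is penalized.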
Journal Introduction
CJE focuses on the emerging fields of electronics, publishing innovative and transformative research papers. Most of the papers published in CJE are from universities and research institutes, presenting their innovative research results. Both theoretical and practical contributions are encouraged, and original research papers reporting novel solutions to the hot topics in electronics are strongly recommended.