CycleGAN-TMDE: An Image Dehazing Model Using Cycle Generative Adversarial Network with Transmission Map and Depth Estimation

IF 3.0 | JCR Q3 (Engineering, Electrical & Electronic) | CAS Region 4 (Computer Science)
Chinese Journal of Electronics, vol. 35, no. 1, pp. 377-391. Published: 2026-01-01 (Epub: 2026-04-13). DOI: 10.23919/cje.2024.00.141
Xinlai Guo; Yanyun Tao; Yuzhen Zhang; Xu Biao; Jianying Zheng; Guang Ji
Full text: https://ieeexplore.ieee.org/document/11480061/
Citations: 0

Abstract

Hazy conditions significantly reduce image contrast and obscure object boundaries, impairing vision-based tasks such as object detection, tracking, and scene understanding. Learning-based dehazing methods have achieved strong results on synthetic data, but on real-world hazy images current methods often produce low-quality haze-free outputs. In this study, we propose an image dehazing method based on the cycle generative adversarial network (CycleGAN) that integrates a transmission map and depth estimation (CycleGAN-TMDE). In CycleGAN-TMDE, we design a dehaze generator comprising a transmission map estimator and an atmospheric scattering model, so that the produced haze-free images respect real-world physical characteristics. To further improve the dehaze generator's capability, we adopt a depth estimator to synthesize hazy images while simultaneously using the dehaze generator to remove haze from these synthesized images. The cycle loss function compensates for the absence of matched hazy/clear sample pairs in unsupervised learning, and the adaptive loss function enhances the model's robustness, ensuring that when a haze-free image is given as input, CycleGAN-TMDE produces a similarly clear output. On the real-world hazy images of the realistic single image dehazing (RESIDE) dataset, CycleGAN-TMDE produces clearer and more natural haze-free images, with notably better visual quality for distant scenery, and yields favorable no-reference image quality assessment scores. On the synthetic hazy datasets RESIDE and Haze4k, CycleGAN-TMDE restores high-quality haze-free images, achieving peak signal-to-noise ratio and structural similarity index values comparable to supervised learning methods and outperforming other unsupervised learning methods.
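The dehaze generator described above couples a learned transmission map estimator with the standard atmospheric scattering model, I(x) = J(x)·t(x) + A·(1 − t(x)), where I is the hazy image, J the scene radiance, t the transmission, and A the global atmospheric light. The paper's transmission estimator is a trained network; the sketch below only shows the physics-based inversion step that such a generator would apply once t and A are available (the function name and the `t_min` floor are illustrative assumptions, not the authors' code):

```python
import numpy as np

def dehaze_asm(hazy, transmission, airlight, t_min=0.1):
    """Invert the atmospheric scattering model I = J*t + A*(1 - t)
    to recover scene radiance J = (I - A)/t + A.

    hazy:         H x W x 3 float array in [0, 1]
    transmission: H x W float array, per-pixel transmission t(x)
    airlight:     scalar (or length-3) global atmospheric light A
    t_min:        lower bound on t to avoid amplifying noise where haze is dense
    """
    t = np.clip(transmission, t_min, 1.0)[..., None]  # broadcast over channels
    return (hazy - airlight) / t + airlight
```

With a ground-truth t and A, this inversion recovers J exactly; in practice the quality of the result depends entirely on how well the transmission map estimator predicts t, which is what CycleGAN-TMDE trains adversarially.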
Source journal: Chinese Journal of Electronics (Engineering, Electrical & Electronic)
CiteScore: 3.70 | Self-citation rate: 16.70% | Articles per year: 342 | Review time: 12.0 months
Journal description: CJE focuses on emerging fields of electronics and publishes innovative and transformative research papers. Most papers published in CJE come from universities and research institutes. Both theoretical and practical contributions are encouraged, and original research papers reporting novel solutions to hot topics in electronics are strongly recommended.