Gradient-Based Metrics for the Evaluation of Image Defogging

Impact Factor: 2.6 · JCR Q2 (Engineering, Electrical & Electronic)
Gerard deMas-Giménez, Pablo García-Gómez, Josep R. Casas, Santiago Royo
Journal: World Electric Vehicle Journal
DOI: 10.3390/wevj14090254
Published: 2023-09-09
Citations: 0

Abstract

Fog, haze, and smoke are common atmospheric phenomena that dramatically compromise the overall visibility of a scene, critically affecting features such as illumination, contrast, and the contour detection of objects. The decrease in visibility degrades the performance of computer vision algorithms such as pattern recognition and segmentation, some of which are highly relevant to decision-making in the field of autonomous vehicles. Several dehazing methods have been proposed that either need to estimate fog parameters through physical models or are statistically based. However, physical parameters depend strongly on scene conditions, and statistically based methods require large datasets of natural foggy images together with the original fog-free images, i.e., the ground truth, for evaluation. Obtaining proper fog-free ground truth images for pixel-to-pixel evaluation is costly and time-consuming, and this fact hinders progress in the field. This paper aims to tackle this issue by proposing gradient-based metrics for image defogging evaluation that require neither a fog-free ground truth image nor a physical model. A comparison of the proposed metrics with the metrics used in the NTIRE 2018 defogging challenge, as well as with several state-of-the-art defogging evaluation metrics, demonstrates their effectiveness in general situations, showing results comparable to conventional metrics and an improvement in the no-reference scenario. A Matlab implementation of the proposed metrics has been developed and is open-sourced in a public GitHub repository.
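To illustrate the general idea behind a gradient-based, no-reference defogging metric, the sketch below computes a mean gradient-magnitude score and compares it between a foggy and a defogged image. This is not the paper's actual metric (which is defined in the authors' Matlab repository); the function names and the gain-ratio formulation are hypothetical, and a pure-Python finite-difference gradient stands in for a proper image-gradient operator.

```python
# Hedged sketch of a gradient-based, no-reference defogging score.
# Assumption: images are 2-D lists of grayscale intensities in [0, 1].
# The helper names (mean_gradient_magnitude, defogging_gain) are
# illustrative, not the paper's terminology.

def mean_gradient_magnitude(img):
    """Mean finite-difference gradient magnitude over a 2-D image."""
    h, w = len(img), len(img[0])
    total, count = 0.0, 0
    for y in range(h - 1):
        for x in range(w - 1):
            gx = img[y][x + 1] - img[y][x]  # horizontal difference
            gy = img[y + 1][x] - img[y][x]  # vertical difference
            total += (gx * gx + gy * gy) ** 0.5
            count += 1
    return total / count

def defogging_gain(foggy, defogged):
    """Ratio > 1 suggests the defogged image recovered edge contrast."""
    return mean_gradient_magnitude(defogged) / mean_gradient_magnitude(foggy)
```

The intuition matches the abstract: fog suppresses contours and contrast, so a successful defogging step should raise the image's gradient energy, and that rise can be measured without any fog-free ground truth.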
Source journal

World Electric Vehicle Journal (Engineering – Automotive Engineering)
CiteScore: 4.50
Self-citation rate: 8.70%
Articles per year: 196
Review time: 8 weeks