A High Dynamic Range Image Fusion Method Based on Dual Gain Image

Impact Factor 1.8 · JCR Q3 · Remote Sensing
Li Yuan, Wenbo Wu, Shuli Dong, Q. He, Feiran Zhang
{"title":"A High Dynamic Range Image Fusion Method Based on Dual Gain Image","authors":"Li Yuan, Wenbo Wu, Shuli Dong, Q. He, Feiran Zhang","doi":"10.1080/19479832.2022.2116492","DOIUrl":null,"url":null,"abstract":"ABSTRACT For a camera with automatic gain control, two images with high and low optical gain can be output at the same exposure time. Due to the small gain value, most of target details are hidden in the dark pixels for the low gain image, and the brightness saturation usually appears in high gain image for the high luminance areas. To obtain the essential information from the dual gain images, a generation method of high dynamic range image based on dual gain image was developed. The method is composed of five parts, including enhancement of image detail, establishment of Laplacian pyramid, selection of fusion operator, reconstruction of fusion pyramid and adjustment of image contrast. Results showed that combination of the gradient operator for N-1 layer and the neighbourhood filter operator for the Nth layer had better fusion effect. Moreover, based on the analysis of image information entropy and clarity, the fusion efficiency was calculated, and the fusion efficiency of Mertens’s method, Jiang’s method, Zhang’s method, Goshtasby’s method and the presented method was 30.5%, 33.5%, 39.5%, 51% and 99%, indicating that the HDR fusion method based on dual gain image is reliable.","PeriodicalId":46012,"journal":{"name":"International Journal of Image and Data Fusion","volume":"14 1","pages":"15 - 37"},"PeriodicalIF":1.8000,"publicationDate":"2022-08-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Image and Data Fusion","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/19479832.2022.2116492","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"REMOTE SENSING","Score":null,"Total":0}
引用次数: 0

Abstract

For a camera with automatic gain control, two images, one with high and one with low optical gain, can be output at the same exposure time. Because of the small gain value, most of the target details in the low-gain image are hidden in the dark pixels, while brightness saturation usually appears in the high-luminance areas of the high-gain image. To extract the essential information from the dual-gain images, a method for generating a high dynamic range (HDR) image from dual-gain images was developed. The method consists of five parts: enhancement of image detail, construction of the Laplacian pyramid, selection of the fusion operator, reconstruction of the fused pyramid, and adjustment of image contrast. Results showed that combining the gradient operator for the first N-1 layers with the neighbourhood filter operator for the Nth layer gave the better fusion effect. Moreover, the fusion efficiency was calculated from an analysis of image information entropy and clarity; the fusion efficiencies of Mertens's method, Jiang's method, Zhang's method, Goshtasby's method and the presented method were 30.5%, 33.5%, 39.5%, 51% and 99% respectively, indicating that the HDR fusion method based on dual-gain images is reliable.
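
As a rough illustration of the pipeline described above (Laplacian pyramid construction, a gradient operator on the finer layers, a neighbourhood filter on the Nth layer, pyramid reconstruction and contrast adjustment), the sketch below fuses one low-gain and one high-gain frame of the same scene. It is a minimal single-channel example built on OpenCV and NumPy under stated assumptions, not the authors' implementation: the detail-enhancement preprocessing step is omitted, and the function names, pyramid depth and filter sizes are illustrative choices.

```python
# Minimal sketch of dual-gain Laplacian-pyramid fusion (illustrative, not the
# paper's exact method). Assumes two aligned grayscale frames of equal size.
import cv2
import numpy as np

def laplacian_pyramid(img, levels):
    """Build a Laplacian pyramid: `levels` detail layers plus a coarse residual."""
    gauss = [img.astype(np.float32)]
    for _ in range(levels):
        gauss.append(cv2.pyrDown(gauss[-1]))
    lap = []
    for i in range(levels):
        up = cv2.pyrUp(gauss[i + 1], dstsize=(gauss[i].shape[1], gauss[i].shape[0]))
        lap.append(gauss[i] - up)
    lap.append(gauss[-1])  # Nth (coarsest) layer
    return lap

def gradient_weight(layer, eps=1e-6):
    """Per-pixel weight from gradient magnitude: stronger gradients carry more detail."""
    gx = cv2.Sobel(layer, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(layer, cv2.CV_32F, 0, 1, ksize=3)
    return np.sqrt(gx ** 2 + gy ** 2) + eps

def fuse_dual_gain(low_gain, high_gain, levels=4):
    """Fuse a low-gain and a high-gain image taken at the same exposure time."""
    lp_low = laplacian_pyramid(low_gain, levels)
    lp_high = laplacian_pyramid(high_gain, levels)

    fused = []
    # Finer layers (1..N-1): gradient-operator weighting between the two pyramids.
    for l_low, l_high in zip(lp_low[:-1], lp_high[:-1]):
        w_low, w_high = gradient_weight(l_low), gradient_weight(l_high)
        fused.append((w_low * l_low + w_high * l_high) / (w_low + w_high))
    # Nth layer: neighbourhood-filter (local average) blending of the residuals.
    fused.append(0.5 * (cv2.GaussianBlur(lp_low[-1], (5, 5), 0)
                        + cv2.GaussianBlur(lp_high[-1], (5, 5), 0)))

    # Reconstruct the fused pyramid from coarse to fine.
    out = fused[-1]
    for lap in reversed(fused[:-1]):
        out = cv2.pyrUp(out, dstsize=(lap.shape[1], lap.shape[0])) + lap

    # Simple contrast adjustment: stretch the result to the displayable range.
    out = cv2.normalize(out, None, 0, 255, cv2.NORM_MINMAX)
    return out.astype(np.uint8)

# Example usage with hypothetical file names:
# low = cv2.imread("low_gain.png", cv2.IMREAD_GRAYSCALE)
# high = cv2.imread("high_gain.png", cv2.IMREAD_GRAYSCALE)
# hdr = fuse_dual_gain(low, high)
```

Blending per pyramid level rather than at full resolution lets the gradient operator pick up detail from whichever frame carries it locally, while the low-frequency residuals are averaged smoothly, which keeps the fused result free of hard seams at the brightness boundaries.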
Source journal
CiteScore: 5.00
Self-citation rate: 0.00%
Articles published: 10
Journal description
International Journal of Image and Data Fusion provides a single source of information for all aspects of image and data fusion methodologies, developments, techniques and applications. Image and data fusion techniques are important for combining the many sources of satellite, airborne and ground based imaging systems, and integrating these with other related data sets for enhanced information extraction and decision making. Image and data fusion aims at the integration of multi-sensor, multi-temporal, multi-resolution and multi-platform image data, together with geospatial data, GIS, in-situ, and other statistical data sets for improved information extraction, as well as to increase the reliability of the information. This leads to more accurate information that provides for robust operational performance, i.e. increased confidence, reduced ambiguity and improved classification enabling evidence-based management. The journal welcomes original research papers, review papers, shorter letters, technical articles, book reviews and conference reports in all areas of image and data fusion including, but not limited to, the following aspects and topics:
• Automatic registration/geometric aspects of fusing images with different spatial, spectral, temporal resolutions; phase information; or acquired in different modes
• Pixel, feature and decision level fusion algorithms and methodologies
• Data assimilation: fusing data with models
• Multi-source classification and information extraction
• Integration of satellite, airborne and terrestrial sensor systems
• Fusing temporal data sets for change detection studies (e.g. for Land Cover/Land Use Change studies)
• Image and data mining from multi-platform, multi-source, multi-scale, multi-temporal data sets (e.g. geometric information, topological information, statistical information, etc.)