A dense optical flow map-guided convolutional neural network for multi-exposure image fusion in dynamic scenes

Minjie Wan, Bolin Chen, Yunkai Xu, Pengqiang Ge, Xiaofang Kong, Guohua Gu, Qian Chen

Optics and Laser Technology, vol. 192, Article 114086 (published 2025-10-19). DOI: 10.1016/j.optlastec.2025.114086
Citations: 0
Abstract
In recent years, multi-exposure image fusion-based high dynamic range (HDR) imaging technology has attracted widespread attention in optical instrumentation and measurement fields such as 3D measurement, industrial welding, and biological testing. However, existing methods are prone to producing ghosting artifacts when handling dynamic scenes, which significantly limits their applicability. To address this issue, we propose a dense optical flow map-guided convolutional neural network for multi-exposure image fusion in dynamic scenes, called DOFM-HDRNet. The network leverages an attention module guided by dense optical flow maps computed between the non-reference and reference images, which helps to highlight and retain complementary features of the reference image during feature extraction, while suppressing motion and overexposed regions and enhancing image details. Additionally, the fusion module employs dilated residual dense blocks (DRDBs) to expand the receptive field, enabling more accurate estimation of regions with missing details caused by motion or overexposure. Qualitative and quantitative experiments demonstrate that, compared with existing methods, the proposed approach achieves improvements of 0.40 dB in PSNR-μ and 0.048 % in HDR-VDP-2 on the Kalantari dataset, and exhibits strong suppression of ghosting artifacts in HDR imaging for dynamic scenes.
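For readers who want a concrete picture of the two components the abstract names, the following is a minimal PyTorch sketch. It is illustrative only: the class name FlowGuidedAttention, the channel widths, layer counts, and the exact gating formulation are assumptions made here, not the authors' released code; only the module roles (flow-guided attention for suppressing motion/overexposed regions, and DRDBs for enlarging the receptive field) come from the abstract.

    # Illustrative sketch: hyperparameters and the gating formulation are
    # assumptions; only the two module roles come from the abstract above.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class FlowGuidedAttention(nn.Module):
        # Gates non-reference features using attention weights predicted
        # from reference features, non-reference features, and a dense
        # optical flow map (2 channels: horizontal/vertical displacement).
        def __init__(self, feat_ch=64, flow_ch=2):
            super().__init__()
            self.att = nn.Sequential(
                nn.Conv2d(2 * feat_ch + flow_ch, feat_ch, 3, padding=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(feat_ch, feat_ch, 3, padding=1),
                nn.Sigmoid(),  # per-pixel, per-channel weights in [0, 1]
            )

        def forward(self, non_ref_feat, ref_feat, flow):
            # Motion/overexposed regions receive low weights and are
            # suppressed before fusion; complementary regions pass through.
            w = self.att(torch.cat([non_ref_feat, ref_feat, flow], dim=1))
            return non_ref_feat * w

    class DRDB(nn.Module):
        # Dilated residual dense block: densely connected dilated 3x3
        # convolutions, a 1x1 fusion convolution, and a local residual
        # connection; dilation enlarges the receptive field cheaply.
        def __init__(self, ch=64, growth=32, n_layers=3, dilation=2):
            super().__init__()
            self.convs = nn.ModuleList()
            c = ch
            for _ in range(n_layers):
                self.convs.append(
                    nn.Conv2d(c, growth, 3, padding=dilation, dilation=dilation))
                c += growth
            self.fuse = nn.Conv2d(c, ch, 1)  # squeeze back to `ch` channels

        def forward(self, x):
            feats = [x]
            for conv in self.convs:
                feats.append(F.relu(conv(torch.cat(feats, dim=1))))
            return x + self.fuse(torch.cat(feats, dim=1))

    # Usage: gate a non-reference exposure's features, then fuse.
    f_non_ref = torch.randn(1, 64, 128, 128)
    f_ref = torch.randn(1, 64, 128, 128)
    flow = torch.randn(1, 2, 128, 128)   # dense optical flow map
    fused = DRDB()(FlowGuidedAttention()(f_non_ref, f_ref, flow))

The sigmoid gate reflects the abstract's description of highlighting complementary features while suppressing motion and overexposure; the dense connections inside the DRDB let each dilated convolution reuse all earlier features, which is what makes the enlarged receptive field effective for hallucinating regions with missing details.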
About the Journal
Optics & Laser Technology aims to provide a vehicle for the publication of a broad range of high-quality research and review papers in those fields of scientific and engineering research appertaining to the development and application of the technology of optics and lasers. Papers describing original work in these areas are submitted to rigorous refereeing prior to acceptance for publication.
The scope of Optics & Laser Technology encompasses, but is not restricted to, the following areas:
• developments in all types of lasers
• developments in optoelectronic devices and photonics
• developments in new photonics and optical concepts
• developments in conventional optics, optical instruments and components
• techniques of optical metrology, including interferometry and optical fibre sensors
• LIDAR and other non-contact optical measurement techniques, including optical methods in heat and fluid flow
• applications of lasers to materials processing, optical NDT, display (including holography) and optical communication
• research and development in the field of laser safety, including studies of hazards resulting from the applications of lasers (laser safety, hazards of laser fume)
• developments in optical computing and optical information processing
• developments in new optical materials
• developments in new optical characterization methods and techniques
• developments in quantum optics
• developments in light-assisted micro- and nanofabrication methods and techniques
• developments in nanophotonics and biophotonics
• developments in image processing and systems