{"title":"BGFlow: Brightness-guided normalizing flow for low-light image enhancement","authors":"Jiale Chen, Qiusheng Lian, Baoshun Shi","doi":"10.1016/j.displa.2024.102863","DOIUrl":null,"url":null,"abstract":"<div><div>Low-light image enhancement poses significant challenges due to its ill-posed nature. Recently, deep learning-based methods have attempted to establish a unified mapping relationship between normal-light images and their low-light versions but frequently struggle to capture the intricate variations in brightness conditions. As a result, these methods often suffer from overexposure, underexposure, amplified noise, and distorted colors. To tackle these issues, we propose a brightness-guided normalizing flow framework, dubbed BGFlow, for low-light image enhancement. Specifically, we recognize that low-frequency sub-bands in the wavelet domain carry significant brightness information. To effectively capture the intricate variations in brightness within an image, we design a transformer-based multi-scale wavelet-domain encoder to extract brightness information from the multi-scale features of the low-frequency sub-bands. The extracted brightness feature maps, at different scales, are then injected into the brightness-guided affine coupling layer to guide the training of the conditional normalizing flow module. Extensive experimental evaluations demonstrate the superiority of BGFlow over existing deep learning-based approaches in both qualitative and quantitative assessments. 
Moreover, we also showcase the exceptional performance of BGFlow on the underwater image enhancement task.</div></div>","PeriodicalId":50570,"journal":{"name":"Displays","volume":"85 ","pages":"Article 102863"},"PeriodicalIF":3.7000,"publicationDate":"2024-10-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Displays","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0141938224002270","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, HARDWARE & ARCHITECTURE","Score":null,"Total":0}
Citations: 0
Abstract
Low-light image enhancement poses significant challenges due to its ill-posed nature. Recently, deep learning-based methods have attempted to establish a unified mapping relationship between normal-light images and their low-light versions but frequently struggle to capture the intricate variations in brightness conditions. As a result, these methods often suffer from overexposure, underexposure, amplified noise, and distorted colors. To tackle these issues, we propose a brightness-guided normalizing flow framework, dubbed BGFlow, for low-light image enhancement. Specifically, we recognize that low-frequency sub-bands in the wavelet domain carry significant brightness information. To effectively capture the intricate variations in brightness within an image, we design a transformer-based multi-scale wavelet-domain encoder to extract brightness information from the multi-scale features of the low-frequency sub-bands. The extracted brightness feature maps, at different scales, are then injected into the brightness-guided affine coupling layer to guide the training of the conditional normalizing flow module. Extensive experimental evaluations demonstrate the superiority of BGFlow over existing deep learning-based approaches in both qualitative and quantitative assessments. Moreover, we also showcase the exceptional performance of BGFlow on the underwater image enhancement task.
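The abstract's core mechanism pairs two ideas: the low-frequency (LL) wavelet sub-band of an image carries most of its brightness information, and that brightness signal conditions an invertible affine coupling layer in a normalizing flow. The sketch below illustrates both ideas in NumPy under loud assumptions: it is not the authors' code, the tanh conditioner stands in for their transformer-based multi-scale wavelet-domain encoder, and all function names (`haar_ll`, `coupling_forward`, `coupling_inverse`) are hypothetical.

```python
import numpy as np

def haar_ll(img):
    """Low-frequency (LL) Haar sub-band: the average of each 2x2 block.
    Per the abstract, this sub-band carries significant brightness
    information. Assumes even height and width."""
    return 0.25 * (img[0::2, 0::2] + img[0::2, 1::2]
                   + img[1::2, 0::2] + img[1::2, 1::2])

def coupling_forward(x, brightness):
    """Brightness-conditioned affine coupling (toy stand-in).
    Half the channels pass through unchanged; the other half are
    scaled and shifted by a conditioner that sees both the passed-
    through half and the brightness feature map."""
    x1, x2 = np.split(x, 2, axis=-1)
    h = np.tanh(x1 + brightness)      # toy conditioner, not the paper's encoder
    log_s, t = h, 0.5 * h             # scale (in log space) and translation
    y2 = x2 * np.exp(log_s) + t
    # log-determinant of the Jacobian, needed for flow training
    return np.concatenate([x1, y2], axis=-1), log_s.sum()

def coupling_inverse(y, brightness):
    """Exact inverse: x1 is untouched, so the same conditioner output
    can be recomputed and the affine map undone."""
    y1, y2 = np.split(y, 2, axis=-1)
    h = np.tanh(y1 + brightness)
    log_s, t = h, 0.5 * h
    x2 = (y2 - t) * np.exp(-log_s)
    return np.concatenate([y1, x2], axis=-1)
```

Because only half of the input drives the conditioner, the layer is invertible by construction regardless of how complex the conditioner is — which is what lets a learned brightness encoder be "injected" into the flow without breaking exact likelihood computation.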
About the journal
Displays is the international journal covering the research and development of display technology, the effective presentation and perception of information, and applications and systems including the display-human interface.
Technical papers on practical developments in display technology provide an effective channel to promote greater understanding and cross-fertilization across the diverse disciplines of the Displays community. Original research papers solving ergonomics issues at the display-human interface advance the effective presentation of information. Tutorial papers covering fundamentals, intended for display technology and human-factors engineers new to the field, will also occasionally be featured.