Deep Color Constancy Using Spatio-Temporal Correlation of High-Speed Video

Dong-Jae Lee, Kang-Kyu Lee, Jong-Ok Kim
{"title":"Deep Color Constancy Using Spatio-Temporal Correlation of High-Speed Video","authors":"Dong-Jae Lee, Kang-Kyu Lee, Jong-Ok Kim","doi":"10.1109/VCIP53242.2021.9675406","DOIUrl":null,"url":null,"abstract":"After the invention of electric bulbs, most of lights surrounding our worlds are powered by alternative current (AC). This intensity variation can be captured with a high-speed camera, and we can utilize the intensity difference between consecutive video frames for various vision tasks. For color constancy, conventional methods usually focus on exploiting only the spatial feature. To overcome the limitations of conventional methods, a couple of methods to utilize AC flickering have been proposed. The previous work employed temporal correlation between high-speed video frames. To further enhance the previous work, we propose a deep spatio-temporal color constancy method using spatial and temporal correlations. To extract temporal features for illuminant estimation, we calculate the temporal correlation between feature maps where global features as well as local are learned. By learning global features through spatio-temporal correlation, the proposed method can estimate illumination more accurately, and is particularly robust to noisy practical environments. The experimental results demonstrate that the performance of the proposed method is superior to that of existing methods.","PeriodicalId":114062,"journal":{"name":"2021 International Conference on Visual Communications and Image Processing (VCIP)","volume":"16 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 International Conference on Visual Communications and Image Processing (VCIP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/VCIP53242.2021.9675406","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Since the invention of the electric bulb, most of the lights around us have been powered by alternating current (AC). The resulting intensity variation can be captured with a high-speed camera, and the intensity difference between consecutive video frames can be exploited for various vision tasks. For color constancy, conventional methods usually focus on exploiting only spatial features. To overcome their limitations, a few methods that utilize AC flickering have been proposed; previous work employed the temporal correlation between high-speed video frames. To further enhance that work, we propose a deep spatio-temporal color constancy method that uses both spatial and temporal correlations. To extract temporal features for illuminant estimation, we compute the temporal correlation between feature maps in which both local and global features are learned. By learning global features through spatio-temporal correlation, the proposed method estimates the illuminant more accurately and is particularly robust in noisy practical environments. Experimental results demonstrate that the proposed method outperforms existing methods.
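The abstract does not spell out the network architecture, but the core idea it describes, measuring the temporal correlation between feature maps of consecutive high-speed frames as a cue for the illuminant, can be sketched. The PyTorch snippet below is a minimal illustration under assumed design choices (a small shared convolutional backbone, cosine similarity between per-channel feature maps of adjacent frames, and a linear head regressing a unit-norm RGB illuminant); it is not the authors' actual model.

```python
# Hypothetical sketch of the temporal-correlation idea from the abstract:
# extract feature maps from consecutive high-speed video frames, correlate
# them across time, and regress an illuminant from the pooled correlations.
# The layer choices here are illustrative assumptions, not the paper's model.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TemporalCorrelationCC(nn.Module):
    def __init__(self, feat_dim=32):
        super().__init__()
        # Shared spatial feature extractor applied to every frame.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, feat_dim, 3, padding=1), nn.ReLU(),
            nn.Conv2d(feat_dim, feat_dim, 3, padding=1), nn.ReLU(),
        )
        # Regressor from pooled correlation features to an RGB illuminant.
        self.head = nn.Linear(feat_dim, 3)

    def forward(self, frames):
        # frames: (B, T, 3, H, W), T consecutive high-speed video frames.
        B, T, C, H, W = frames.shape
        feats = self.backbone(frames.view(B * T, C, H, W))
        feats = feats.view(B, T, -1, H, W)
        # Temporal correlation: cosine similarity between the feature maps
        # of adjacent frames, computed per channel over spatial locations.
        f0 = F.normalize(feats[:, :-1].flatten(3), dim=3)  # (B, T-1, F, HW)
        f1 = F.normalize(feats[:, 1:].flatten(3), dim=3)
        corr = (f0 * f1).sum(dim=3)   # (B, T-1, F) per-channel correlation
        pooled = corr.mean(dim=1)     # aggregate over all frame pairs
        # Unit-norm RGB illuminant estimate (chromaticity only).
        return F.normalize(self.head(pooled), dim=1)

# Usage: estimate the scene illuminant from 8 consecutive frames.
model = TemporalCorrelationCC()
video = torch.rand(1, 8, 3, 64, 64)  # synthetic stand-in for real frames
print(model(video))                  # (1, 3) normalized illuminant color
```

Under AC lighting, frame-to-frame intensity changes are driven largely by the flicker of the illuminant rather than by scene content, which is why correlating features across time can isolate illuminant information that purely spatial methods miss.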