Light field image super-resolution using a Content-Aware Spatial–Angular Interaction network

IF 3.7 · JCR Q1, Computer Science, Hardware & Architecture · CAS Region 2 (Engineering Technology)
Wuzhen Shi, Jiajun Su, Yang Wen, Yutao Liu
DOI: 10.1016/j.displa.2024.102782
Journal: Displays, Volume 84, Article 102782
Publication date: 2024-06-19
URL: https://www.sciencedirect.com/science/article/pii/S014193822400146X
Citations: 0

Abstract

Light field images record a series of viewpoints of a scene and thus have many attractive applications. However, the trade-off between angular and spatial resolution in the imaging process makes light field image super-resolution necessary. In this paper, we propose a Content-Aware Spatial–Angular Interaction (dubbed CASAI) network for light field image super-resolution. The gradient branch of CASAI makes full use of the contextual information in the low-resolution gradient map and the multi-level features of the super-resolution branch to generate a high-resolution gradient map, which makes the network aware of structure, texture and detail, and provides effective prior knowledge for the super-resolution process. The super-resolution branch of CASAI generates high-quality super-resolution images by exploiting intra-view (i.e., spatial) and inter-view (i.e., angular) information through a spatial–angular adaptive interaction block, with the high-resolution gradient prior as guidance. The spatial–angular adaptive interaction block accounts for the differing importance of spatial and angular features, so that intra-view and inter-view information are better integrated to improve super-resolution performance. Experimental results indicate that at an upsampling factor of 4, our method outperforms our baseline (LF-InterNet) with an average PSNR gain of 0.48 dB, while also achieving the best SSIM. Visualization results demonstrate the advantage of our method in simultaneously generating natural SR images and restoring structures.
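To make the spatial–angular distinction concrete, the following is a minimal numpy sketch of how a light field array can be rearranged into the two complementary views the abstract refers to: per-view (spatial) sub-aperture images and per-pixel (angular) macro-pixels. The function names, array layout, and the scalar weighting in `adaptive_fuse` are illustrative assumptions, not the paper's actual code; in CASAI the relative weighting of spatial and angular features is predicted by the network rather than fixed.

```python
import numpy as np

def spatial_angular_split(lf):
    """Split a light field of shape (U, V, H, W) — U x V angular views,
    each an H x W image — into spatial and angular arrangements."""
    U, V, H, W = lf.shape
    # Spatial arrangement: the U*V sub-aperture images, shape (U*V, H, W).
    # Intra-view processing operates within each of these images.
    spatial = lf.reshape(U * V, H, W)
    # Angular arrangement: for each spatial position, the U x V
    # macro-pixel of samples across views, shape (H*W, U, V).
    # Inter-view processing operates within each macro-pixel.
    angular = lf.transpose(2, 3, 0, 1).reshape(H * W, U, V)
    return spatial, angular

def adaptive_fuse(spatial_feat, angular_feat, alpha):
    """Blend same-shaped spatial and angular feature maps.
    alpha in [0, 1] trades off the two sources; here it is a fixed
    scalar stand-in for the content-aware weight the network learns."""
    return alpha * spatial_feat + (1.0 - alpha) * angular_feat
```

Running `spatial_angular_split` on a 2x2-view light field with 3x3-pixel views yields 4 sub-aperture images and 9 macro-pixels; the first macro-pixel collects the top-left pixel of every view.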

Source journal: Displays (Engineering Technology – Electronics & Electrical Engineering)
CiteScore: 4.60
Self-citation rate: 25.60%
Articles per year: 138
Review time: 92 days
Journal description: Displays is the international journal covering the research and development of display technology, its effective presentation and perception of information, and applications and systems including the display–human interface. Technical papers on practical developments in display technology provide an effective channel to promote greater understanding and cross-fertilization across the diverse disciplines of the Displays community. Original research papers solving ergonomics issues at the display–human interface advance effective presentation of information. Tutorial papers covering fundamentals, intended for display technology and human factors engineers new to the field, will also occasionally be featured.