Information-Growth Attention Network for Image Super-Resolution

Zhuangzi Li, Ge Li, Thomas H. Li, Shan Liu, Wei Gao
DOI: 10.1145/3474085.3475207
Published in: Proceedings of the 29th ACM International Conference on Multimedia
Publication date: 2021-10-17
Citations: 8

Abstract

It is generally known that a high-resolution (HR) image contains more information than its low-resolution (LR) versions, so image super-resolution (SR) can be viewed as an information-growth process. Exploiting this property, we attempt to capture the growing information via a dedicated attention mechanism. In this paper, we propose a concise but effective Information-Growth Attention Network (IGAN), which shows that this incremental information is beneficial for SR. Specifically, we propose a novel information-growth attention that attends to features with large information-growth capacity by assimilating the difference between the current features and earlier features within the network. We also illustrate its effectiveness, in contrast to widely used self-attention, through entropy and generalization analysis. Furthermore, existing channel-wise attention generation modules (CAGMs) suffer from large informational attenuation because they directly compute a global mean over feature maps. We therefore present a new CAGM that progressively decreases the spatial size of feature maps, leading to more adequate feature exploitation. Extensive experiments demonstrate that IGAN outperforms state-of-the-art attention-aware SR approaches.
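The abstract describes two ideas: attention weights derived from the *difference* between current and earlier features (the "information growth"), and a CAGM that reduces spatial size in stages rather than with a single global mean. The paper does not give implementation details here, so the following is only a minimal NumPy sketch under assumed shapes and a simple halving scheme; the function name, the stage-wise 2x2 average pooling, and the sigmoid gating are illustrative assumptions, not the authors' actual architecture.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def information_growth_attention(current, former):
    """Hypothetical sketch: channel attention from the feature difference.

    current, former: feature maps of shape (C, H, W) from two points
    in the network. Returns re-weighted current features, same shape.
    """
    growth = current - former  # incremental ("information-growth") signal
    # Progressive size reduction instead of one global mean: repeatedly
    # average-pool the growth map 2x2 until each channel is a scalar.
    desc = growth
    while min(desc.shape[1:]) > 1:
        C, H, W = desc.shape
        desc = desc[:, : H - H % 2, : W - W % 2]  # crop to even size
        desc = desc.reshape(C, H // 2, 2, W // 2, 2).mean(axis=(2, 4))
    weights = sigmoid(desc.reshape(-1))  # one gate per channel, in (0, 1)
    return current * weights[:, None, None]

cur = np.random.rand(4, 8, 8)
prev = np.random.rand(4, 8, 8)
out = information_growth_attention(cur, prev)
print(out.shape)  # (4, 8, 8)
```

Because the per-channel descriptor is built by staged pooling rather than a single mean over H*W values, intermediate spatial structure of the growth signal survives longer, which is the intuition the abstract gives for the proposed CAGM.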