SRFLOW-Attention: Super Resolution with Multi-Head Attention for Effective Use of Low-Resolution Information

Impact Factor: 1.0 · JCR Q4 · Engineering, Electrical & Electronic
Shinya Ohtani, Nobutaka Kuroki, Masahiro Numa
DOI: 10.1002/tee.24086
Journal: IEEJ Transactions on Electrical and Electronic Engineering, Vol. 19, No. 8, pp. 1360–1368
Published: 2024-04-25 (Journal Article)
URL: https://onlinelibrary.wiley.com/doi/10.1002/tee.24086
Citations: 0

Abstract


This paper proposes normalizing flow-based image super-resolution techniques using attention modules. In the proposed method, the features of the low-resolution images are extracted using a Swin Transformer. Furthermore, multi-head attention in the flow layers makes effective use of the feature maps. This architecture enables the efficient injection of low-resolution image features extracted by the transformer into the flow layer. Experimental results at x4 magnifications showed that the proposed method achieved state-of-the-art performance for quantitative metrics and visual quality among single-loss architectures. © 2024 Institute of Electrical Engineers of Japan and Wiley Periodicals LLC.
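The abstract describes multi-head attention in the flow layers attending over feature maps extracted by a Swin Transformer. The paper's own implementation is not shown here, but the injection mechanism it names can be illustrated with a minimal cross-attention sketch, assuming a standard scaled-dot-product multi-head formulation; the function names, token shapes, and random projections below are hypothetical stand-ins for learned weights.

```python
# Hypothetical sketch (not the authors' code): flow-layer activations act as
# queries over low-resolution feature tokens from the transformer, so each
# flow-layer position can gather relevant LR information via attention.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_cross_attention(query, context, num_heads, rng):
    """query: (Nq, d) flow-layer tokens; context: (Nc, d) LR feature tokens."""
    Nq, d = query.shape
    assert d % num_heads == 0
    dh = d // num_heads
    # Random projections stand in for learned weight matrices in this sketch.
    Wq, Wk, Wv, Wo = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(4))
    # Project and split into heads: (num_heads, tokens, dh)
    Q = (query @ Wq).reshape(Nq, num_heads, dh).transpose(1, 0, 2)
    K = (context @ Wk).reshape(-1, num_heads, dh).transpose(1, 0, 2)
    V = (context @ Wv).reshape(-1, num_heads, dh).transpose(1, 0, 2)
    # Scaled dot-product attention per head, then merge heads back to (Nq, d).
    attn = softmax(Q @ K.transpose(0, 2, 1) / np.sqrt(dh), axis=-1)
    out = (attn @ V).transpose(1, 0, 2).reshape(Nq, d)
    return out @ Wo

rng = np.random.default_rng(0)
flow_tokens = rng.standard_normal((16, 32))   # activations inside a flow layer
lr_features = rng.standard_normal((64, 32))   # Swin-Transformer feature tokens
injected = multi_head_cross_attention(flow_tokens, lr_features, num_heads=4, rng=rng)
print(injected.shape)  # -> (16, 32): one conditioned vector per flow token
```

In the paper's architecture the output of such an attention step would condition the flow's coupling transformations; this sketch only shows the attention computation itself.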

Source Journal
IEEJ Transactions on Electrical and Electronic Engineering (Engineering: Electrical & Electronic)
CiteScore: 2.70
Self-citation rate: 10.00%
Articles per year: 199
Review time: 4.3 months
Journal description: IEEJ Transactions on Electrical and Electronic Engineering (hereinafter called TEEE) publishes six times per year as an official journal of the Institute of Electrical Engineers of Japan (hereinafter "IEEJ"). This peer-reviewed journal contains original research papers and review articles on the most important and latest technological advances in core areas of Electrical and Electronic Engineering and in related disciplines. The journal also publishes short communications reporting on the results of the latest research activities. TEEE aims to provide a new forum for IEEJ members in Japan as well as fellow researchers in Electrical and Electronic Engineering from around the world to exchange ideas and research findings.