Frequency-Split Inception Transformer for Image Super-Resolution

Wei Xu
{"title":"Frequency-Split Inception Transformer for Image Super-Resolution","authors":"Wei Xu","doi":"10.1145/3609703.3609708","DOIUrl":null,"url":null,"abstract":"Transformer models have shown remarkable effectiveness in capturing long-range dependencies and extracting features for single image super-resolution. However, their deployment on edge devices is hindered by their high computational complexity. To address this challenge, we propose Inception Swin Transformer (IST), a novel model that leverages frequency domain separation to reduce redundant computations.In IST, we exploit the strengths of both CNN-based networks and Transformer variants to handle high-frequency and low-frequency features, respectively. By dynamically utilizing frequency factors to separate feature maps, IST ensures that different components are processed appropriately. Additionally, IST maintains a balanced trade-off between model speed and performance by gradually reducing the proportion of high-frequency components.Our experiments demonstrate that IST effectively reduces the FLOPs while preserving high performance. The combination of Transformers’ accuracy and CNN variants’ efficiency enables IST to significantly reduce computational strain without compromising quality. Comparative analysis reveals that IST outperforms other models, achieving superior results with less FLOPs.","PeriodicalId":101485,"journal":{"name":"Proceedings of the 2023 5th International Conference on Pattern Recognition and Intelligent Systems","volume":"25 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-07-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2023 5th International Conference on Pattern Recognition and Intelligent Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3609703.3609708","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Transformer models have shown remarkable effectiveness in capturing long-range dependencies and extracting features for single-image super-resolution. However, their deployment on edge devices is hindered by their high computational complexity. To address this challenge, we propose the Inception Swin Transformer (IST), a novel model that leverages frequency-domain separation to reduce redundant computation. In IST, we exploit the strengths of CNN-based networks and Transformer variants to handle high-frequency and low-frequency features, respectively. By dynamically using frequency factors to split feature maps, IST ensures that each component is processed by the branch best suited to it. Additionally, IST maintains a balanced trade-off between speed and performance by gradually reducing the proportion of high-frequency components. Our experiments demonstrate that IST substantially reduces FLOPs while preserving high performance. By combining the accuracy of Transformers with the efficiency of CNN variants, IST significantly reduces computational cost without compromising quality. Comparative analysis shows that IST outperforms other models, achieving superior results with fewer FLOPs.
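The abstract describes a channel-wise frequency split inside each block: a fraction of the channels (the high-frequency part) is routed through a lightweight convolutional branch, the remaining low-frequency channels go through an attention branch, and the two outputs are fused. The sketch below illustrates this idea in PyTorch under stated assumptions: the class and parameter names (`FrequencySplitBlock`, `high_freq_ratio`) are illustrative, not from the paper, and plain multi-head self-attention stands in for the Swin-style windowed attention that IST uses.

```python
# Minimal sketch of a frequency-split block, assuming a simple channel split
# with a conv branch for high-frequency channels and an attention branch for
# low-frequency channels. Names and structure are illustrative only.
import torch
import torch.nn as nn


class FrequencySplitBlock(nn.Module):
    """Split channels by a frequency ratio: a convolutional branch processes
    the high-frequency channels, a (simplified) attention branch processes the
    low-frequency channels, and the outputs are fused back together."""

    def __init__(self, dim: int, high_freq_ratio: float = 0.5, num_heads: int = 4):
        super().__init__()
        self.hf_dim = int(dim * high_freq_ratio)   # channels routed to the CNN branch
        self.lf_dim = dim - self.hf_dim            # channels routed to the attention branch

        # High-frequency branch: cheap local convolutions (depthwise + pointwise).
        self.hf_branch = nn.Sequential(
            nn.Conv2d(self.hf_dim, self.hf_dim, 3, padding=1, groups=self.hf_dim),
            nn.Conv2d(self.hf_dim, self.hf_dim, 1),
            nn.GELU(),
        )

        # Low-frequency branch: plain multi-head self-attention over spatial tokens
        # (a stand-in for the Swin-style windowed attention described in the paper).
        self.norm = nn.LayerNorm(self.lf_dim)
        self.attn = nn.MultiheadAttention(self.lf_dim, num_heads, batch_first=True)

        # 1x1 convolution to fuse the two branches back into `dim` channels.
        self.fuse = nn.Conv2d(dim, dim, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        x_hf, x_lf = torch.split(x, [self.hf_dim, self.lf_dim], dim=1)

        hf_out = self.hf_branch(x_hf)

        tokens = x_lf.flatten(2).transpose(1, 2)           # (B, H*W, C_lf)
        tokens = self.norm(tokens)
        attn_out, _ = self.attn(tokens, tokens, tokens)
        lf_out = attn_out.transpose(1, 2).reshape(b, self.lf_dim, h, w)

        return self.fuse(torch.cat([hf_out, lf_out], dim=1)) + x


if __name__ == "__main__":
    # Deeper blocks could be built with a smaller high_freq_ratio, mirroring the
    # gradual reduction of high-frequency channels mentioned in the abstract.
    block = FrequencySplitBlock(dim=64, high_freq_ratio=0.5)
    y = block(torch.randn(1, 64, 48, 48))
    print(y.shape)  # torch.Size([1, 64, 48, 48])
```

Because attention cost grows quadratically with the number of tokens while the conv branch is linear, shrinking the high-frequency ratio (or, conversely, the attention width) per stage is the lever the abstract refers to for trading FLOPs against quality.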