Lightweight local and global granularity selection optimization network for single image super-resolution

Impact Factor: 6.3 · JCR Q1, Computer Science, Artificial Intelligence
Zhihao Peng, Mang Hu, Xinyuan Qi, Sheng Wu, Qianqian Xia, Jianga Shang, Linquan Yang
DOI: 10.1016/j.neunet.2025.108085
Journal: Neural Networks, Volume 193, Article 108085
Published: 2025-09-05
URL: https://www.sciencedirect.com/science/article/pii/S0893608025009657
Citations: 0

Abstract

Recently, neural networks that combine local and global granularity features have made significant progress in single image super-resolution (SISR). However, when dealing with local granularity, these models often fuse features from coarse to fine in a linear manner, which leads to redundant feature representations and inefficient information extraction. Additionally, global granularity feature extraction is often compromised by interference from irrelevant features, which reduces the model's ability to capture global dependencies and ultimately degrades reconstruction quality. In this paper, a lightweight local and global granularity selection optimization network (LGGSONet) is proposed to enhance feature extraction. First, we present a local granularity selection module (LGSM), which applies a novel nonlinear convolution method to dynamically fuse multi-scale features and adaptively select effective information. Next, we design a global granularity optimization module (GGOM), which uses global transposed attention for feature extraction while dynamically filtering out irrelevant spatial fine-grained features. Then, we construct a mixed granularity transformer block (MGTB) combining LGSM and GGOM. Finally, MGTB is integrated into the mixed granularity residual transformer group (MGRTG) to simplify network training. Extensive experiments show that LGGSONet based on MGRTG achieves a PSNR improvement of 0.30 dB over other advanced lightweight methods while using fewer parameters and lower computational cost.
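The abstract names "global transposed attention" as the core of GGOM but does not give implementation details. The sketch below illustrates the general transposed-attention idea found in lightweight restoration networks: attention is computed across channels rather than spatial positions, so the attention map is C×C and its cost does not grow with image resolution. This is a minimal NumPy illustration of the named mechanism, not the paper's actual implementation; the normalization and temperature choices here are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def transposed_attention(x, temperature=1.0):
    """Channel-wise ("transposed") self-attention.

    x: array of shape (C, H*W) -- a feature map flattened over its
    spatial dimensions. The attention matrix has shape (C, C), so the
    quadratic cost is in the channel count, not the spatial size --
    the property that keeps this global attention lightweight.
    """
    # L2-normalize each channel so the affinity is cosine-like.
    n = x / np.linalg.norm(x, axis=1, keepdims=True)
    attn = softmax(n @ n.T * temperature, axis=-1)  # (C, C) channel affinities
    return attn @ x                                 # re-weighted features, (C, H*W)
```

In practice such a block would use learned query/key/value projections and a learnable temperature; the paper's GGOM additionally filters out irrelevant spatial fine-grained features, a gating step omitted here.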
Source journal: Neural Networks (Engineering/Technology – Computer Science, Artificial Intelligence)
CiteScore: 13.90
Self-citation rate: 7.70%
Articles per year: 425
Review time: 67 days
Journal description: Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. The journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussion between biology and technology, it aims to encourage the development of biologically inspired artificial intelligence.