Zhihao Peng, Mang Hu, Xinyuan Qi, Sheng Wu, Qianqian Xia, Jianga Shang, Linquan Yang
Title: Lightweight local and global granularity selection optimization network for single image super-resolution
DOI: 10.1016/j.neunet.2025.108085
Journal: Neural Networks, Volume 193, Article 108085 (Impact Factor 6.3, JCR Q1, Computer Science, Artificial Intelligence)
Publication date: 2025-09-05 (Journal Article)
Full text: https://www.sciencedirect.com/science/article/pii/S0893608025009657
Citations: 0
Abstract
Recently, neural networks that combine local and global granularity features have made significant progress in single image super-resolution (SISR). However, when dealing with local granularity, these models often fuse features from coarse to fine in a linear manner, which leads to redundant feature representations and inefficient information extraction. Additionally, global granularity feature extraction is often compromised by interference from irrelevant features, which reduces the model’s ability to capture global dependencies and ultimately degrades reconstruction quality. In this paper, a lightweight local and global granularity selection optimization network (LGGSONet) is proposed to enhance feature extraction. First, we present a local granularity selection module (LGSM), which applies a novel nonlinear convolution method to dynamically fuse multi-scale features and adaptively select effective information. Next, we design a global granularity optimization module (GGOM), which uses global transposed attention for feature extraction while dynamically filtering out irrelevant spatial fine-grained features. Then, we construct a mixed granularity transformer block (MGTB) combining LGSM and GGOM. Finally, MGTB is integrated into the mixed granularity residual transformer group (MGRTG) to simplify network training. Extensive experiments show that LGGSONet based on MGRTG achieves a PSNR improvement of 0.30 dB over other advanced lightweight methods while requiring fewer parameters and lower computational cost.
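The abstract contrasts LGSM's "nonlinear" dynamic fusion with the usual linear coarse-to-fine summation of multi-scale branches. As a rough 1-D toy illustration of that idea (not the paper's actual module — the function names, the box-filter branches standing in for multi-scale convolutions, and the energy-based softmax gate are all assumptions for exposition), the sketch below gates each scale branch by a learned-style weight instead of summing them uniformly:

```python
import math


def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    e = [math.exp(x - m) for x in xs]
    s = sum(e)
    return [x / s for x in e]


def moving_average(x, k):
    """Box filter of width k with edge clamping: a stand-in for a k-tap conv branch."""
    n = len(x)
    out = []
    for i in range(n):
        lo = max(0, i - k // 2)
        hi = min(n, i + k // 2 + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))
    return out


def select_multiscale(x, kernel_sizes=(1, 3, 5)):
    """Nonlinear multi-scale fusion: instead of linearly summing branch outputs,
    softmax-gate the branches by their mean activation energy so that only the
    informative scales contribute. The energy criterion here is illustrative."""
    branches = [moving_average(x, k) for k in kernel_sizes]
    energies = [sum(abs(v) for v in b) / len(b) for b in branches]
    weights = softmax(energies)
    return [sum(w * b[i] for w, b in zip(weights, branches)) for i in range(len(x))]
```

In a real network the gate would be produced by learned layers rather than a fixed energy statistic, but the structural point is the same: the fusion weights depend on the input, so redundant scales can be suppressed rather than accumulated.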
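GGOM's "global transposed attention" refers to attention computed across the channel dimension rather than across spatial positions, so the attention map is C x C and its cost scales with channel count instead of image size. The abstract gives no implementation details, so the following is a minimal pure-Python sketch of channel-wise attention under that interpretation; the function signature, the matrix layout, and the temperature parameter are assumptions, not the paper's API:

```python
import math


def softmax(row):
    """Numerically stable softmax over one row of scores."""
    m = max(row)
    e = [math.exp(x - m) for x in row]
    s = sum(e)
    return [x / s for x in e]


def transposed_attention(q, k, v, temperature=1.0):
    """Channel-wise ("transposed") attention.

    q, k, v: C x N matrices (C channels, N = H*W flattened spatial positions).
    The attention map is C x C, so complexity is linear in spatial size,
    which is what makes this style of attention attractive for lightweight SR.
    """
    C = len(q)
    # scores[i][j] = <q_i, k_j> / temperature  (dot product over spatial axis)
    scores = [[sum(qi * kj for qi, kj in zip(q[i], k[j])) / temperature
               for j in range(C)] for i in range(C)]
    attn = [softmax(row) for row in scores]
    # out = attn @ v : mix channels of v by the C x C attention weights
    N = len(v[0])
    return [[sum(attn[i][j] * v[j][n] for j in range(C)) for n in range(N)]
            for i in range(C)]
```

The paper's GGOM additionally filters out irrelevant spatial fine-grained features before or within this attention; that filtering step is not reproduced here since the abstract does not specify it.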
Journal Introduction
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.