{"title":"OMLK-Net:一种用于图像超分辨率的在线多尺度大可分离核蒸馏网络","authors":"Hanjia Wei, Weiwei Wang, Xixi Jia, Xiangchu Feng, Chuan Chen","doi":"10.1016/j.sigpro.2025.110078","DOIUrl":null,"url":null,"abstract":"<div><div>Single-image super-resolution (SISR) has seen remarkable progress in recent years, driven by the powerful learning capabilities of large-scale neural networks, such as deep CNNs and Transformers. However, these advances come at the expense of substantial computational costs. Striking a delicate balance between effectiveness and efficiency remains a key challenge in neural network design. This paper proposes OMLK-Net, a novel lightweight architecture for SISR, offering the dual advantages of computational efficiency and high effectiveness. OMLK-Net adopts a divide-and-conquer strategy to separately optimize local and nonlocal feature learning, enabling a lightweight architecture without compromising feature representation effectiveness. Specifically, our OMLK-Net comprises two key modules: an Online Multiscale Distillation Block (OMDB) and Large Separable Shifting Kernel Attention (L2SKA) blocks. The OMDB module aims to explore multiscale local contextual information with a customized lightweight network block; while the L2SKA aims to harness nonlocal features by using computationally efficient large separable shifting kernels. By virtue of its carefully designed local and nonlocal feature extraction operators, OMLK-Net effectively addresses SISR challenges while maintaining low computational complexity. Extensive experimental results on benchmark datasets demonstrate that OMLK-Net achieves a better trade-off against state-of-the-art methods in terms of performance and model complexity. Codes will be available soon.</div></div>","PeriodicalId":49523,"journal":{"name":"Signal Processing","volume":"237 ","pages":"Article 110078"},"PeriodicalIF":3.4000,"publicationDate":"2025-05-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"OMLK-Net: An Online Multi-scale Large Separable Kernel Distillation Network for efficient image super-resolution\",\"authors\":\"Hanjia Wei, Weiwei Wang, Xixi Jia, Xiangchu Feng, Chuan Chen\",\"doi\":\"10.1016/j.sigpro.2025.110078\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Single-image super-resolution (SISR) has seen remarkable progress in recent years, driven by the powerful learning capabilities of large-scale neural networks, such as deep CNNs and Transformers. However, these advances come at the expense of substantial computational costs. Striking a delicate balance between effectiveness and efficiency remains a key challenge in neural network design. This paper proposes OMLK-Net, a novel lightweight architecture for SISR, offering the dual advantages of computational efficiency and high effectiveness. OMLK-Net adopts a divide-and-conquer strategy to separately optimize local and nonlocal feature learning, enabling a lightweight architecture without compromising feature representation effectiveness. Specifically, our OMLK-Net comprises two key modules: an Online Multiscale Distillation Block (OMDB) and Large Separable Shifting Kernel Attention (L2SKA) blocks. The OMDB module aims to explore multiscale local contextual information with a customized lightweight network block; while the L2SKA aims to harness nonlocal features by using computationally efficient large separable shifting kernels. 
By virtue of its carefully designed local and nonlocal feature extraction operators, OMLK-Net effectively addresses SISR challenges while maintaining low computational complexity. Extensive experimental results on benchmark datasets demonstrate that OMLK-Net achieves a better trade-off against state-of-the-art methods in terms of performance and model complexity. Codes will be available soon.</div></div>\",\"PeriodicalId\":49523,\"journal\":{\"name\":\"Signal Processing\",\"volume\":\"237 \",\"pages\":\"Article 110078\"},\"PeriodicalIF\":3.4000,\"publicationDate\":\"2025-05-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Signal Processing\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0165168425001926\",\"RegionNum\":2,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Signal Processing","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0165168425001926","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
OMLK-Net: An Online Multi-scale Large Separable Kernel Distillation Network for efficient image super-resolution
Single-image super-resolution (SISR) has seen remarkable progress in recent years, driven by the powerful learning capabilities of large-scale neural networks such as deep CNNs and Transformers. However, these advances come at the expense of substantial computational cost. Striking a delicate balance between effectiveness and efficiency remains a key challenge in neural network design. This paper proposes OMLK-Net, a novel lightweight architecture for SISR that offers the dual advantages of computational efficiency and high effectiveness. OMLK-Net adopts a divide-and-conquer strategy to separately optimize local and nonlocal feature learning, enabling a lightweight architecture without compromising the effectiveness of feature representation. Specifically, OMLK-Net comprises two key modules: an Online Multiscale Distillation Block (OMDB) and Large Separable Shifting Kernel Attention (L2SKA) blocks. The OMDB explores multiscale local contextual information with a customized lightweight network block, while L2SKA harnesses nonlocal features using computationally efficient large separable shifting kernels. By virtue of its carefully designed local and nonlocal feature extraction operators, OMLK-Net effectively addresses SISR challenges while maintaining low computational complexity. Extensive experiments on benchmark datasets demonstrate that OMLK-Net achieves a better performance-complexity trade-off than state-of-the-art methods. Code will be made available soon.
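Since the authors' implementation has not yet been released, the following is only a minimal, hypothetical PyTorch sketch of what a large-separable-kernel attention block of the kind described in the abstract might look like, assuming the common decomposition of a large depthwise kernel into 1-D separable depthwise convolutions plus a dilated pair. All names and hyperparameters (LargeSeparableKernelAttention, channels, kernel_size, dilation) are illustrative assumptions, not the OMLK-Net code; the shifting mechanism and the OMDB distillation branch are not modeled here.

```python
# Hypothetical sketch: large separable kernel attention (NOT the authors' OMLK-Net code).
# A k x k depthwise kernel is approximated by 1 x k and k x 1 depthwise convolutions,
# plus a dilated pair for long-range context, followed by a 1x1 pointwise conv.
import torch
import torch.nn as nn


class LargeSeparableKernelAttention(nn.Module):
    """Gate input features with a spatial attention map built from
    separable (1-D) large-kernel depthwise convolutions."""

    def __init__(self, channels: int, kernel_size: int = 5, dilation: int = 3):
        super().__init__()
        pad = kernel_size // 2
        # local context: 1 x k then k x 1 depthwise convolutions
        self.dw_h = nn.Conv2d(channels, channels, (1, kernel_size),
                              padding=(0, pad), groups=channels)
        self.dw_v = nn.Conv2d(channels, channels, (kernel_size, 1),
                              padding=(pad, 0), groups=channels)
        # long-range context: dilated separable depthwise convolutions
        pad_d = dilation * (kernel_size // 2)
        self.dw_h_dil = nn.Conv2d(channels, channels, (1, kernel_size),
                                  padding=(0, pad_d), dilation=(1, dilation),
                                  groups=channels)
        self.dw_v_dil = nn.Conv2d(channels, channels, (kernel_size, 1),
                                  padding=(pad_d, 0), dilation=(dilation, 1),
                                  groups=channels)
        self.pw = nn.Conv2d(channels, channels, 1)  # pointwise channel mixing

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        attn = self.dw_v(self.dw_h(x))
        attn = self.dw_v_dil(self.dw_h_dil(attn))
        attn = self.pw(attn)
        return x * attn  # attention-weighted features


if __name__ == "__main__":
    feats = torch.randn(1, 48, 64, 64)        # B, C, H, W
    block = LargeSeparableKernelAttention(48)
    print(block(feats).shape)                 # torch.Size([1, 48, 64, 64])
```

The 1-D factorization is what makes such a block lightweight: a k x k depthwise kernel costs on the order of k^2 multiply-adds per pixel per channel, whereas the two 1-D kernels cost roughly 2k, which is the kind of saving the abstract refers to with "computationally efficient large separable shifting kernels".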
About the journal:
Signal Processing incorporates all aspects of the theory and practice of signal processing. It features original research work, tutorial and review articles, and accounts of practical developments. It is intended for a rapid dissemination of knowledge and experience to engineers and scientists working in the research, development or practical application of signal processing.
Subject areas covered by the journal include: Signal Theory; Stochastic Processes; Detection and Estimation; Spectral Analysis; Filtering; Signal Processing Systems; Software Developments; Image Processing; Pattern Recognition; Optical Signal Processing; Digital Signal Processing; Multi-dimensional Signal Processing; Communication Signal Processing; Biomedical Signal Processing; Geophysical and Astrophysical Signal Processing; Earth Resources Signal Processing; Acoustic and Vibration Signal Processing; Data Processing; Remote Sensing; Signal Processing Technology; Radar Signal Processing; Sonar Signal Processing; Industrial Applications; New Applications.