{"title":"量化友好的超分辨率:揭示激活规范化的好处","authors":"Dongjea Kang, Myungjun Son, Hongjae Lee, Seung-Won Jung","doi":"10.1016/j.jvcir.2025.104539","DOIUrl":null,"url":null,"abstract":"<div><div>Super-resolution (SR) has achieved remarkable progress with deep neural networks, but the substantial memory and computational demands of SR networks limit their use in resource-constrained environments. To address these challenges, various quantization methods have been developed, focusing on managing the diverse and asymmetric activation distributions in SR networks. This focus is crucial, as most SR networks exclude batch normalization (BN) due to concerns about image quality degradation from limited activation range flexibility. However, this decision is made in the context of full-precision SR networks, leaving BN’s impact on quantized SR networks uncertain. This paper revisits BN’s role in quantized SR networks, presenting a detailed performance analysis of multiple quantized SR models with and without BN. Experimental results show that including BN in quantized SR networks enhances performance and simplifies network design through minor yet significant structural adjustments. These findings challenge conventional assumptions and offer new insights for SR network optimization.</div></div>","PeriodicalId":54755,"journal":{"name":"Journal of Visual Communication and Image Representation","volume":"111 ","pages":"Article 104539"},"PeriodicalIF":3.1000,"publicationDate":"2025-07-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Quantization-friendly super-resolution: Unveiling the benefits of activation normalization\",\"authors\":\"Dongjea Kang, Myungjun Son, Hongjae Lee, Seung-Won Jung\",\"doi\":\"10.1016/j.jvcir.2025.104539\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Super-resolution (SR) has achieved remarkable progress with deep neural networks, but the substantial memory and computational demands of SR networks limit their use in resource-constrained environments. To address these challenges, various quantization methods have been developed, focusing on managing the diverse and asymmetric activation distributions in SR networks. This focus is crucial, as most SR networks exclude batch normalization (BN) due to concerns about image quality degradation from limited activation range flexibility. However, this decision is made in the context of full-precision SR networks, leaving BN’s impact on quantized SR networks uncertain. This paper revisits BN’s role in quantized SR networks, presenting a detailed performance analysis of multiple quantized SR models with and without BN. Experimental results show that including BN in quantized SR networks enhances performance and simplifies network design through minor yet significant structural adjustments. 
These findings challenge conventional assumptions and offer new insights for SR network optimization.</div></div>\",\"PeriodicalId\":54755,\"journal\":{\"name\":\"Journal of Visual Communication and Image Representation\",\"volume\":\"111 \",\"pages\":\"Article 104539\"},\"PeriodicalIF\":3.1000,\"publicationDate\":\"2025-07-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Visual Communication and Image Representation\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S1047320325001531\",\"RegionNum\":4,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Visual Communication and Image Representation","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1047320325001531","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Super-resolution (SR) has achieved remarkable progress with deep neural networks, but the substantial memory and computational demands of SR networks limit their use in resource-constrained environments. To address these challenges, various quantization methods have been developed, focusing on managing the diverse and asymmetric activation distributions in SR networks. This focus is crucial, as most SR networks exclude batch normalization (BN) due to concerns about image quality degradation from limited activation range flexibility. However, this decision is made in the context of full-precision SR networks, leaving BN’s impact on quantized SR networks uncertain. This paper revisits BN’s role in quantized SR networks, presenting a detailed performance analysis of multiple quantized SR models with and without BN. Experimental results show that including BN in quantized SR networks enhances performance and simplifies network design through minor yet significant structural adjustments. These findings challenge conventional assumptions and offer new insights for SR network optimization.
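As a rough illustration of the idea described in the abstract, the following PyTorch sketch contrasts an EDSR-style residual block without BN against a variant that re-inserts BN so that the activations entering the quantizer are normalized. This is not the authors' implementation; the 4-bit symmetric fake-quantizer, channel count, and block layout are illustrative assumptions only.

```python
import torch
import torch.nn as nn


def fake_quantize(x, bits=4):
    # Symmetric uniform fake-quantization of activations to `bits` bits.
    qmax = 2 ** (bits - 1) - 1
    scale = x.abs().max().clamp(min=1e-8) / qmax
    return torch.round(x / scale).clamp(-qmax, qmax) * scale


class ResBlockNoBN(nn.Module):
    # EDSR-style residual block: BN removed, as in most full-precision SR networks.
    def __init__(self, ch=64):
        super().__init__()
        self.conv1 = nn.Conv2d(ch, ch, 3, padding=1)
        self.conv2 = nn.Conv2d(ch, ch, 3, padding=1)
        self.act = nn.ReLU()

    def forward(self, x):
        out = self.act(fake_quantize(self.conv1(x)))
        out = self.conv2(fake_quantize(out))
        return x + out


class ResBlockWithBN(nn.Module):
    # Same block with BN restored, so the activations fed to the quantizer
    # have normalized, less skewed statistics.
    def __init__(self, ch=64):
        super().__init__()
        self.conv1 = nn.Conv2d(ch, ch, 3, padding=1)
        self.bn1 = nn.BatchNorm2d(ch)
        self.conv2 = nn.Conv2d(ch, ch, 3, padding=1)
        self.bn2 = nn.BatchNorm2d(ch)
        self.act = nn.ReLU()

    def forward(self, x):
        out = self.act(fake_quantize(self.bn1(self.conv1(x))))
        out = self.bn2(self.conv2(fake_quantize(out)))
        return x + out


if __name__ == "__main__":
    x = torch.randn(1, 64, 32, 32)
    print(ResBlockNoBN()(x).shape, ResBlockWithBN()(x).shape)
```

The sketch only shows the structural difference under study: with BN, the quantizer sees activations whose statistics are constrained per channel, which is the property the paper analyzes. The actual architectures, quantizers, and bit widths used in the paper may differ.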
About the journal:
The Journal of Visual Communication and Image Representation publishes papers on state-of-the-art visual communication and image representation, with emphasis on novel technologies and theoretical work in this multidisciplinary area of pure and applied research. The field of visual communication and image representation is considered in its broadest sense and covers both digital and analog aspects as well as processing and communication in biological visual systems.