An Efficient Neural Network Design for Image Super-Resolution with Knowledge Distillation
Tuan Nghia Nguyen, X. Nguyen, Kyujoong Lee, Hyuk-Jae Lee
2023 International Technical Conference on Circuits/Systems, Computers, and Communications (ITC-CSCC), 2023-06-25
DOI: 10.1109/ITC-CSCC58803.2023.10212926
Abstract
This paper proposes a new neural network design for efficient image super-resolution inference. Employing a pair of complex and simple sub-networks, the proposed design samples feature maps to dynamically choose an execution path, leading to a considerable reduction in computation. However, uniformly random sampling generally causes a large accuracy drop because the feature maps produced by the two sub-networks differ substantially. To address this problem, we propose two simple yet effective techniques that consider both the training and inference stages. First, Knowledge Distillation is applied during training to minimize the difference between the feature maps. Second, a gradient image obtained with the well-known Sobel filter guides the sampling by assigning points in edge and texture regions to the complex sub-network. Experimental results show that the proposed design reduces computation by 50% when only 20% of the feature maps are computed by the complex sub-network. More importantly, the proposed sampling method improves restoration accuracy by 0.3 dB on average compared to uniformly random sampling.
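To make the gradient-guided routing and the feature-map distillation concrete, the following is a minimal PyTorch sketch. The branch architectures, the 20% routing ratio, the per-image quantile threshold, and the L1 distillation loss are illustrative assumptions, not the implementation published by the authors.

```python
# Minimal sketch of Sobel-guided sampling plus feature-map distillation,
# assuming a PyTorch setup. Sub-network designs and loss choice are
# hypothetical; only the overall routing idea follows the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F


def sobel_routing_mask(lr_img: torch.Tensor, complex_ratio: float = 0.2) -> torch.Tensor:
    """Route the highest-gradient points (edges/texture) to the complex sub-network.

    lr_img: (B, 1, H, W) grayscale low-resolution input in [0, 1].
    Returns a boolean mask (B, 1, H, W); True = complex sub-network.
    """
    kx = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]],
                      device=lr_img.device).view(1, 1, 3, 3)
    ky = kx.transpose(2, 3)  # Sobel y kernel is the transpose of the x kernel
    gx = F.conv2d(lr_img, kx, padding=1)
    gy = F.conv2d(lr_img, ky, padding=1)
    grad = torch.sqrt(gx ** 2 + gy ** 2)

    # Per-image threshold so that roughly `complex_ratio` of the points exceed it.
    thresh = torch.quantile(grad.flatten(1), 1.0 - complex_ratio, dim=1)
    return grad > thresh.view(-1, 1, 1, 1)


class DualPathBlock(nn.Module):
    """Complex and simple branches producing feature maps of the same shape."""

    def __init__(self, channels: int = 32):
        super().__init__()
        self.complex_branch = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1))
        self.simple_branch = nn.Conv2d(channels, channels, 1)  # cheap path

    def forward(self, feat: torch.Tensor, mask: torch.Tensor):
        fc = self.complex_branch(feat)
        fs = self.simple_branch(feat)
        # Per-point selection: complex output on edge/texture points, simple elsewhere.
        out = torch.where(mask, fc, fs)
        # Distillation term pulls the simple features toward the complex ones,
        # shrinking the feature-map gap that hurts accuracy under random sampling.
        kd_loss = F.l1_loss(fs, fc.detach())
        return out, kd_loss
```

At inference, the mask would be computed once from the low-resolution input and reused across blocks; during training, the `kd_loss` terms would be summed and weighted against the usual reconstruction loss. Both details are assumptions made for the sake of this sketch.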