Wenshu Chen, L. Peng, Yujie Huang, Ming-e Jing, Xiaoyang Zeng
{"title":"基于U-Net的图像去噪知识蒸馏","authors":"Wenshu Chen, L. Peng, Yujie Huang, Ming-e Jing, Xiaoyang Zeng","doi":"10.1109/ASICON52560.2021.9620364","DOIUrl":null,"url":null,"abstract":"In recent years, algorithms based on convolutional neural networks (CNNs) have shown great advantages in image denoising. However, the existing state-of-the-art (SOTA) algorithms are too computationally complex to be deployed on embedded devices, like mobile devices. Knowledge distillation is an effective model compression method. However, researches on knowledge distillation are mainly on high-level visual tasks, like image classification, and few on low-level visual tasks, such as image denoising. To solve the above problems, we propose a novel knowledge distillation method for the U-Net based on image denoising algorithms. The experimental results show that the performance of the compressed model is comparable with the original model in the case of quadruple compression.","PeriodicalId":233584,"journal":{"name":"2021 IEEE 14th International Conference on ASIC (ASICON)","volume":"77 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Knowledge Distillation for U-Net Based Image Denoising\",\"authors\":\"Wenshu Chen, L. Peng, Yujie Huang, Ming-e Jing, Xiaoyang Zeng\",\"doi\":\"10.1109/ASICON52560.2021.9620364\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In recent years, algorithms based on convolutional neural networks (CNNs) have shown great advantages in image denoising. However, the existing state-of-the-art (SOTA) algorithms are too computationally complex to be deployed on embedded devices, like mobile devices. Knowledge distillation is an effective model compression method. However, researches on knowledge distillation are mainly on high-level visual tasks, like image classification, and few on low-level visual tasks, such as image denoising. To solve the above problems, we propose a novel knowledge distillation method for the U-Net based on image denoising algorithms. The experimental results show that the performance of the compressed model is comparable with the original model in the case of quadruple compression.\",\"PeriodicalId\":233584,\"journal\":{\"name\":\"2021 IEEE 14th International Conference on ASIC (ASICON)\",\"volume\":\"77 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-10-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 IEEE 14th International Conference on ASIC (ASICON)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ASICON52560.2021.9620364\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE 14th International Conference on ASIC (ASICON)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ASICON52560.2021.9620364","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Knowledge Distillation for U-Net Based Image Denoising
In recent years, algorithms based on convolutional neural networks (CNNs) have shown great advantages in image denoising. However, existing state-of-the-art (SOTA) algorithms are too computationally complex to deploy on embedded platforms such as mobile devices. Knowledge distillation is an effective model compression method, but research on it has focused mainly on high-level vision tasks such as image classification, with little attention to low-level vision tasks such as image denoising. To address these problems, we propose a novel knowledge distillation method for U-Net-based image denoising algorithms. Experimental results show that, under quadruple (4x) compression, the performance of the compressed model is comparable to that of the original model.
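The abstract does not spell out the loss formulation, so the following is only a minimal illustrative sketch of output-level knowledge distillation for a denoising network, assuming PyTorch. The `TinyUNet` class, the `distillation_step` function, and the `alpha` weighting are hypothetical placeholders, not the paper's actual architecture or training recipe; halving the channel width of the student is used here as a rough stand-in for the 4x parameter compression reported in the paper.

```python
# Illustrative sketch (not the paper's exact method): a student U-Net learns
# from both the clean ground truth and a frozen teacher's denoised output.
import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )


class TinyUNet(nn.Module):
    """A minimal U-Net stand-in; `width` controls model capacity."""

    def __init__(self, width=32):
        super().__init__()
        self.enc1 = conv_block(3, width)
        self.enc2 = conv_block(width, width * 2)
        self.dec1 = conv_block(width * 2 + width, width)
        self.out = nn.Conv2d(width, 3, 1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(F.max_pool2d(e1, 2))
        d1 = self.dec1(torch.cat([F.interpolate(e2, scale_factor=2), e1], dim=1))
        return self.out(d1)


def distillation_step(student, teacher, noisy, clean, optimizer, alpha=0.5):
    """One training step: supervised loss plus a loss against the teacher's output."""
    teacher.eval()
    with torch.no_grad():
        teacher_out = teacher(noisy)                  # frozen teacher prediction
    student_out = student(noisy)
    loss_gt = F.l1_loss(student_out, clean)           # match the clean ground truth
    loss_kd = F.l1_loss(student_out, teacher_out)     # mimic the teacher's output
    loss = alpha * loss_gt + (1 - alpha) * loss_kd
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    teacher = TinyUNet(width=32)   # "original" model
    student = TinyUNet(width=16)   # halved width: roughly 4x fewer parameters
    opt = torch.optim.Adam(student.parameters(), lr=1e-4)
    noisy = torch.rand(2, 3, 64, 64)
    clean = torch.rand(2, 3, 64, 64)
    print(distillation_step(student, teacher, noisy, clean, opt))
```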