{"title":"一种用于神经网络模型自动压缩和加速的有效信道修剪算法","authors":"Wei Xie, Xiaobo Feng","doi":"10.1117/12.2660854","DOIUrl":null,"url":null,"abstract":"The Convolutional Neural Network (CNN) enables deep neural networks to be deployed to resource-constrained mobile devices via model compression and acceleration. At present, channel pruning methods select channels based on channel importance or designed regularization, which are suboptimal pruning and cannot be automated. In this paper, a channel pruning algorithm is proposed to get the optimal pruned structure via automatic searching. By setting the super-parameter constraint set, the combination number of pruning structures is reduced. The number of channels for each layer of the CNN is determined using the sparrow search algorithm, and the optimal pruned structure of the model is found. The results of extensive experiments show that the proposed method can improve the model's parameter compression ratio and reduce the number of FLOPS within the acceptable range of model accuracy loss.","PeriodicalId":220312,"journal":{"name":"International Symposium on Computer Engineering and Intelligent Communications","volume":"75 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-02-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"An efficient channel pruning algorithm for automatic compression and acceleration of neural network models\",\"authors\":\"Wei Xie, Xiaobo Feng\",\"doi\":\"10.1117/12.2660854\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The Convolutional Neural Network (CNN) enables deep neural networks to be deployed to resource-constrained mobile devices via model compression and acceleration. At present, channel pruning methods select channels based on channel importance or designed regularization, which are suboptimal pruning and cannot be automated. In this paper, a channel pruning algorithm is proposed to get the optimal pruned structure via automatic searching. By setting the super-parameter constraint set, the combination number of pruning structures is reduced. The number of channels for each layer of the CNN is determined using the sparrow search algorithm, and the optimal pruned structure of the model is found. 
The results of extensive experiments show that the proposed method can improve the model's parameter compression ratio and reduce the number of FLOPS within the acceptable range of model accuracy loss.\",\"PeriodicalId\":220312,\"journal\":{\"name\":\"International Symposium on Computer Engineering and Intelligent Communications\",\"volume\":\"75 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-02-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Symposium on Computer Engineering and Intelligent Communications\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1117/12.2660854\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Symposium on Computer Engineering and Intelligent Communications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1117/12.2660854","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Model compression and acceleration make it possible to deploy deep convolutional neural networks (CNNs) on resource-constrained mobile devices. Existing channel pruning methods select channels according to hand-crafted importance criteria or designed regularization terms, which yields suboptimal pruned structures and cannot be automated. This paper proposes a channel pruning algorithm that obtains the optimal pruned structure through automatic search. A hyper-parameter constraint set is introduced to reduce the number of candidate pruned structures, and the sparrow search algorithm determines the number of channels retained in each layer of the CNN, yielding the optimal pruned structure of the model. Extensive experiments show that the proposed method improves the model's parameter compression ratio and reduces FLOPs while keeping the loss in model accuracy within an acceptable range.
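The abstract only sketches the search procedure, so the following toy Python sketch illustrates one way such a search could look: a simplified sparrow search over per-layer channel counts inside a constraint set. It is not the authors' implementation; the layer sizes, the 25%-100% bounds, the simplified SSA update rules, and the fitness function (a crude FLOPs/accuracy proxy) are all illustrative assumptions.

```python
# Minimal illustrative sketch (NOT the paper's implementation) of searching
# per-layer channel counts with a simplified Sparrow Search Algorithm (SSA).
# base_channels, the constraint bounds, and the fitness function are
# hypothetical stand-ins chosen for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical baseline: channels per convolutional layer of the unpruned CNN.
base_channels = np.array([64, 128, 256, 512])

# Hyper-parameter constraint set: each layer keeps 25%-100% of its channels,
# which shrinks the space of candidate pruned structures.
low = (0.25 * base_channels).astype(int)
high = base_channels.copy()

def fitness(channels):
    """Stand-in objective (lower is better): trade FLOPs reduction against a
    crude accuracy proxy. A real run would instead evaluate the pruned and
    briefly fine-tuned model on a validation set."""
    flops_ratio = np.sum(channels ** 2) / np.sum(base_channels ** 2)
    acc_proxy = np.mean(channels / base_channels)  # crude stand-in for accuracy
    return flops_ratio + 2.0 * (1.0 - acc_proxy)

def clip(x):
    # Keep a candidate structure inside the constraint set (integer channel counts).
    return np.clip(np.round(x), low, high)

# Population of candidate pruned structures ("sparrows").
pop_size, dim, iters = 20, len(base_channels), 50
n_producers = max(1, int(0.2 * pop_size))
X = rng.integers(low, high + 1, size=(pop_size, dim)).astype(float)

best_structure, best_fit = None, np.inf
for _ in range(iters):
    fit = np.array([fitness(clip(x)) for x in X])
    order = np.argsort(fit)
    X = X[order]
    if fit[order[0]] < best_fit:
        best_fit = fit[order[0]]
        best_structure = clip(X[0]).astype(int)
    leader = X[0].copy()

    # Producers explore around their current positions (simplified SSA update).
    for i in range(n_producers):
        X[i] = clip(X[i] * np.exp(-(i + 1) / (rng.random() * iters + 1e-9)))

    # Scroungers move toward the best producer.
    for i in range(n_producers, pop_size):
        X[i] = clip(leader + np.abs(X[i] - leader) * rng.standard_normal(dim))

print("channels kept per layer:", best_structure)
print("objective value:", round(best_fit, 3))
```

In the paper's setting, each candidate would be scored by the measured accuracy and FLOPs of the corresponding pruned network rather than by the toy proxy used above.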