Title: Improvement of Pruning Method for Convolution Neural Network Compression
Authors: Chongyang Liu, Qinrang Liu
DOI: 10.1145/3234804.3234824 (https://doi.org/10.1145/3234804.3234824)
Venue: International Conference on Deep Learning Technologies
Publication Date: 2018-06-27
Citations: 7
Abstract
The large number of parameters in a convolutional neural network (CNN) makes it a computationally and storage-intensive model. Although CNNs perform well on a wide range of recognition and classification tasks, their size makes them difficult to deploy on embedded devices. To address this problem, an improved scheme for the pruning step of compression methods is proposed. First, the distribution of network connection weights is analyzed to determine an initial pruning threshold; then, the pruning method deletes connections whose weights are below the threshold, allowing the network to quickly reach its pruning limit while maintaining accuracy. A verification experiment was performed on the LeNet-5 network trained on the MNIST data set: LeNet-5 was compressed 10.56× without loss of accuracy.
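The pruning procedure the abstract describes — derive a threshold from the distribution of connection weights, then delete connections whose magnitudes fall below it — can be sketched as follows. This is a minimal illustration using NumPy, not the paper's exact method; the percentile-based threshold choice and all names here are assumptions for demonstration purposes.

```python
import numpy as np

def magnitude_prune(weights, target_sparsity=0.9):
    """Sketch of magnitude-based pruning.

    The threshold is chosen from the empirical distribution of |w|
    so that roughly `target_sparsity` of the connections are removed
    (the paper instead analyzes the weight distribution to pick an
    initial threshold; this percentile rule is a stand-in).
    Returns the pruned weights and the binary keep-mask.
    """
    magnitudes = np.abs(weights).ravel()
    threshold = np.percentile(magnitudes, target_sparsity * 100)
    mask = np.abs(weights) >= threshold  # keep only large-magnitude weights
    return weights * mask, mask

# Usage: prune one hypothetical fully connected layer to ~90% sparsity.
rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64))
pruned, mask = magnitude_prune(w, target_sparsity=0.9)
print(f"fraction of weights kept: {mask.mean():.3f}")
```

In practice, pruning is iterated with retraining: after each pruning pass the remaining weights are fine-tuned, which is what lets the network stay near its original accuracy as sparsity increases.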