{"title":"Pruning Networks Using Filters Similarity Stability","authors":"Haicheng Qu, Xuecong Zhang","doi":"10.1145/3584871.3584908","DOIUrl":null,"url":null,"abstract":"Current filter pruning methods rely too much on pretrained weights and have many super parameters, resulting in obvious performance degradation and too long parameters adjustment time. In our research, we found that the cosine similarity distribution between filters can achieve stable in a few epochs during training. Therefore, a cluster pruning method named ECP(Early Cluster Pruning) based on the cosine similarity between filters in the early stage of training is proposed to compress the deep neural networks. First, in the early stage of training, the filters were clustered with a gradually increasing threshold, and then the reserved filters were selected randomly in each cluster. The pruned models could be obtained with only a few super parameters and a single training progress, leading to an obvious reduction in algorithmic complexity and large savings in training time. The experimental results on CIFAR-10 and CIFAR-100 datasets show that ECP method outperforms recent pruning methods in terms of model accuracy maintenance, training time, and model compression rate.","PeriodicalId":173315,"journal":{"name":"Proceedings of the 2023 6th International Conference on Software Engineering and Information Management","volume":"37 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-01-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2023 6th International Conference on Software Engineering and Information Management","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3584871.3584908","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Current filter pruning methods rely too heavily on pretrained weights and involve many hyperparameters, leading to noticeable performance degradation and long parameter-tuning times. In our research, we found that the cosine-similarity distribution between filters stabilizes within a few epochs of training. We therefore propose ECP (Early Cluster Pruning), a cluster-based pruning method that compresses deep neural networks using the cosine similarity between filters in the early stage of training. First, early in training, the filters are clustered with a gradually increasing threshold; then, within each cluster, the filters to retain are selected at random. The pruned models can be obtained with only a few hyperparameters and a single training pass, yielding a clear reduction in algorithmic complexity and large savings in training time. Experimental results on the CIFAR-10 and CIFAR-100 datasets show that ECP outperforms recent pruning methods in accuracy retention, training time, and model compression rate.
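To make the two steps in the abstract concrete, here is a minimal PyTorch sketch of similarity-based clustering followed by random per-cluster selection. The function names, the greedy clustering strategy, and the illustrative threshold value are assumptions for exposition, not the paper's actual implementation (which the abstract does not detail).

```python
import torch

def cluster_filters(weight: torch.Tensor, threshold: float) -> list[list[int]]:
    """Greedily group filters whose pairwise cosine similarity >= threshold.

    weight: conv weight of shape (out_channels, in_channels, kH, kW).
    Returns a list of clusters, each a list of filter indices.
    """
    flat = weight.flatten(start_dim=1)                 # one row per filter
    flat = torch.nn.functional.normalize(flat, dim=1)  # unit-norm rows
    sim = flat @ flat.t()                              # pairwise cosine similarity
    clusters, assigned = [], set()
    for i in range(sim.size(0)):
        if i in assigned:
            continue
        # Seed a new cluster at filter i; sim[i, i] == 1, so i always joins.
        members = [j for j in range(sim.size(0))
                   if j not in assigned and sim[i, j] >= threshold]
        clusters.append(members)
        assigned.update(members)
    return clusters

def select_kept_filters(weight: torch.Tensor, threshold: float) -> list[int]:
    """Randomly keep one representative filter per cluster, as the abstract describes."""
    kept = []
    for cluster in cluster_filters(weight, threshold):
        kept.append(cluster[torch.randint(len(cluster), (1,)).item()])
    return sorted(kept)

# Hypothetical usage, early in training, with an illustrative threshold:
# conv = model.features[0]
# keep = select_kept_filters(conv.weight.detach(), threshold=0.6)
```

In the full method, the threshold would follow a schedule that gradually increases over the early training epochs (the schedule itself is a hyperparameter of ECP), and the indices in `keep` would determine which output channels of the layer survive pruning.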