CF-FFP: A Coarse-to-Fine Fast Filter Pruning framework
Ming Ma, Wenhui Li, Tongzhou Zhang, Ziming Wang, Ying Wang
Pattern Recognition Letters, vol. 197 (2025), pp. 139-145. Published 5 August 2025. DOI: 10.1016/j.patrec.2025.07.024
The conventional pruning paradigm typically determines the pruned network structure while identifying and removing filters, which requires iterative pruning and fine-tuning and incurs substantial time and computational costs. Moreover, existing methods overemphasize the importance of individual filters while neglecting the optimization of the overall network structure, resulting in performance degradation. In this letter, a new Coarse-to-Fine Fast Filter Pruning (CF-FFP) framework is proposed, which decomposes the conventional pruning paradigm into two offline learning stages to achieve fast and efficient model compression. Specifically, the pruned network structure is first coarsely determined from the importance of the weights, with an adaptive balancing strategy proposed to address large differences in pruning rates across layers. A dual redundancy screening criterion is then proposed to finely identify and prune redundant filters based on their similarity and contribution, thereby initializing the pruned network structure. Because CF-FFP's two-stage offline pruning proceeds from coarse to fine, the pruning inference time is significantly reduced. Extensive experiments show that our method outperforms state-of-the-art methods on the CIFAR-10 and ImageNet datasets. For instance, CF-FFP prunes 51.2% of the FLOPs of ResNet50 on ImageNet with only a 0.67% drop in Top-1 accuracy.
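The abstract describes the two stages only at a high level. As a rough illustration of the kind of procedure it outlines, the sketch below implements (a) a coarse assignment of per-layer keep ratios from mean weight magnitude, clamped so pruning rates cannot diverge too sharply across layers (a stand-in for the adaptive balancing strategy), and (b) a fine screen that drops filters that are both near-duplicates of an already-kept filter and low in L1-norm contribution (a stand-in for the dual redundancy screening criterion). The function names, thresholds, and exact scoring rules here are assumptions for illustration, not the paper's actual formulation.

```python
import torch
import torch.nn.functional as F

def coarse_layer_keep_ratios(weights, global_keep=0.5, min_keep=0.3, max_keep=0.9):
    """Coarse stage (sketch): give each layer a keep ratio proportional to its
    mean |weight|, then clamp so per-layer pruning rates cannot diverge too
    far -- a hypothetical stand-in for the adaptive balancing strategy."""
    scores = torch.tensor([w.abs().mean().item() for w in weights])
    ratios = global_keep * scores / scores.mean()   # importance-weighted budget
    return ratios.clamp(min_keep, max_keep)

def fine_filter_screen(weight, keep_ratio, sim_threshold=0.9):
    """Fine stage (sketch): keep high-contribution filters greedily, skipping
    any filter that is nearly parallel (high cosine similarity) to one
    already kept. `weight` has shape (out_ch, in_ch, kH, kW)."""
    flat = weight.flatten(1)                        # (out_ch, in_ch*kH*kW)
    contribution = flat.abs().sum(dim=1)            # L1-norm contribution score
    unit = F.normalize(flat, dim=1)
    sim = unit @ unit.T                             # pairwise cosine similarity
    n_keep = max(1, int(float(keep_ratio) * flat.size(0) + 0.5))
    order = torch.argsort(contribution, descending=True).tolist()
    kept = []
    for idx in order:
        # Redundant by similarity: too close to a filter we already keep.
        if kept and sim[idx, kept].max() > sim_threshold:
            continue
        kept.append(idx)
        if len(kept) == n_keep:
            break
    # Back-fill by contribution if the similarity screen was too aggressive.
    for idx in order:
        if len(kept) == n_keep:
            break
        if idx not in kept:
            kept.append(idx)
    return sorted(kept)

# Hypothetical usage: `convs` is a list of nn.Conv2d modules.
# ratios = coarse_layer_keep_ratios([m.weight.detach() for m in convs])
# kept_idx = fine_filter_screen(convs[0].weight.detach(), ratios[0])
```

A greedy pass ordered by contribution, with a similarity veto and a back-fill step, is only one plausible way to combine the two redundancy signals; the paper's criterion may threshold or weight them differently.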
About the journal:
Pattern Recognition Letters aims at rapid publication of concise articles of a broad interest in pattern recognition.
Subject areas include all the current fields of interest represented by the Technical Committees of the International Association of Pattern Recognition, and other developing themes involving learning and recognition.