FreePrune: An Automatic Pruning Framework Across Various Granularities Based on Training-Free Evaluation

IF 2.7 · CAS Tier 3, Computer Science · JCR Q2, COMPUTER SCIENCE, HARDWARE & ARCHITECTURE
Miao Tang;Ning Liu;Tao Yang;Haining Fang;Qiu Lin;Yujuan Tan;Xianzhang Chen;Duo Liu;Kan Zhong;Ao Ren
IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, vol. 43, no. 11, pp. 4033-4044, published 2024-11-06.
DOI: 10.1109/TCAD.2024.3443694 · https://ieeexplore.ieee.org/document/10745854/
Citations: 0

Abstract

Network pruning is an effective technique that reduces the computational cost of networks while maintaining accuracy. However, pruning requires expert knowledge and hyperparameter tuning, such as determining the pruning rate for each layer. Automatic pruning methods address this challenge by proposing an effective training-free metric that quickly evaluates a pruned network without fine-tuning. However, most existing automatic pruning methods investigate only a single pruning granularity, and it remains unclear whether such metrics benefit automatic pruning at other granularities. Neural architecture search also studies training-free metrics to accelerate network generation; nevertheless, whether they apply to pruning needs further investigation. In this study, we first systematically analyze various advanced training-free metrics across pruning granularities, and then investigate the correlation between the training-free metric score and the model accuracy after fine-tuning. Based on this analysis, we propose the FreePrune score, a more general metric compatible with all pruning granularities. To generate high-quality pruned networks and unleash the power of the FreePrune score, we further propose FreePrune, an automatic framework that rapidly generates and evaluates candidate networks, leading to a final pruned network with both high accuracy and a high pruning rate. Experiments show that our method achieves high correlation across various pruning granularities and comprehensively improves accuracy.
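The search loop the abstract describes — generate candidate per-layer pruning rates, score each candidate with a training-free metric, and keep the best one that meets the overall pruning budget — can be sketched as follows. The paper's FreePrune score is not reproduced here, so a simple magnitude-based proxy (the total magnitude of surviving weights) stands in for it; the function and parameter names are illustrative, not the authors' API.

```python
import random

def trainingfree_score(layers, rates):
    """Proxy training-free metric: total magnitude of weights that
    survive pruning. (A stand-in for the paper's FreePrune score.)"""
    score = 0.0
    for weights, rate in zip(layers, rates):
        kept = sorted((abs(w) for w in weights), reverse=True)
        n_keep = int(len(kept) * (1.0 - rate))
        score += sum(kept[:n_keep])
    return score

def search(layers, target_rate, n_candidates=50, seed=0):
    """Randomly sample per-layer pruning rates, discard candidates that
    miss the overall pruning budget, and return (score, rates) for the
    candidate with the best training-free score."""
    rng = random.Random(seed)
    total = sum(len(layer) for layer in layers)
    best = None
    for _ in range(n_candidates):
        rates = [rng.uniform(0.0, 0.9) for _ in layers]
        pruned = sum(int(len(l) * r) for l, r in zip(layers, rates))
        if pruned / total < target_rate:  # enforce overall pruning budget
            continue
        s = trainingfree_score(layers, rates)
        if best is None or s > best[0]:
            best = (s, rates)
    return best
```

Because no fine-tuning is run inside the loop, each candidate is evaluated in milliseconds; the fine-tuning cost is paid only once, on the final selected network.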
Source journal
CiteScore: 5.60
Self-citation rate: 13.80%
Articles per year: 500
Review time: 7 months
期刊介绍: The purpose of this Transactions is to publish papers of interest to individuals in the area of computer-aided design of integrated circuits and systems composed of analog, digital, mixed-signal, optical, or microwave components. The aids include methods, models, algorithms, and man-machine interfaces for system-level, physical and logical design including: planning, synthesis, partitioning, modeling, simulation, layout, verification, testing, hardware-software co-design and documentation of integrated circuit and system designs of all complexities. Design tools and techniques for evaluating and designing integrated circuits and systems for metrics such as performance, power, reliability, testability, and security are a focus.