A Clustering Pruning Method Based on Multidimensional Channel Information

IF 2.6 · Zone 4 (Computer Science) · Q3 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Sun Chuanmeng, Chen Jiaxin, Wu Zhibo, Li Yong, Ma Tiehua
{"title":"基于多维信道信息的聚类剪枝方法","authors":"Sun Chuanmeng, Chen Jiaxin, Wu Zhibo, Li Yong, Ma Tiehua","doi":"10.1007/s11063-024-11684-z","DOIUrl":null,"url":null,"abstract":"<p>Pruning convolutional neural networks offers a promising solution to mitigate the computational complexity challenges encountered during application deployment. However, prevalent pruning techniques primarily concentrate on model parameters or feature mapping analysis to devise static pruning strategies, often overlooking the underlying feature extraction capacity of convolutional kernels. To address this, the study first quantitatively expresses the feature extraction capability of convolutional channels from three aspects: global features, distribution metrics, and directional metrics. It explores the multi-dimensional information of the channels, calculates the overall expectation, variance, and cosine distance from the unit vector as the quantitative results of the channels. Subsequently, a clustering algorithm is employed to categorize the multidimensional information. This approach ensures that convolutional channels grouped within each cluster possess similar feature extraction capabilities. An enhanced differential evolutionary algorithm is utilized to optimize the number of clustering centers across all convolutional layers, ensuring optimal grouping. The final step involves achieving channel sparsification through the calculation of crowding distances for each sample within its designated cluster. This preserves a diverse subset of channels that are critical for maintaining model accuracy. Extensive empirical evaluations conducted on three benchmark image classification datasets demonstrate the efficacy of this method. For instance, on the ImageNet dataset, the ResNet-50 model experiences a substantial reduction in FLOPs by 58.43% while incurring a minimal decrease in TOP-1 accuracy of only 1.15%.</p>","PeriodicalId":51144,"journal":{"name":"Neural Processing Letters","volume":"76 1","pages":""},"PeriodicalIF":2.6000,"publicationDate":"2024-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A Clustering Pruning Method Based on Multidimensional Channel Information\",\"authors\":\"Sun Chuanmeng, Chen Jiaxin, Wu Zhibo, Li Yong, Ma Tiehua\",\"doi\":\"10.1007/s11063-024-11684-z\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Pruning convolutional neural networks offers a promising solution to mitigate the computational complexity challenges encountered during application deployment. However, prevalent pruning techniques primarily concentrate on model parameters or feature mapping analysis to devise static pruning strategies, often overlooking the underlying feature extraction capacity of convolutional kernels. To address this, the study first quantitatively expresses the feature extraction capability of convolutional channels from three aspects: global features, distribution metrics, and directional metrics. It explores the multi-dimensional information of the channels, calculates the overall expectation, variance, and cosine distance from the unit vector as the quantitative results of the channels. Subsequently, a clustering algorithm is employed to categorize the multidimensional information. This approach ensures that convolutional channels grouped within each cluster possess similar feature extraction capabilities. 
An enhanced differential evolutionary algorithm is utilized to optimize the number of clustering centers across all convolutional layers, ensuring optimal grouping. The final step involves achieving channel sparsification through the calculation of crowding distances for each sample within its designated cluster. This preserves a diverse subset of channels that are critical for maintaining model accuracy. Extensive empirical evaluations conducted on three benchmark image classification datasets demonstrate the efficacy of this method. For instance, on the ImageNet dataset, the ResNet-50 model experiences a substantial reduction in FLOPs by 58.43% while incurring a minimal decrease in TOP-1 accuracy of only 1.15%.</p>\",\"PeriodicalId\":51144,\"journal\":{\"name\":\"Neural Processing Letters\",\"volume\":\"76 1\",\"pages\":\"\"},\"PeriodicalIF\":2.6000,\"publicationDate\":\"2024-09-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neural Processing Letters\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1007/s11063-024-11684-z\",\"RegionNum\":4,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Processing Letters","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1007/s11063-024-11684-z","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract


Pruning convolutional neural networks offers a promising way to mitigate the computational complexity challenges encountered during application deployment. However, prevalent pruning techniques concentrate primarily on model parameters or feature-map analysis to devise static pruning strategies, often overlooking the underlying feature extraction capacity of convolutional kernels. To address this, the study first quantifies the feature extraction capability of convolutional channels from three aspects: global features, distribution metrics, and directional metrics. It explores the channels' multidimensional information, computing the overall expectation, the variance, and the cosine distance from the unit vector as quantitative descriptors of each channel. A clustering algorithm then groups these multidimensional descriptors, so that the convolutional channels within each cluster possess similar feature extraction capabilities. An enhanced differential evolution algorithm optimizes the number of cluster centers across all convolutional layers, ensuring optimal grouping. The final step achieves channel sparsification by computing the crowding distance of each sample within its cluster, which preserves a diverse subset of channels critical for maintaining model accuracy. Extensive empirical evaluations on three benchmark image classification datasets demonstrate the efficacy of the method. For instance, on the ImageNet dataset, the ResNet-50 model achieves a 58.43% reduction in FLOPs with a TOP-1 accuracy drop of only 1.15%.
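
As a concrete illustration of the pipeline the abstract describes, the minimal sketch below (not the authors' code) quantifies each output channel of a convolutional layer by its overall expectation, variance, and cosine distance from the unit vector, clusters the resulting three-dimensional descriptors, and ranks the channels inside each cluster by an NSGA-II-style crowding distance so that only a diverse subset is retained. The function names (`channel_descriptors`, `crowding_distance`, `select_channels`), the fixed cluster count `k`, and the per-cluster `keep_ratio` are illustrative assumptions; the paper instead searches for the per-layer number of cluster centers with an enhanced differential evolution algorithm, which is omitted here for brevity.

```python
# Illustrative sketch of multidimensional-channel-information pruning.
# NOT the authors' implementation: k and keep_ratio are fixed here, whereas
# the paper tunes the number of cluster centers with differential evolution.
import numpy as np
from sklearn.cluster import KMeans


def channel_descriptors(weight: np.ndarray) -> np.ndarray:
    """weight: (out_channels, in_channels, kH, kW) conv kernel.
    Returns (out_channels, 3): [expectation, variance, cosine distance]."""
    flat = weight.reshape(weight.shape[0], -1)               # one vector per output channel
    mean = flat.mean(axis=1)                                  # overall expectation
    var = flat.var(axis=1)                                    # variance
    unit = np.ones(flat.shape[1]) / np.sqrt(flat.shape[1])    # unit-vector direction
    cos = flat @ unit / (np.linalg.norm(flat, axis=1) + 1e-12)
    cos_dist = 1.0 - cos                                      # cosine distance from the unit vector
    return np.stack([mean, var, cos_dist], axis=1)


def crowding_distance(points: np.ndarray) -> np.ndarray:
    """NSGA-II-style crowding distance over each descriptor dimension."""
    n, d = points.shape
    dist = np.zeros(n)
    for j in range(d):
        order = np.argsort(points[:, j])
        span = points[order[-1], j] - points[order[0], j]
        dist[order[0]] = dist[order[-1]] = np.inf             # boundary points are always kept
        if span == 0 or n < 3:
            continue
        gaps = (points[order[2:], j] - points[order[:-2], j]) / span
        dist[order[1:-1]] += gaps
    return dist


def select_channels(weight: np.ndarray, keep_ratio: float = 0.6, k: int = 4) -> np.ndarray:
    """Return indices of the channels to keep for one convolutional layer."""
    desc = channel_descriptors(weight)
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(desc)
    keep = []
    for c in range(k):
        idx = np.where(labels == c)[0]
        if len(idx) == 0:
            continue
        n_keep = max(1, int(round(keep_ratio * len(idx))))
        cd = crowding_distance(desc[idx])
        keep.extend(idx[np.argsort(-cd)[:n_keep]])            # keep the most widely spread channels
    return np.sort(np.array(keep))


# Example: prune a randomly initialized 64-channel 3x3 conv layer.
rng = np.random.default_rng(0)
w = rng.normal(size=(64, 32, 3, 3))
kept = select_channels(w, keep_ratio=0.6, k=4)
print(f"keeping {len(kept)}/64 channels")
```

Here `keep_ratio` is a stand-in for whatever layer-wise sparsity target the paper's evolutionary search would produce; the point of the sketch is only the flow from channel statistics to clustering to crowding-distance-based selection.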

Source journal

Neural Processing Letters (Engineering & Technology – Computer Science: Artificial Intelligence)

CiteScore: 4.90
Self-citation rate: 12.90%
Articles published per year: 392
Review time: 2.8 months
Journal description: Neural Processing Letters is an international journal publishing research results and innovative ideas on all aspects of artificial neural networks. Coverage includes theoretical developments, biological models, new formal modes, learning, applications, software and hardware developments, and prospective researches. The journal promotes fast exchange of information in the community of neural network researchers and users. The resurgence of interest in the field of artificial neural networks since the beginning of the 1980s is coupled to tremendous research activity in specialized or multidisciplinary groups. Research, however, is not possible without good communication between people and the exchange of information, especially in a field covering such different areas; fast communication is also a key aspect, and this is the reason for Neural Processing Letters.