Exploiting similarity-induced redundancies in correlation topology for channel pruning in deep convolutional neural networks

Q2 Computer Science
J. Liu, H. Shao, X. Deng, Y. T. Jiang
DOI: 10.1080/1206212X.2023.2218061
Journal: International Journal of Computers and Applications, vol. 95, no. 1, pp. 379–390
Published: 2023-05-04 (Journal Article)
JCR: Q2, Computer Science
Citations: 0

Abstract

The paper addresses the high computational cost of convolutional neural networks (CNNs) in real-world applications, which stems from their complex structure, primarily in the hidden layers. To overcome this issue, the paper proposes a novel channel pruning technique that leverages the correlation topology of the feature maps generated by each CNN layer to construct a network with fewer nodes, significantly reducing computational cost. Redundant channels exhibit a high degree of topological similarity, and their number tends to grow as the number of network layers rises. Removing the channels corresponding to highly correlated feature maps retains the 'base' set of features needed by subsequent layers. The proposed channel pruning technique thus provides a promising approach to reducing the computational cost of deep convolutional neural networks while maintaining high performance. By designing a network structure optimized for specific input data types, the method yields more efficient and effective machine learning models. The pruning operation requires fine-tuning to restore network performance; experiments on X-ray, chest CT, and MNIST images show that the pruned network can eliminate approximately 80% of redundant channels with minimal performance deterioration, retaining 99.2% of the original CNN's performance.
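The core idea described above, dropping channels whose feature maps are highly correlated with an already-kept channel, can be illustrated with a minimal sketch. This is not the paper's exact algorithm; the greedy selection order, the flattening of each map, and the `threshold` parameter are assumptions made for illustration:

```python
import numpy as np

def prune_redundant_channels(feature_maps, threshold=0.9):
    """Greedily keep a 'base' set of channels: a channel is dropped when the
    absolute correlation of its flattened feature map with any already-kept
    channel exceeds the threshold (a hypothetical criterion, not the paper's
    exact topology measure)."""
    flat = feature_maps.reshape(feature_maps.shape[0], -1)
    corr = np.abs(np.corrcoef(flat))           # C x C correlation magnitudes
    keep = []
    for c in range(flat.shape[0]):
        if all(corr[c, k] < threshold for k in keep):
            keep.append(c)
    return keep

# Toy 4x4 feature maps for four channels of one layer's output:
ch0 = np.arange(16.0).reshape(4, 4)            # base channel
ch1 = 2.0 * ch0 + 1.0                          # redundant: corr = +1 with ch0
ch2 = -ch0                                     # redundant: corr = -1 with ch0
ch3 = np.tile([1.0, -1.0], 8).reshape(4, 4)    # nearly uncorrelated with ch0
maps = np.stack([ch0, ch1, ch2, ch3])
print(prune_redundant_channels(maps))          # -> [0, 3]
```

Channels 1 and 2 are affine copies of channel 0, so only the base channel and the uncorrelated channel survive; in a real network the kept indices would then be used to slice the layer's weight tensor, followed by fine-tuning as the abstract notes.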
Source journal

International Journal of Computers and Applications (Computer Science – Computer Graphics and Computer-Aided Design)
CiteScore: 4.70
Self-citation rate: 0.00%
Articles per year: 20
Journal description: The International Journal of Computers and Applications (IJCA) is a unique platform for publishing novel ideas, research outcomes, and fundamental advances in all aspects of Computer Science, Computer Engineering, and Computer Applications. It is a peer-reviewed international journal whose vision is to provide the academic and industrial community with a platform for presenting original research ideas and applications. In addition to regular research papers within its scope, IJCA welcomes four special types of papers: (a) papers whose results can be easily reproduced; for such papers, the authors will be asked to upload "instructions for reproduction", possibly with the source code or stable URLs from which the code can be downloaded. (b) Papers with negative results; for such papers, the experimental setting and negative results must be presented in detail, and the importance of the negative results to the research community must be clearly explained. The rationale behind this kind of paper is that it helps researchers choose correct approaches and avoid already-explored failed ones. (c) Detailed reports, case studies, and literature review articles about innovative software/hardware, new technology, high-impact computer applications, and future developments, with sufficient background and subject coverage. (d) Special-issue papers focusing on a particular theme of significant importance, or papers selected from a relevant conference with sufficient improvement and new material to differentiate them from the versions published in the conference proceedings.