{"title":"利用相关拓扑的相似性诱导冗余进行深度卷积神经网络的通道剪枝","authors":"J. Liu, H. Shao, X. Deng, Y. T. Jiang","doi":"10.1080/1206212X.2023.2218061","DOIUrl":null,"url":null,"abstract":"The paper discusses the high computational costs associated with convolutional neural networks (CNNs) in real-world applications due to their complex structure, primarily in hidden layers. To overcome this issue, the paper proposes a novel channel pruning technique that leverages the correlation topology of feature maps generated by each CNNs layer to construct a network with fewer nodes, reducing computational costs significantly. Redundant channels exhibit a high degree of topological similarity and tend to increase as the number of network layers rises. Removing the channel corresponding to highly correlated feature maps allows retrieval of the ‘base’ set of characteristics needed by subsequent layers. The proposed channel pruning technique provides a promising approach to reducing the computational costs of deep convolutional neural networks while maintaining high performance levels. By designing a network structure optimized for specific input data types, the method results in more efficient and effective machine learning models. The pruning operation requires fine-tuning to optimize network performance, and experiments using X-ray, chest CT, and MNIST images show that the pruned network can eliminate approximately 80% of redundant channels with minimal performance deterioration (maintaining original CNNs performance at 99.2%).","PeriodicalId":39673,"journal":{"name":"International Journal of Computers and Applications","volume":"95 1","pages":"379 - 390"},"PeriodicalIF":0.0000,"publicationDate":"2023-05-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Exploiting similarity-induced redundancies in correlation topology for channel pruning in deep convolutional neural networks\",\"authors\":\"J. 
Liu, H. Shao, X. Deng, Y. T. Jiang\",\"doi\":\"10.1080/1206212X.2023.2218061\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The paper discusses the high computational costs associated with convolutional neural networks (CNNs) in real-world applications due to their complex structure, primarily in hidden layers. To overcome this issue, the paper proposes a novel channel pruning technique that leverages the correlation topology of feature maps generated by each CNNs layer to construct a network with fewer nodes, reducing computational costs significantly. Redundant channels exhibit a high degree of topological similarity and tend to increase as the number of network layers rises. Removing the channel corresponding to highly correlated feature maps allows retrieval of the ‘base’ set of characteristics needed by subsequent layers. The proposed channel pruning technique provides a promising approach to reducing the computational costs of deep convolutional neural networks while maintaining high performance levels. By designing a network structure optimized for specific input data types, the method results in more efficient and effective machine learning models. 
The pruning operation requires fine-tuning to optimize network performance, and experiments using X-ray, chest CT, and MNIST images show that the pruned network can eliminate approximately 80% of redundant channels with minimal performance deterioration (maintaining original CNNs performance at 99.2%).\",\"PeriodicalId\":39673,\"journal\":{\"name\":\"International Journal of Computers and Applications\",\"volume\":\"95 1\",\"pages\":\"379 - 390\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-05-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Computers and Applications\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1080/1206212X.2023.2218061\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"Computer Science\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Computers and Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/1206212X.2023.2218061","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"Computer Science","Score":null,"Total":0}
Exploiting similarity-induced redundancies in correlation topology for channel pruning in deep convolutional neural networks
The paper addresses the high computational cost of convolutional neural networks (CNNs) in real-world applications, which stems from their complex structure, primarily in the hidden layers. To overcome this issue, the paper proposes a novel channel pruning technique that leverages the correlation topology of the feature maps generated by each CNN layer to construct a network with fewer nodes, significantly reducing computational cost. Redundant channels exhibit a high degree of topological similarity, and their number tends to grow as the network deepens. Removing the channels corresponding to highly correlated feature maps recovers the 'base' set of characteristics needed by subsequent layers. The proposed technique thus offers a promising way to reduce the computational cost of deep CNNs while maintaining high performance. By designing a network structure optimized for specific input data types, the method yields more efficient and effective machine learning models. The pruning operation requires fine-tuning to restore network performance, and experiments on X-ray, chest CT, and MNIST images show that the pruned network can eliminate approximately 80% of redundant channels with minimal performance deterioration, retaining 99.2% of the original CNN's performance.
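The core idea of correlation-based channel pruning can be illustrated with a minimal sketch. This is an assumption on my part, not the authors' exact algorithm: it computes pairwise Pearson correlations between the flattened feature maps of one layer and greedily keeps only channels that are not highly correlated with an already-kept channel. The function name, threshold, and greedy keep-first strategy are all illustrative choices.

```python
import numpy as np

def prune_redundant_channels(feature_maps, threshold=0.95):
    """Given one layer's feature maps of shape (C, H, W), return the
    indices of channels to keep. A channel is dropped when its flattened
    activations correlate (in absolute value) above `threshold` with a
    channel that has already been kept."""
    c = feature_maps.shape[0]
    flat = feature_maps.reshape(c, -1)
    corr = np.corrcoef(flat)  # (C, C) pairwise Pearson correlations
    keep = []
    for i in range(c):
        # Keep channel i only if it is not redundant with any kept channel.
        if all(abs(corr[i, j]) < threshold for j in keep):
            keep.append(i)
    return keep

# Example: channel 1 is a scaled copy of channel 0, so it is pruned;
# channel 2 is independent noise, so it survives.
rng = np.random.default_rng(0)
base = rng.standard_normal((8, 8))
fmaps = np.stack([base, 2.0 * base, rng.standard_normal((8, 8))])
print(prune_redundant_channels(fmaps))  # → [0, 2]
```

In a real pruning pipeline the kept indices would be used to slice the layer's weight tensor (and the next layer's input channels), followed by the fine-tuning step the abstract describes.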
Journal introduction:
The International Journal of Computers and Applications (IJCA) is a unique platform for publishing novel ideas, research outcomes, and fundamental advances in all aspects of Computer Science, Computer Engineering, and Computer Applications. This is a peer-reviewed international journal with a vision to provide the academic and industrial community a platform for presenting original research ideas and applications. In addition to regular research papers within its scope, IJCA welcomes four special types of papers: (a) Papers whose results can be easily reproduced. For such papers, the authors will be asked to upload "instructions for reproduction", possibly with the source code or stable URLs from which the code can be downloaded. (b) Papers with negative results. For such papers, the experimental setting and negative results must be presented in detail, and why the negative results matter to the research community must be explained clearly. The rationale behind this kind of paper is that it helps researchers choose the correct approaches to solve problems and avoid already-explored failed approaches. (c) Detailed reports, case studies, and literature review articles about innovative software/hardware, new technology, high-impact computer applications, and future developments, with sufficient background and subject coverage. (d) Special issue papers focusing on a particular theme of significant importance, or papers selected from a relevant conference with sufficient improvement and new material to differentiate them from the versions published in the conference proceedings.