New Pruning Method Based on DenseNet Network for Image Classification
Ruikang Ju, Ting-Yu Lin, Jen-Shiun Chiang
2021 International Conference on Technologies and Applications of Artificial Intelligence (TAAI)
DOI: 10.1109/taai54685.2021.00028 · Published: 2021-08-28
Deep neural networks have made significant progress in the field of computer vision. Recent work has shown that the depth, width, and shortcut connections of neural network architectures play a crucial role in their performance. DenseNet, one of the most advanced architectures, achieves excellent convergence speed through dense connections. However, it still has obvious shortcomings in memory usage. In this paper, we introduce two new threshold-based pruning methods, where the threshold is inspired by the concept of threshold voltage in MOSFETs. We have implemented one of these methods, which connects blocks of different depths in different ways to reduce memory usage. We name the proposed network ThresholdNet and evaluate it against other networks on two datasets (CIFAR-10 and STL-10). Experiments show that the proposed method is 60% faster than DenseNet, and 20% faster with a 10% lower error rate than HarDNet.
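To illustrate the general idea behind threshold-based pruning of dense connections, the sketch below keeps a cross-block connection only when its importance score meets a threshold, by analogy with a MOSFET conducting only above its threshold voltage. This is a minimal, hypothetical sketch: the function name, the connection-importance scores, and the 0.5 threshold are all illustrative assumptions, not the paper's actual algorithm.

```python
# Hypothetical sketch of threshold-based connection pruning.
# In a densely connected network, each block's output feeds every later
# block; pruning the weak connections reduces the feature maps that must
# be kept in memory. All names and scores here are illustrative.

def prune_dense_connections(importance, threshold):
    """Keep only the (src_block, dst_block) connections whose
    importance score is at or above the threshold."""
    return {
        (src, dst): score
        for (src, dst), score in importance.items()
        if score >= threshold
    }

# Toy example: importance of each dense connection between four blocks.
importance = {
    (0, 1): 0.9, (0, 2): 0.2, (0, 3): 0.1,
    (1, 2): 0.8, (1, 3): 0.3,
    (2, 3): 0.7,
}

kept = prune_dense_connections(importance, threshold=0.5)
print(sorted(kept))  # only the strong connections survive
```

In this toy setting the long-range, low-importance connections such as (0, 3) are dropped, which mirrors the memory-saving intent the abstract describes.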