Connection Reduction of DenseNet for Image Recognition

Ruikang Ju, Jen-Shiun Chiang, Chih-Chia Chen, Yu-Shian Lin
{"title":"Connection Reduction of DenseNet for Image Recognition","authors":"Ruikang Ju, Jen-Shiun Chiang, Chih-Chia Chen, Yu-Shian Lin","doi":"10.1109/ISPACS57703.2022.10082842","DOIUrl":null,"url":null,"abstract":"Convolutional Neural Networks increase depth by stacking convolution layers, and deeper network models perform better in image recognition. Empirical research shows that simply stacking convolution layers does not make the network train better, and skip connection (residual learning) can improve network model performance. For the image classification tasks, models with global densely connected architectures perform well in large datasets like ImageNet, but they are not suitable for small datasets such as CIFAR-10 and SVHN. Different from dense connections, we propose two new algorithms to connect layers in this paper. Baseline is a densely connected network, and the networks connected by the two new algorithms are named ShortNet1 and ShortNet2, respectively. The experimental results of image classification on CIFAR-10 and SVHN show that ShortNet1has a 5% lower test error rate and 25% faster inference time than Baseline. ShortNet2 speeds up inference time by 40% with less loss in test accuracy. 
Code and pretrained models are available at https://github.com/RuiyangJu/Connection_Reduction/","PeriodicalId":410603,"journal":{"name":"2022 International Symposium on Intelligent Signal Processing and Communication Systems (ISPACS)","volume":"28 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-08-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 International Symposium on Intelligent Signal Processing and Communication Systems (ISPACS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISPACS57703.2022.10082842","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Convolutional Neural Networks increase depth by stacking convolution layers, and deeper network models perform better in image recognition. However, empirical research shows that simply stacking convolution layers does not make the network train better, while skip connections (residual learning) can improve network model performance. For image classification tasks, models with globally dense connected architectures perform well on large datasets such as ImageNet, but they are not suitable for small datasets such as CIFAR-10 and SVHN. In contrast to dense connections, this paper proposes two new algorithms for connecting layers. The Baseline is a densely connected network, and the networks connected by the two new algorithms are named ShortNet1 and ShortNet2, respectively. Experimental results on CIFAR-10 and SVHN image classification show that ShortNet1 has a 5% lower test error rate and 25% faster inference time than the Baseline, while ShortNet2 speeds up inference by 40% with a smaller loss in test accuracy. Code and pretrained models are available at https://github.com/RuiyangJu/Connection_Reduction/
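The abstract contrasts DenseNet-style dense connectivity, where each layer receives the outputs of all preceding layers, with reduced connection schemes. The exact connection rules of ShortNet1 and ShortNet2 are not given in the abstract, so the sketch below is only an illustration: it counts the skip connections of a dense block against a hypothetical reduced scheme (each layer sees only the block input and the immediately preceding layer), showing why pruning connections shrinks the quadratic wiring cost that motivates this line of work.

```python
def dense_connections(num_layers):
    """DenseNet-style block: layer i takes the block input (index 0)
    plus the outputs of all preceding layers as its input."""
    return {i: list(range(i)) for i in range(1, num_layers + 1)}

def reduced_connections(num_layers):
    """Hypothetical reduced scheme (illustration only, not the paper's
    actual ShortNet rule): each layer sees only the block input and
    the immediately preceding layer."""
    conns = {1: [0]}
    for i in range(2, num_layers + 1):
        conns[i] = [0, i - 1]
    return conns

if __name__ == "__main__":
    L = 4
    dense = dense_connections(L)
    reduced = reduced_connections(L)
    # Dense wiring grows quadratically: L*(L+1)/2 connections.
    n_dense = sum(len(src) for src in dense.values())
    # The reduced scheme grows linearly: 2*L - 1 connections.
    n_reduced = sum(len(src) for src in reduced.values())
    print(n_dense, n_reduced)  # → 10 7
```

Fewer input connections per layer mean fewer feature maps to concatenate at each step, which is one plausible source of the inference-time savings the abstract reports.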