Effects of transfer learning for handwritten digit classification in a small training sample size situation

Y. Mitani, Naoki Yamaguchi, Y. Fujita, Y. Hamamoto
DOI: 10.1145/3582099.3582119
Published in: Proceedings of the 2022 5th Artificial Intelligence and Cloud Computing Conference
Publication date: 2022-12-17

Abstract

A deep learning approach is considered one of the most useful for image pattern recognition. Generally, deep learning requires a large number of samples; in particular, it is prone to overfitting when the number of training samples is small. However, practical pattern recognition problems commonly provide only a limited number of training samples. One way to design a deep neural network with so few training samples is transfer learning: a neural network pre-trained on a large number of samples is expected to be applicable to other pattern recognition problems, especially when few training samples are available. Developing a handwritten character classification system is difficult because handwritten characters are not always readily available and the number of samples is generally small. In this paper, we examine the effects of transfer learning for handwritten digit classification with a small number of training samples. Experimental results show that transfer learning is more effective than convolutional neural networks (CNNs) trained from scratch in classifying handwritten digits.
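The idea described in the abstract — re-using a network pre-trained on plentiful data as a feature extractor for a target task with very few labels — can be sketched in a few lines. The following is a minimal illustration only, not the paper's actual method (the paper uses pre-trained CNNs, whereas this sketch "pre-trains" a small MLP on one half of scikit-learn's 8×8 digits and transfers its hidden layer to the other half, with just 5 training samples per class; all function and variable names are the author's own for illustration):

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Load 8x8 digit images; split the classes into a "source" task (digits 0-4,
# plenty of samples) and a "target" task (digits 5-9, few samples).
X, y = load_digits(return_X_y=True)
src = y < 5
Xs, ys = X[src], y[src]
Xt, yt = X[~src], y[~src]

# "Pre-train" a small MLP on the source digits; this plays the role of the
# large-sample pre-training stage in transfer learning.
mlp = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
mlp.fit(Xs, ys)

def hidden_features(model, X):
    """Re-use the pre-trained hidden layer as a fixed feature extractor (ReLU)."""
    return np.maximum(0, X @ model.coefs_[0] + model.intercepts_[0])

# Target task: only 5 training samples per class (small-sample situation).
Xtr, Xte, ytr, yte = train_test_split(
    Xt, yt, train_size=25, stratify=yt, random_state=0)

# Transfer: a linear classifier on the pre-trained features.
clf_transfer = LogisticRegression(max_iter=1000).fit(hidden_features(mlp, Xtr), ytr)
acc_transfer = clf_transfer.score(hidden_features(mlp, Xte), yte)

# Baseline: the same classifier trained directly on raw pixels.
clf_raw = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
acc_raw = clf_raw.score(Xte, yte)

print(f"raw pixels: {acc_raw:.3f}  transferred features: {acc_transfer:.3f}")
```

With so few labels, the exact accuracies vary with the random seed; the point of the sketch is only the structure — pre-train on abundant source data, freeze the learned representation, and fit a small classifier on the scarce target data.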