Y. Mitani, Naoki Yamaguchi, Y. Fujita, Y. Hamamoto
{"title":"Effects of transfer learning for handwritten digit classification in a small training sample size situation","authors":"Y. Mitani, Naoki Yamaguchi, Y. Fujita, Y. Hamamoto","doi":"10.1145/3582099.3582119","DOIUrl":null,"url":null,"abstract":"A deep learning approach is believed to be one of the most useful for image pattern recognition. Generally, deep learning requires a large number of samples. In particular, it is prone to overlearning when the number of training samples is small. However, it is common for practical pattern recognition problems to use a limited number of training samples. One way to design a deep neural network with such a small number of training samples is to use transfer learning. Transfer learning, known to be pre-trained with a large number of samples, and its pre-trained neural networks are expected to be applicable to other pattern recognition problems, especially when the number of training samples is small. It is difficult to develop a handwritten character classification system because handwritten characters are not always readily available and the number of samples is generally small. In this paper, we examine effects of transfer learning for handwritten digit classification under a small number of training samples. 
Experimental results show that transfer learning is more effective than convolutional neural networks (CNNs) in classifying handwritten digits.","PeriodicalId":222372,"journal":{"name":"Proceedings of the 2022 5th Artificial Intelligence and Cloud Computing Conference","volume":"71 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2022 5th Artificial Intelligence and Cloud Computing Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3582099.3582119","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Deep learning is widely regarded as one of the most effective approaches to image pattern recognition. However, it generally requires a large number of training samples and is prone to overfitting when only a few are available, while practical pattern recognition problems commonly provide only a limited number of training samples. One way to design a deep neural network under such conditions is transfer learning: a network pre-trained on a large dataset is reused for other pattern recognition problems, which is expected to help especially when training samples are scarce. Developing a handwritten character classification system is difficult because handwritten characters are not always readily available and the number of samples is generally small. In this paper, we examine the effects of transfer learning for handwritten digit classification with a small number of training samples. Experimental results show that transfer learning is more effective than convolutional neural networks (CNNs) trained from scratch in classifying handwritten digits.
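The transfer-learning idea described in the abstract — reuse a network pre-trained elsewhere and fit only a small new classifier on the scarce target samples — can be sketched in a self-contained way. This is a minimal illustration, not the paper's method: the data are synthetic Gaussian blobs standing in for digit images, and a frozen random-projection ReLU layer stands in for a feature extractor pre-trained on a large source dataset; all names, dimensions, and sample sizes below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 20  # input dimensionality (hypothetical, not from the paper)

def make_data(n):
    # Two Gaussian blobs: a synthetic stand-in for a two-class digit task.
    X0 = rng.normal(loc=-1.0, scale=1.0, size=(n // 2, DIM))
    X1 = rng.normal(loc=+1.0, scale=1.0, size=(n // 2, DIM))
    y = np.array([0] * (n // 2) + [1] * (n // 2))
    return np.vstack([X0, X1]), y

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

def train_head(H, y, epochs=500, lr=0.5):
    # Logistic-regression "head" trained on fixed features H by gradient descent.
    w, b = np.zeros(H.shape[1]), 0.0
    for _ in range(epochs):
        g = sigmoid(H @ w + b) - y           # cross-entropy gradient signal
        w -= lr * H.T @ g / len(y)
        b -= lr * g.mean()
    return w, b

def accuracy(H, y, w, b):
    return float(((sigmoid(H @ w + b) > 0.5) == y).mean())

# The "pre-trained" extractor: frozen random ReLU features stand in for
# convolutional layers learned on a large source dataset. These weights
# are never updated on the target task -- only the head is trained.
W_frozen = rng.normal(size=(DIM, 64)) / np.sqrt(DIM)
features = lambda X: np.maximum(X @ W_frozen, 0.0)

# Target task with only 20 training samples, mimicking the paper's
# small-training-sample setting.
X_tr, y_tr = make_data(20)
X_te, y_te = make_data(400)

w, b = train_head(features(X_tr), y_tr)      # train the small head only
acc_transfer = accuracy(features(X_te), y_te, w, b)
print(f"transfer-style accuracy with 20 training samples: {acc_transfer:.2f}")
```

The design point the sketch isolates is the one the abstract argues: with very few target samples, fitting only a small head on top of frozen, previously learned features keeps the number of trainable parameters low, which is what mitigates overfitting relative to training a full network from scratch.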