Analysis for Self-taught and Transfer Learning Based Approaches for Emotion Recognition
Piyush Bhandari, Rakesh Kumar Bijarniya, Subhamoy Chatterjee, M. Kolekar
2018 5th International Conference on Signal Processing and Integrated Networks (SPIN), February 2018
DOI: 10.1109/SPIN.2018.8474199
Citations: 5
Abstract
Using a deep learning approach for any classification task demands the availability of a large labeled dataset. Such datasets are not only hard to find but also quite tedious to generate, whereas unlabeled and unorganized information is widely available on the World Wide Web. In emotion or expression recognition especially, quality datasets that are organized and freely available are very limited in number. To tackle this problem of small dataset size, we analyze self-taught learning and transfer learning for expression classification, along with the extent to which learned weights are transferable to the expression classification task. The base model for both types of learning is trained on the CIFAR-10 dataset. We do not assume that the base data follows the same class labels or generative distribution as the test data. To evaluate our approach we use the JAFFE dataset and draw inferences from the results obtained. We document that self-taught learning forces the neural network to settle at a local minimum rather than the global minimum. Transfer learning outperforms self-taught learning, and we also observe the correlation between the layers of a deep network when transferring the weights in a layer-wise fashion.
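The layer-wise weight transfer described in the abstract can be sketched as copying the first few layers of a base network (as if trained on CIFAR-10) into a new network for the 7-class JAFFE expression task, freezing the copied layers, and re-initializing the rest. This is a hypothetical minimal illustration, not the authors' code; the layer sizes and the `build_target` helper are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical base model: one weight matrix per layer, standing in for
# a network pre-trained on CIFAR-10 (the last layer is its 10-class head).
base_weights = [rng.standard_normal((32, 64)),   # layer 1
                rng.standard_normal((64, 64)),   # layer 2
                rng.standard_normal((64, 10))]   # CIFAR-10 classification head

def build_target(base, n_transfer, n_classes=7):
    """Copy the first n_transfer layers from the base model (frozen),
    re-initialize the remaining layers, and attach a fresh head for
    the n_classes-way target task (7 expressions for JAFFE)."""
    target, frozen = [], []
    for i, w in enumerate(base[:-1]):          # skip the base head
        if i < n_transfer:
            target.append(w.copy())            # transferred weights
            frozen.append(True)                # kept fixed during fine-tuning
        else:
            target.append(rng.standard_normal(w.shape) * 0.01)
            frozen.append(False)
    # New classification head, trained from scratch on the target task.
    target.append(rng.standard_normal((base[-1].shape[0], n_classes)) * 0.01)
    frozen.append(False)
    return target, frozen

target, frozen = build_target(base_weights, n_transfer=1)
print([w.shape for w in target])  # [(32, 64), (64, 64), (64, 7)]
print(frozen)                     # [True, False, False]
```

Varying `n_transfer` from 0 up to the full depth is one way to probe how far weights remain transferable between tasks, which is the kind of layer-wise analysis the paper reports.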