{"title":"A Proposed Algorithm to Perform Few Shot Learning with different sampling sizes","authors":"Kashvi Dedhia, Mallika Konkar, Dhruvil Shah, Prachi Tawde","doi":"10.1109/ICAECC54045.2022.9716609","DOIUrl":null,"url":null,"abstract":"Quality datasets for model training are often scarce: the available data may be unlabelled, or only a few samples may exist for some classes. In such cases, few-shot learning comes in handy. There are two approaches to few-shot learning: the data-level approach and the parameter-level approach. This paper analyses the effect of the number of training samples using the parameter-level approach. Few-shot learning is performed on two classes. Meta-transfer learning is used, initialising the parameters of a convolutional neural network (CNN) learner model from a model pre-trained on ImageNet. The experiment is run incrementally on datasets of various sizes, and the results and performance of all the models are compared to those obtained when the entire dataset is used, along with the advantages of using few-shot learning. Few-shot learning has found applications in a wide range of fields, mainly computer vision and natural language processing.","PeriodicalId":199351,"journal":{"name":"2022 IEEE Fourth International Conference on Advances in Electronics, Computers and Communications (ICAECC)","volume":"21 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-01-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE Fourth International Conference on Advances in Electronics, Computers and Communications (ICAECC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICAECC54045.2022.9716609","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Quality datasets for model training are often scarce: the available data may be unlabelled, or only a few samples may exist for some classes. In such cases, few-shot learning comes in handy. There are two approaches to few-shot learning: the data-level approach and the parameter-level approach. This paper analyses the effect of the number of training samples using the parameter-level approach. Few-shot learning is performed on two classes. Meta-transfer learning is used, initialising the parameters of a convolutional neural network (CNN) learner model from a model pre-trained on ImageNet. The experiment is run incrementally on datasets of various sizes, and the results and performance of all the models are compared to those obtained when the entire dataset is used, along with the advantages of using few-shot learning. Few-shot learning has found applications in a wide range of fields, mainly computer vision and natural language processing.
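To make the few-shot setting concrete, here is a minimal, hypothetical sketch, not the authors' meta-transfer pipeline: a nearest-class-mean ("prototype") classifier over two classes. The hand-written feature vectors stand in for the embeddings a pretrained CNN backbone would produce; all names and values are illustrative assumptions.

```python
# Illustrative only: a 2-class few-shot classifier using class prototypes.
# The feature vectors below are placeholders for CNN embeddings.

def class_prototype(support_vectors):
    """Mean of the support feature vectors for one class."""
    dims = len(support_vectors[0])
    n = len(support_vectors)
    return [sum(v[d] for v in support_vectors) / n for d in range(dims)]

def classify(query, prototypes):
    """Assign the query to the class with the nearest prototype (squared Euclidean)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(prototypes, key=lambda cls: dist2(query, prototypes[cls]))

# A few labelled samples per class form the "few-shot" support set.
support = {
    "class_a": [[1.0, 0.2], [0.9, 0.1]],
    "class_b": [[0.1, 1.0], [0.2, 0.9]],
}
prototypes = {cls: class_prototype(vecs) for cls, vecs in support.items()}
print(classify([0.95, 0.15], prototypes))  # -> class_a
```

In a parameter-level approach like the paper's, the embedding function itself would come from a CNN initialised with ImageNet-pretrained weights and then adapted; this sketch only shows the classification step on top of fixed features.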