Learn from one image: Dynamic One-shot learning based on parameter generation

N. S. Kumar, M. Phirke, Anupriya Jayapal
DOI: 10.1109/ISPACS51563.2021.9651100
Published in: 2021 International Symposium on Intelligent Signal Processing and Communication Systems (ISPACS), 16 November 2021
Citations: 0

Abstract

State-of-the-art deep learning algorithms are usually pre-trained on datasets containing millions of images. Adding new classes to these pre-trained networks requires a large number of images for each new class, and assembling such large-scale datasets usually demands considerable effort and time. The aim of this paper is to develop a novel deep-learning-based one-shot learning framework that achieves state-of-the-art results on new classes (one-shot classes) for which only one image each is available during the training phase. Adding these new one-shot classes should not degrade the model's performance on the pre-trained classes. A multi-layer transformation function is proposed for one-shot learning, in which the activations of a class are converted into its corresponding classifier parameters. The model is pre-trained on large-scale base classes and adapts to new classes with zero additional training. Experiments were conducted on open-source datasets such as MiniImageNet and Pascal-VOC using an Nvidia K80 GPU. The model achieves an accuracy of 93.14% on the large-scale base classes and 64.69% on the one-shot classes, more than 3% better than the current state-of-the-art models.
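The abstract describes generating a new class's classifier parameters directly from the activations of its single training image. The paper's exact multi-layer transformation function is not reproduced here, so the following is a minimal sketch of the general idea under stated assumptions: a stack of hypothetical learned matrices with a tanh nonlinearity maps the one-shot activation to an L2-normalized weight vector, which is simply appended to the pre-trained base-class weights. All names (`generate_class_weights`, `transforms`, etc.) and the specific nonlinearity are illustrative, not the authors' implementation.

```python
import numpy as np

def l2_normalize(v, eps=1e-8):
    """Scale a vector to unit length (avoids division by zero)."""
    return v / (np.linalg.norm(v) + eps)

def generate_class_weights(activation, transform_matrices):
    # Pass the one-shot class's activation through a stack of
    # (hypothetical) transformation layers to obtain its weight vector.
    h = activation
    for W in transform_matrices:
        h = np.tanh(W @ h)  # assumed nonlinearity; the paper does not specify one
    return l2_normalize(h)

def classify(x, base_weights, oneshot_weights):
    # Score against base classes and generated one-shot classes jointly,
    # so adding new classes never retrains the base classifier.
    W = np.vstack([base_weights, oneshot_weights])
    scores = W @ l2_normalize(x)
    return int(np.argmax(scores))

rng = np.random.default_rng(0)
d = 16  # toy embedding dimension
base_weights = np.stack([l2_normalize(rng.normal(size=d)) for _ in range(5)])
transforms = [rng.normal(size=(d, d)) * 0.1 for _ in range(2)]  # "multi-layer"
one_shot_activation = rng.normal(size=d)  # activation of the single new image
w_new = generate_class_weights(one_shot_activation, transforms)
pred = classify(rng.normal(size=d), base_weights, w_new[None, :])
```

The key property sketched here is that adapting to a new class is a forward pass through the transformation stack ("zero training"), leaving the base-class weights untouched.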