EMAS: Efficient Meta Architecture Search for Few-Shot Learning

Dongkai Liu, Jiaxing Li, Honglong Chen, Baodi Liu, Xiaoping Lu, Weifeng Liu
{"title":"EMAS: Efficient Meta Architecture Search for Few-Shot Learning","authors":"Dongkai Liu, Jiaxing Li, Honglong Chen, Baodi Liu, Xiaoping Lu, Weifeng Liu","doi":"10.1109/ICTAI56018.2022.00099","DOIUrl":null,"url":null,"abstract":"With the progress of few-shot learning, it has been scaled to many domains in the real world which have few labeled data, such as image classification and object detection. Many efforts for data embedding and feature combination have been made by designing a fixed neural architecture that can also be extended to variable and adaptive neural architectures for better performance. Recent works leverage neural architecture search technique to automatically design networks for few-shot learning but it requires vast computation costs and GPU memory requirements. This work introduces EMAS, an efficient method to speed up the searching process for few-shot learning. Specifically, we build a supernet to combine all candidate operations and then adopt gradient-based methods to search. Instead of training the whole supernet, we adopt Gumbel reparameterization technique to sample and activate a small subset of operations. EMAS handles a single path in a novel task adapted with just a few steps and time. A novel task only needs to learn fewer parameters and compute less content. During meta-testing, the task can well adapt to the network architecture although only with a few iterations. Empirical results show that EMAS yields a fair improvement in accuracy on the standard few-shot classification benchmark and is five times smaller in time.","PeriodicalId":354314,"journal":{"name":"2022 IEEE 34th International Conference on Tools with Artificial Intelligence (ICTAI)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE 34th International Conference on Tools with Artificial Intelligence (ICTAI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICTAI56018.2022.00099","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

As few-shot learning has progressed, it has been applied to many real-world domains with scarce labeled data, such as image classification and object detection. Much effort on data embedding and feature combination has gone into designing fixed neural architectures, which can also be extended to variable, adaptive architectures for better performance. Recent works leverage neural architecture search (NAS) to automatically design networks for few-shot learning, but this incurs vast computation costs and GPU memory demands. This work introduces EMAS, an efficient method that speeds up the search process for few-shot learning. Specifically, we build a supernet that combines all candidate operations and then search it with gradient-based methods. Instead of training the whole supernet, we adopt the Gumbel reparameterization technique to sample and activate only a small subset of operations. For a novel task, EMAS adapts a single path with just a few steps in little time, so the task learns fewer parameters and requires less computation. During meta-testing, the task adapts well to the network architecture within only a few iterations. Empirical results show that EMAS yields a fair accuracy improvement on standard few-shot classification benchmarks while being five times faster.
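The efficiency lever described above is the Gumbel reparameterization: rather than executing every candidate operation on an edge and mixing their outputs (as DARTS-style supernets do), a single operation is sampled and activated per forward pass. Below is a minimal, illustrative PyTorch sketch of such single-path sampling. The class name, candidate operation set, and hyperparameters are assumptions for exposition, not the authors' released code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """One supernet edge holding all candidate operations (illustrative)."""

    def __init__(self, channels: int):
        super().__init__()
        # Hypothetical candidate set; the paper's actual search space may differ.
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.Conv2d(channels, channels, kernel_size=5, padding=2),
            nn.MaxPool2d(kernel_size=3, stride=1, padding=1),
            nn.Identity(),
        ])
        # Architecture logits, optimized jointly by gradient descent.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x: torch.Tensor, tau: float = 1.0) -> torch.Tensor:
        # Straight-through Gumbel-softmax: the forward pass picks exactly
        # one operation (a single path), while gradients still reach the
        # architecture logits through the soft relaxation.
        weights = F.gumbel_softmax(self.alpha, tau=tau, hard=True)
        idx = int(weights.argmax())
        # Only the sampled operation is executed, so the remaining
        # candidates cost no compute or activation memory on this step.
        return weights[idx] * self.ops[idx](x)

# Toy usage: each forward call activates one sampled path.
edge = MixedOp(channels=16)
out = edge(torch.randn(2, 16, 8, 8))
```

Because `hard=True` yields a one-hot sample with a straight-through gradient, only one candidate runs in the forward pass, yet all architecture logits `alpha` receive gradients. This is the mechanism that keeps search time and GPU memory low, and it leaves a novel task only a single path's parameters to adapt at meta-test time.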