Gradient-Based Meta-Learning Using Adaptive Multiple Loss Weighting and Homoscedastic Uncertainty

Lin Ding, Wenfeng Shen, Weijia Lu, Peng Liu, Shengbo Chen, Sisi Chen
{"title":"Gradient-Based Meta-Learning Using Adaptive Multiple Loss Weighting and Homoscedastic Uncertainty","authors":"Lin Ding, Wenfeng Shen, Weijia Lu, Peng Liu, Shengbo Chen, Sisi Chen","doi":"10.1109/ICCECE58074.2023.10135472","DOIUrl":null,"url":null,"abstract":"Model-agnostic meta-learning schemes adopt gradient descent to learn task commonalities and obtain the initialization parameters of the meta-model to rapidly adjust to new tasks with only a few training samples. Therefore, such schemes have become the mainstream meta-learning approach for studying few shot learning problems. This study mainly addresses the challenge of task uncertainty in few-shot learning and proposes an improved meta-learning approach, which first enables a task specific learner to select the initial parameter that minimize the loss of a new task, then generates weights by comparing meta-loss differences, and finally leads into the homoscedastic uncertainty of the task to weight the diverse losses. Our model conducts superior on few shot learning task than previous meta learning approach and improves its robustness regardless of the initial learning rates and query sets.","PeriodicalId":120030,"journal":{"name":"2023 3rd International Conference on Consumer Electronics and Computer Engineering (ICCECE)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2023-01-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 3rd International Conference on Consumer Electronics and Computer Engineering (ICCECE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCECE58074.2023.10135472","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Model-agnostic meta-learning schemes adopt gradient descent to learn task commonalities and obtain initialization parameters for the meta-model that can rapidly adapt to new tasks with only a few training samples. Such schemes have therefore become the mainstream meta-learning approach to few-shot learning problems. This study addresses the challenge of task uncertainty in few-shot learning and proposes an improved meta-learning approach that first enables a task-specific learner to select initial parameters that minimize the loss on a new task, then generates weights by comparing meta-loss differences, and finally introduces the homoscedastic uncertainty of each task to weight the diverse losses. Our model outperforms previous meta-learning approaches on few-shot learning tasks and is more robust to the choice of initial learning rate and query set.
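
The snippet below is a minimal sketch of the pipeline the abstract describes, assuming a PyTorch-style implementation: a MAML-style inner loop adapts the shared initialization to each task, and the per-task query ("meta") losses are combined under learned homoscedastic uncertainty, in the form popularized by Kendall et al., L_total = Σ_i exp(−s_i) · L_i + s_i with s_i = log σ_i². The abstract does not specify how the weights from meta-loss differences are computed, so that step is omitted here; all function and variable names (inner_adapt, meta_step, log_vars, inner_lr, ...) are illustrative assumptions, not the paper's actual code.

```python
import torch
import torch.nn as nn

def inner_adapt(model, loss_fn, support_x, support_y, inner_lr=0.01, steps=1):
    """Take gradient steps on a task's support set and return the adapted
    parameters as a dict, keeping the graph for second-order gradients."""
    params = dict(model.named_parameters())
    for _ in range(steps):
        logits = torch.func.functional_call(model, params, (support_x,))
        grads = torch.autograd.grad(loss_fn(logits, support_y),
                                    list(params.values()), create_graph=True)
        params = {name: p - inner_lr * g
                  for (name, p), g in zip(params.items(), grads)}
    return params

def meta_step(model, log_vars, tasks, loss_fn, meta_opt, inner_lr=0.01):
    """One outer-loop update over a batch of tasks. Each task's query loss
    is weighted by exp(-s_i), so tasks with high learned uncertainty are
    down-weighted; the +s_i term keeps s_i from growing without bound."""
    meta_opt.zero_grad()
    total = 0.0
    for i, (sx, sy, qx, qy) in enumerate(tasks):
        adapted = inner_adapt(model, loss_fn, sx, sy, inner_lr)
        query_logits = torch.func.functional_call(model, adapted, (qx,))
        task_loss = loss_fn(query_logits, qy)
        total = total + torch.exp(-log_vars[i]) * task_loss + log_vars[i]
    total.backward()
    meta_opt.step()
    return float(total)

# Hypothetical usage: a 5-way classifier meta-trained on batches of 4 tasks.
model = nn.Sequential(nn.Flatten(), nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 5))
log_vars = nn.Parameter(torch.zeros(4))  # one learned log-variance s_i per task
meta_opt = torch.optim.Adam(list(model.parameters()) + [log_vars], lr=1e-3)
```

Because the log-variances are trained jointly with the meta-parameters by the same outer optimizer, the loss weighting adapts automatically rather than being hand-tuned, which is consistent with the robustness to initial learning rates and query sets claimed in the abstract.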