Reliable Few-Shot Learning Under Dual Noises

Impact Factor: 18.6
Ji Zhang; Jingkuan Song; Lianli Gao; Nicu Sebe; Heng Tao Shen
IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 47, no. 10, pp. 9005–9022
DOI: 10.1109/TPAMI.2025.3584051 · Published 2025-06-30 · Citations: 0

Abstract

Recent advances in model pre-training give rise to task adaptation-based few-shot learning (FSL), where the goal is to adapt a pre-trained, task-agnostic model to capture task-specific knowledge from a few labeled support samples of the target task. Nevertheless, existing approaches may still fail in the open world due to the inevitable in-distribution (ID) and out-of-distribution (OOD) noise in both the support and query samples of the target task. With only limited support samples available, i) the adverse effect of the dual noises can be severely amplified during task adaptation, and ii) the adapted model can produce unreliable predictions on query samples in the presence of the dual noises. In this work, we propose DEnoised Task Adaptation (DETA++) for reliable FSL. DETA++ uses a Contrastive Relevance Aggregation (CoRA) module to calculate image and region weights for support samples, based on which a clean prototype loss and a noise entropy maximization loss are proposed to achieve noise-robust task adaptation. Additionally, DETA++ employs a memory bank to store and refine clean regions for each inner-task class, based on which a Local Nearest Centroid Classifier (LocalNCC) is devised to yield noise-robust predictions on query samples. Moreover, DETA++ utilizes an Intra-class Region Swapping (IntraSwap) strategy to rectify ID class prototypes during task adaptation, enhancing the model's robustness to the dual noises. Extensive experiments demonstrate the effectiveness and flexibility of DETA++.
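The weighted-prototype and nearest-centroid ideas in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the helper names, the use of cosine similarity, and the simple weighted averaging are all assumptions; in DETA++ the weights would come from the CoRA module and the features from stored clean regions, neither of which is reproduced here.

```python
import numpy as np

def weighted_centroids(features, labels, weights, num_classes):
    """Build one prototype per class from weighted support embeddings.

    features: (N, D) support region/image embeddings
    labels:   (N,) integer class labels
    weights:  (N,) per-sample relevance weights (assumed to come from a
              scoring module, e.g. CoRA, that down-weights noisy samples)
    """
    centroids = np.zeros((num_classes, features.shape[1]))
    for c in range(num_classes):
        mask = labels == c
        w = weights[mask][:, None]
        # Weighted mean: noisy samples with small weights barely move the prototype.
        centroids[c] = (w * features[mask]).sum(axis=0) / max(w.sum(), 1e-8)
    return centroids

def nearest_centroid_predict(queries, centroids):
    """Assign each query embedding to its cosine-nearest class prototype."""
    q = queries / np.linalg.norm(queries, axis=1, keepdims=True)
    c = centroids / np.linalg.norm(centroids, axis=1, keepdims=True)
    return (q @ c.T).argmax(axis=1)
```

Under this reading, robustness comes from the weights: an OOD support image assigned a near-zero weight contributes almost nothing to its class prototype, so query predictions are driven by the clean samples.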