Wenjing Yang, Haoang Chi, Yibing Zhan, Bowen Hu, Xiaoguang Ren, Dapeng Tao, Long Lan
{"title":"NT-FAN:一种简单而有效的耐噪少射自适应网络","authors":"Wenjing Yang , Haoang Chi , Yibing Zhan , Bowen Hu , Xiaoguang Ren , Dapeng Tao , Long Lan","doi":"10.1016/j.artint.2025.104363","DOIUrl":null,"url":null,"abstract":"<div><div><em>Few-shot domain adaptation</em> (FDA) aims to train a target model with <em>clean</em> labeled data from the source domain and <em>few</em> labeled data from the target domain. Given a limited annotation budget, source data may contain many noisy labels, which can detrimentally impact the performance of models in real-world applications. This problem setting is denoted as <em>wildly few-shot domain adaptation</em> (WFDA), simultaneously taking care of label noise and data shortage. While previous studies have achieved some success, they typically rely on multiple adaptation models to collaboratively filter noisy labels, resulting in substantial computational overhead. To address WFDA more simply and elegantly, we offer a theoretical analysis of this problem and propose a comprehensive upper bound for the excess risk on the target domain. Our theoretical result reveals that correct domain-invariant representations can be obtained even in the presence of source noise and limited target data without incurring additional costs. In response, we propose a simple yet effective WFDA method, referred to as <em>noise-tolerant few-shot adaptation network</em> (NT-FAN). Experiments demonstrate that our method significantly outperforms all the state-of-the-art competitors while maintaining a more <em>lightweight</em> architecture. Notably, NT-FAN consistently exhibits robust performance when dealing with more realistic and intractable source noise (e.g., instance-dependent label noise) and severe source noise (e.g., a 40% noise rate) in the source domain.</div></div>","PeriodicalId":8434,"journal":{"name":"Artificial Intelligence","volume":"346 ","pages":"Article 104363"},"PeriodicalIF":4.6000,"publicationDate":"2025-05-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"NT-FAN: A simple yet effective noise-tolerant few-shot adaptation network\",\"authors\":\"Wenjing Yang , Haoang Chi , Yibing Zhan , Bowen Hu , Xiaoguang Ren , Dapeng Tao , Long Lan\",\"doi\":\"10.1016/j.artint.2025.104363\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div><em>Few-shot domain adaptation</em> (FDA) aims to train a target model with <em>clean</em> labeled data from the source domain and <em>few</em> labeled data from the target domain. Given a limited annotation budget, source data may contain many noisy labels, which can detrimentally impact the performance of models in real-world applications. This problem setting is denoted as <em>wildly few-shot domain adaptation</em> (WFDA), simultaneously taking care of label noise and data shortage. While previous studies have achieved some success, they typically rely on multiple adaptation models to collaboratively filter noisy labels, resulting in substantial computational overhead. To address WFDA more simply and elegantly, we offer a theoretical analysis of this problem and propose a comprehensive upper bound for the excess risk on the target domain. Our theoretical result reveals that correct domain-invariant representations can be obtained even in the presence of source noise and limited target data without incurring additional costs. 
In response, we propose a simple yet effective WFDA method, referred to as <em>noise-tolerant few-shot adaptation network</em> (NT-FAN). Experiments demonstrate that our method significantly outperforms all the state-of-the-art competitors while maintaining a more <em>lightweight</em> architecture. Notably, NT-FAN consistently exhibits robust performance when dealing with more realistic and intractable source noise (e.g., instance-dependent label noise) and severe source noise (e.g., a 40% noise rate) in the source domain.</div></div>\",\"PeriodicalId\":8434,\"journal\":{\"name\":\"Artificial Intelligence\",\"volume\":\"346 \",\"pages\":\"Article 104363\"},\"PeriodicalIF\":4.6000,\"publicationDate\":\"2025-05-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Artificial Intelligence\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0004370225000827\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Artificial Intelligence","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0004370225000827","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
NT-FAN: A simple yet effective noise-tolerant few-shot adaptation network
Few-shot domain adaptation (FDA) aims to train a target model with clean labeled data from the source domain and few labeled data from the target domain. Given a limited annotation budget, source data may contain many noisy labels, which can detrimentally impact the performance of models in real-world applications. This problem setting is denoted as wildly few-shot domain adaptation (WFDA), simultaneously taking care of label noise and data shortage. While previous studies have achieved some success, they typically rely on multiple adaptation models to collaboratively filter noisy labels, resulting in substantial computational overhead. To address WFDA more simply and elegantly, we offer a theoretical analysis of this problem and propose a comprehensive upper bound for the excess risk on the target domain. Our theoretical result reveals that correct domain-invariant representations can be obtained even in the presence of source noise and limited target data without incurring additional costs. In response, we propose a simple yet effective WFDA method, referred to as noise-tolerant few-shot adaptation network (NT-FAN). Experiments demonstrate that our method significantly outperforms all the state-of-the-art competitors while maintaining a more lightweight architecture. Notably, NT-FAN consistently exhibits robust performance when dealing with more realistic and intractable source noise (e.g., instance-dependent label noise) and severe source noise (e.g., a 40% noise rate) in the source domain.
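To make the WFDA problem setting concrete, the sketch below simulates the data regime the abstract describes: a large source set whose labels are corrupted at a given rate (e.g., the 40% rate mentioned above) and a target set from which only a few labeled examples per class are available. This is only an illustrative setup, not the NT-FAN method itself; all function and variable names are assumptions introduced here for the example.

```python
# Illustrative simulation of a WFDA-style data setting (hypothetical helper names).
import numpy as np

def inject_symmetric_label_noise(labels, num_classes, noise_rate, rng):
    """Flip each label to a different random class with probability `noise_rate`."""
    noisy = labels.copy()
    flip_mask = rng.random(len(labels)) < noise_rate
    for i in np.flatnonzero(flip_mask):
        choices = [c for c in range(num_classes) if c != labels[i]]
        noisy[i] = rng.choice(choices)
    return noisy

def sample_few_shot_target(labels, shots_per_class, rng):
    """Pick `shots_per_class` indices per class to mimic the few-shot labeled target split."""
    indices = []
    for c in np.unique(labels):
        class_idx = np.flatnonzero(labels == c)
        indices.extend(rng.choice(class_idx, size=shots_per_class, replace=False))
    return np.array(indices)

rng = np.random.default_rng(0)
num_classes = 10
source_labels = rng.integers(0, num_classes, size=5000)
target_labels = rng.integers(0, num_classes, size=2000)

# Source domain: plentiful data, but labels corrupted at a 40% noise rate.
noisy_source_labels = inject_symmetric_label_noise(source_labels, num_classes, 0.4, rng)
# Target domain: only a handful of labeled examples per class.
few_shot_idx = sample_few_shot_target(target_labels, shots_per_class=3, rng=rng)

print("corrupted source labels:", int((noisy_source_labels != source_labels).sum()))
print("labeled target examples:", len(few_shot_idx))
```

A WFDA method such as NT-FAN would then have to learn domain-invariant representations from `noisy_source_labels` together with the small labeled target subset indexed by `few_shot_idx`; the sketch only sets up that input regime.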
Journal introduction:
The Journal of Artificial Intelligence (AIJ) welcomes papers covering a broad spectrum of AI topics, including cognition, automated reasoning, computer vision, machine learning, and more. Papers should demonstrate advancements in AI and propose innovative approaches to AI problems. Additionally, the journal accepts papers describing AI applications, focusing on how new methods enhance performance rather than reiterating conventional approaches. In addition to regular papers, AIJ also accepts Research Notes, Research Field Reviews, Position Papers, Book Reviews, and summary papers on AI challenges and competitions.