Task augmentation via channel mixture for few-task meta-learning

Jiangdong Fan, Yuekeng Li, Jiayi Bi, Hui Xu, Jie Shao

DOI: 10.1016/j.neunet.2025.107609
Neural Networks, Volume 190, Article 107609 (published 2025-05-27; Impact Factor 6.0; JCR Q1, Computer Science, Artificial Intelligence)
Citations: 0
Abstract
Meta-learning is a promising approach for rapidly adapting to new tasks with minimal data by leveraging knowledge from previous tasks. However, meta-learning typically requires a large number of meta-training tasks. Existing methods often generate new tasks by interpolating fine-grained feature points, and such interpolation can compromise the continuity and integrity of the feature representations in the generated tasks. To address this problem, we propose task-level data augmentation to generate additional new tasks. Specifically, we introduce a novel task augmentation method called Task Augmentation via Channel Mixture (TACM). TACM generates new tasks by mixing channels from different tasks. This channel-level mixture preserves the continuity and integrity of feature information in channels during the mixture process, thereby enhancing the generalization ability of the model. Experimental results demonstrate that TACM outperforms other state-of-the-art methods across multiple datasets. Code is available at https://github.com/F-GOD6/TACM.
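The abstract's key distinction is that TACM transplants whole channels between tasks instead of interpolating individual feature points, so each channel's spatial map stays intact. A minimal NumPy sketch of that idea follows; it is illustrative only, not the paper's exact TACM procedure — the function name, the fixed mixing ratio, and the batch layout are assumptions.

```python
import numpy as np

def channel_mixture(feat_a, feat_b, ratio=0.5, seed=None):
    """Mix two tasks' feature maps at channel granularity.

    feat_a, feat_b: arrays of shape (N, C, H, W), one per task.
    A random subset of whole channels is copied from feat_b;
    the remaining channels are kept from feat_a, so no channel's
    spatial structure is ever blended point-by-point.
    """
    assert feat_a.shape == feat_b.shape
    rng = np.random.default_rng(seed)
    num_channels = feat_a.shape[1]
    k = int(round(ratio * num_channels))
    idx = rng.choice(num_channels, size=k, replace=False)
    mixed = feat_a.copy()
    mixed[:, idx] = feat_b[:, idx]  # swap entire channels, keeping them intact
    return mixed

# Example: 4 images, 8 channels, 5x5 feature maps.
a = np.zeros((4, 8, 5, 5))
b = np.ones((4, 8, 5, 5))
m = channel_mixture(a, b, ratio=0.5, seed=0)
# With ratio=0.5, exactly 4 of the 8 channels come from b.
print(int((m.mean(axis=(0, 2, 3)) == 1).sum()))  # → 4
```

Because each selected channel is copied wholesale rather than interpolated, the continuity and integrity of per-channel feature information that the abstract emphasizes is preserved by construction.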
Journal introduction:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.