Title: Meta-Learning With Task-Adaptive Selection
Authors: Quan Wan; Maofa Wang; Weifeng Shan; Bin Wang; Lu Zhang; Zhixiong Leng; Bingchen Yan; Yanlin Xu; Huiling Chen
DOI: 10.1109/TCSVT.2025.3557706
Journal: IEEE Transactions on Circuits and Systems for Video Technology, vol. 35, no. 9, pp. 8627-8638
Publication date: 2025-04-03

Abstract:
Gradient-based meta-learning algorithms learn meta-parameters from a pool of tasks. Starting from these meta-parameters, a model can reach strong performance on a new task through fast fine-tuning with only a few gradient-descent updates. The two-level meta-learning approach that shares initialization parameters has achieved good results on few-shot learning problems. However, when multiple similar tasks are trained in the inner loop, the difficulty and expected benefit of each task are routinely overlooked, so tasks conflict with one another and push the model toward an unintended compromise. This paper therefore proposes a task-adaptive selection meta-learning algorithm, TSML. Specifically, we construct a task-selection trainer that assesses the difficulty of each task and estimates its future benefit, then uses these two quantities to design a better training strategy for each task, replace the usual multi-task compromise, and balance each task's influence on the meta-parameters. In addition, the outer-loop meta-parameter update of traditional meta-learning is adjusted so that the meta-parameters reach a better position, which rapidly improves their generalization and convergence on unseen tasks. Experimental results show a 2.1% improvement over the base model with a 4-conv backbone, and the gain grows as the network becomes more complex, reaching 4.1% with ResNet-12.
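The abstract describes weighting each inner-loop task by its difficulty and expected benefit before the outer meta-update. As a rough illustration of that general idea (not the authors' TSML implementation), the sketch below assumes a first-order MAML-style loop in PyTorch, where difficulty is proxied by the pre-adaptation query loss and benefit by the loss drop after adaptation; the function names, the softmax weighting rule, and the three-step inner loop are all illustrative assumptions.

```python
# A minimal, first-order sketch of task-adaptive weighting in a MAML-style
# loop. This is NOT the paper's TSML code: the difficulty/benefit proxies,
# the softmax weighting rule, and all names here are illustrative assumptions.
import copy

import torch
import torch.nn as nn


def adapt(model, loss_fn, x_sup, y_sup, inner_lr=0.01, steps=3):
    """Clone the meta-model and take a few inner-loop SGD steps on one task."""
    task_model = copy.deepcopy(model)
    opt = torch.optim.SGD(task_model.parameters(), lr=inner_lr)
    for _ in range(steps):
        opt.zero_grad()
        loss_fn(task_model(x_sup), y_sup).backward()
        opt.step()
    return task_model


def meta_step(model, meta_opt, loss_fn, task_batch, inner_lr=0.01):
    """One outer update; each task's gradient is scaled by a benefit-based weight."""
    pre_losses, post_losses, adapted = [], [], []
    for x_sup, y_sup, x_qry, y_qry in task_batch:
        with torch.no_grad():
            pre_losses.append(loss_fn(model(x_qry), y_qry))   # difficulty proxy
        task_model = adapt(model, loss_fn, x_sup, y_sup, inner_lr)
        with torch.no_grad():
            post_losses.append(loss_fn(task_model(x_qry), y_qry))
        adapted.append((task_model, x_qry, y_qry))

    # Benefit proxy: how much the query loss dropped after adaptation.
    benefit = torch.stack(pre_losses) - torch.stack(post_losses)
    weights = torch.softmax(benefit, dim=0)                   # task-adaptive weights

    meta_opt.zero_grad()
    for w, (task_model, x_qry, y_qry) in zip(weights, adapted):
        task_loss = loss_fn(task_model(x_qry), y_qry)
        grads = torch.autograd.grad(task_loss, list(task_model.parameters()))
        # First-order approximation: accumulate scaled task gradients on the meta-model.
        for p, g in zip(model.parameters(), grads):
            p.grad = w * g if p.grad is None else p.grad + w * g
    meta_opt.step()


if __name__ == "__main__":
    # Tiny smoke test on random 5-way classification tasks.
    model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 5))
    meta_opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    tasks = [(torch.randn(10, 16), torch.randint(0, 5, (10,)),
              torch.randn(10, 16), torch.randint(0, 5, (10,))) for _ in range(4)]
    meta_step(model, meta_opt, loss_fn, tasks)
```

In this sketch the weighting is applied purely at the gradient-aggregation step; the paper's task-selection trainer presumably learns its difficulty and benefit estimates rather than deriving them from raw losses, so the details above should be read as a placeholder for that component.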
Journal Introduction:
The IEEE Transactions on Circuits and Systems for Video Technology (TCSVT) is dedicated to covering all aspects of video technologies from a circuits and systems perspective. We encourage submissions of general, theoretical, and application-oriented papers related to image and video acquisition, representation, presentation, and display. Additionally, we welcome contributions in areas such as processing, filtering, and transforms; analysis and synthesis; learning and understanding; compression, transmission, communication, and networking; as well as storage, retrieval, indexing, and search. Furthermore, papers focusing on hardware and software design and implementation are highly valued. Join us in advancing the field of video technology through innovative research and insights.