{"title":"一种快速正则化方法增强两阶段分类器的少次类增量学习","authors":"Meilan Hao , Yizhan Gu , Kejian Dong , Prayag Tiwari , Xiaoqing Lv , Xin Ning","doi":"10.1016/j.neunet.2025.107453","DOIUrl":null,"url":null,"abstract":"<div><div>With a limited number of labeled samples, Few-Shot Class-Incremental Learning (FSCIL) seeks to efficiently train and update models without forgetting previously learned tasks. Because pre-trained models can learn extensive feature representations from big existing datasets, they offer strong knowledge foundations and transferability, which makes them useful in both few-shot and incremental learning scenarios. Additionally, Prompt Learning improves pre-trained deep learning models’ performance on downstream tasks, particularly in large-scale language or vision models. In this paper, we propose a novel Prompt Regularization (PrRe) approach to maximize the fusion of prompts by embedding two different prompts, the Task Prompt and the Global Prompt, inside a pre-trained Vision Transformer (ViT). In the classification phase, we propose a Two-Stage Classifier (TSC), utilizing K-Nearest Neighbors for base session and a Prototype Classifier for incremental sessions, integrated with a global self-attention module. Through experiments on multiple benchmark tests, we demonstrate the effectiveness and superiority of our method. The code is available at <span><span>https://github.com/gyzzzzzzzz/PrRe</span><svg><path></path></svg></span>.</div></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"188 ","pages":"Article 107453"},"PeriodicalIF":6.0000,"publicationDate":"2025-04-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A prompt regularization approach to enhance few-shot class-incremental learning with Two-Stage Classifier\",\"authors\":\"Meilan Hao , Yizhan Gu , Kejian Dong , Prayag Tiwari , Xiaoqing Lv , Xin Ning\",\"doi\":\"10.1016/j.neunet.2025.107453\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>With a limited number of labeled samples, Few-Shot Class-Incremental Learning (FSCIL) seeks to efficiently train and update models without forgetting previously learned tasks. Because pre-trained models can learn extensive feature representations from big existing datasets, they offer strong knowledge foundations and transferability, which makes them useful in both few-shot and incremental learning scenarios. Additionally, Prompt Learning improves pre-trained deep learning models’ performance on downstream tasks, particularly in large-scale language or vision models. In this paper, we propose a novel Prompt Regularization (PrRe) approach to maximize the fusion of prompts by embedding two different prompts, the Task Prompt and the Global Prompt, inside a pre-trained Vision Transformer (ViT). In the classification phase, we propose a Two-Stage Classifier (TSC), utilizing K-Nearest Neighbors for base session and a Prototype Classifier for incremental sessions, integrated with a global self-attention module. Through experiments on multiple benchmark tests, we demonstrate the effectiveness and superiority of our method. 
The code is available at <span><span>https://github.com/gyzzzzzzzz/PrRe</span><svg><path></path></svg></span>.</div></div>\",\"PeriodicalId\":49763,\"journal\":{\"name\":\"Neural Networks\",\"volume\":\"188 \",\"pages\":\"Article 107453\"},\"PeriodicalIF\":6.0000,\"publicationDate\":\"2025-04-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neural Networks\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0893608025003326\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608025003326","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
A prompt regularization approach to enhance few-shot class-incremental learning with Two-Stage Classifier
With a limited number of labeled samples, Few-Shot Class-Incremental Learning (FSCIL) seeks to efficiently train and update models without forgetting previously learned tasks. Because pre-trained models can learn extensive feature representations from large existing datasets, they offer strong knowledge foundations and transferability, which makes them useful in both few-shot and incremental learning scenarios. Additionally, Prompt Learning improves the performance of pre-trained deep learning models on downstream tasks, particularly in large-scale language or vision models. In this paper, we propose a novel Prompt Regularization (PrRe) approach to maximize the fusion of prompts by embedding two different prompts, the Task Prompt and the Global Prompt, inside a pre-trained Vision Transformer (ViT). In the classification phase, we propose a Two-Stage Classifier (TSC), utilizing K-Nearest Neighbors for the base session and a Prototype Classifier for incremental sessions, integrated with a global self-attention module. Through experiments on multiple benchmarks, we demonstrate the effectiveness and superiority of our method. The code is available at https://github.com/gyzzzzzzzz/PrRe.
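To make the two-stage routing idea concrete, below is a minimal sketch of how a K-Nearest Neighbors stage for base-session classes and a prototype stage for incremental-session classes could operate over features already extracted by a (prompted) ViT backbone. This is an illustration based only on the abstract, not the authors' released implementation (see the GitHub link above); the function names, the global self-attention module being omitted, and the toy data are all assumptions.

```python
# Hypothetical sketch of a two-stage classifier: KNN over stored base-session
# features, and nearest-class-prototype matching for incremental classes.
# Assumes a frozen backbone has already produced 512-d feature vectors.
import torch
import torch.nn.functional as F


def knn_predict(query, base_feats, base_labels, k=5):
    """Stage 1: majority vote among the k nearest base-session features."""
    sims = F.normalize(query, dim=-1) @ F.normalize(base_feats, dim=-1).T
    topk = sims.topk(k, dim=-1).indices          # (B, k) neighbor indices
    votes = base_labels[topk]                    # (B, k) neighbor labels
    return votes.mode(dim=-1).values             # per-query majority label


def prototype_predict(query, prototypes, proto_labels):
    """Stage 2: nearest class prototype for incremental-session classes."""
    sims = F.normalize(query, dim=-1) @ F.normalize(prototypes, dim=-1).T
    return proto_labels[sims.argmax(dim=-1)]


# Toy usage: 10 base classes with 100 stored features; 5 new classes, 5 shots each.
torch.manual_seed(0)
base_feats = torch.randn(100, 512)
base_labels = torch.randint(0, 10, (100,))
new_feats = torch.randn(25, 512)
new_labels = torch.arange(10, 15).repeat_interleave(5)

# A prototype is the mean feature of each incremental class's few shots.
prototypes = torch.stack([new_feats[new_labels == c].mean(0) for c in range(10, 15)])
proto_labels = torch.arange(10, 15)

query = torch.randn(4, 512)
print(knn_predict(query, base_feats, base_labels))         # base-session route
print(prototype_predict(query, prototypes, proto_labels))  # incremental route
```

The split reflects the data regime: the base session has enough samples for instance-level KNN to be reliable, while each incremental class contributes only a few shots, for which a class-mean prototype is a more stable decision rule.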
Journal introduction:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.