Jointly Optimized Classifiers for Few-Shot Class-Incremental Learning

IF 5.3 · JCR Q1 (Computer Science, Artificial Intelligence) · CAS Tier 3 (Computer Science)
Sichao Fu;Qinmu Peng;Xiaorui Wang;Yang He;Wenhao Qiu;Bin Zou;Duanquan Xu;Xiao-Yuan Jing;Xinge You
DOI: 10.1109/TETCI.2024.3375509
Journal: IEEE Transactions on Emerging Topics in Computational Intelligence, vol. 8, no. 5, pp. 3316-3326
Published: 2024-03-20
Citations: 0

Abstract

Few-shot class-incremental learning (FSCIL) has recently aroused widespread research interest. It aims to continually learn new class knowledge from a few labeled samples without forgetting previously learned concepts. One typical approach is graph-based FSCIL (GFSCIL), which tends to design ever more complex message-passing schemes to sharpen the classifiers' decision boundaries. However, this often yields poor extrapolation ability, because the effectiveness of the trained feature backbone and the learned topology structure is not considered. In this paper, we propose a simple and effective GFSCIL framework to solve this problem, termed Jointly Optimized Classifiers (JOC). Specifically, a simple multi-task training module combines a classification loss and an auxiliary task loss to jointly supervise the feature backbone trained on the base classes. In this way, JOC effectively improves the robustness of the trained feature backbone without relying on extra datasets or more complex backbones. To avoid overfitting on new classes and forgetting old-class knowledge, a decoupled learning strategy is adopted: the feature backbone parameters are fixed and only the classifier parameters are optimized for the new classes. Finally, a spatial-channel graph attention network is designed to simultaneously preserve the global and local similarity relationships between all classes, improving the generalization performance of the classifiers. To demonstrate the effectiveness of the proposed method, extensive experiments were conducted on three popular datasets. Experimental results show that the proposed JOC outperforms many existing state-of-the-art FSCIL methods.
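The decoupled learning strategy described in the abstract can be sketched as follows. This is an illustrative toy (names and values are invented, not the authors' code): base-session training updates both the feature backbone and the classifier, while each few-shot incremental session freezes the backbone and optimizes only the classifier parameters.

```python
class Param:
    """A scalar parameter with a freeze flag."""
    def __init__(self, value):
        self.value = value
        self.frozen = False

def sgd_step(params, grads, lr=0.1):
    # Update only parameters that are not frozen.
    for p, g in zip(params, grads):
        if not p.frozen:
            p.value -= lr * g

backbone = [Param(1.0), Param(-0.5)]   # stands in for backbone weights
classifier = [Param(0.2)]              # stands in for classifier weights

# Base session: classification + auxiliary losses jointly supervise everything.
sgd_step(backbone + classifier, grads=[0.3, -0.1, 0.4])

# Incremental session: fix the backbone, optimize only the classifier,
# guarding against new-class overfitting and old-class forgetting.
for p in backbone:
    p.frozen = True
sgd_step(backbone + classifier, grads=[0.3, -0.1, 0.4])
```

In a real deep-learning implementation the freeze would typically be expressed by disabling gradients on the backbone (e.g. `requires_grad = False` in PyTorch); this sketch only shows the control flow of the decoupled update.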
Source journal: IEEE Transactions on Emerging Topics in Computational Intelligence
CiteScore: 10.30 · Self-citation rate: 7.50% · Articles per year: 147
Journal description: The IEEE Transactions on Emerging Topics in Computational Intelligence (TETCI) publishes original articles on emerging aspects of computational intelligence, including theory, applications, and surveys. TETCI is an electronic-only publication and publishes six issues per year. Authors are encouraged to submit manuscripts on any emerging topic in computational intelligence, especially nature-inspired computing topics not covered by other IEEE Computational Intelligence Society journals. Illustrative examples include glial cell networks, computational neuroscience, brain-computer interfaces, ambient intelligence, non-fuzzy computing with words, artificial life, cultural learning, artificial endocrine networks, social reasoning, artificial hormone networks, and computational intelligence for the IoT and Smart-X technologies.