Exploring multi-granularity balance strategy for class incremental learning via three-way granular computing.

JCR: Q1 (Computer Science)
Yan Xian, Hong Yu, Ye Wang, Guoyin Wang
DOI: 10.1186/s40708-025-00255-0
Journal: Brain Informatics, 12(1):7
Published: 2025-03-17 (Journal Article)
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11914578/pdf/
Citations: 0

Abstract

Class incremental learning (CIL) is a scenario of incremental learning in which a model must continuously learn new classes from a data stream, and it suffers from the challenge of catastrophic forgetting. Inspired by the human hippocampus, CIL methods that replay episodic memory offer a promising solution. However, the limited buffer budget restricts how many old-class samples can be stored, producing an imbalance between new- and old-class samples at each incremental learning stage; this imbalance undermines the mitigation of catastrophic forgetting. We therefore propose MGBCIL, a novel CIL method based on a multi-granularity balance strategy, inspired by three-way granular computing in human problem-solving. To mitigate the adverse effects of imbalance on catastrophic forgetting at the fine-, medium-, and coarse-grained levels during training, MGBCIL introduces dedicated strategies at the batch, task, and decision stages. Specifically, a weighted cross-entropy loss with a smoothing factor is proposed for batch processing; during task updating and classification decisions, contrastive learning with different anchor-point settings promotes local and global separation between new and old classes; and knowledge distillation is used to preserve knowledge of the old classes. Experimental evaluations on the CIFAR-10 and CIFAR-100 datasets show that MGBCIL outperforms other methods in most incremental settings. In particular, when storing 3 exemplars on CIFAR-10 under the Base2 Inc2 setting, average accuracy improves by up to 9.59% and the forgetting rate is reduced by up to 25.45%.
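The abstract names two of MGBCIL's loss components without giving their exact formulations: a weighted cross-entropy with a smoothing factor for the batch stage, and a knowledge-distillation term to preserve old-class knowledge. As a rough, hypothetical sketch of what such terms commonly look like (the per-class weights, smoothing form, and temperature below are assumptions, not the paper's definitions):

```python
import numpy as np

def softmax(logits):
    # numerically stable row-wise softmax
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def weighted_smoothed_ce(logits, labels, class_weights, eps=0.1):
    """Cross-entropy with label smoothing (factor eps) and per-class
    weights, e.g. larger weights for under-represented old classes."""
    n, c = logits.shape
    p = softmax(logits)
    # smoothed targets: mass eps spread uniformly, (1 - eps) on the true class
    t = np.full((n, c), eps / c)
    t[np.arange(n), labels] += 1.0 - eps
    per_sample = -(t * np.log(p + 1e-12)).sum(axis=1)
    return (class_weights[labels] * per_sample).mean()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL divergence between temperature-softened teacher and student
    distributions, scaled by T^2 as in standard distillation."""
    ps = softmax(student_logits / T)
    pt = softmax(teacher_logits / T)
    return (T * T) * (pt * (np.log(pt + 1e-12) - np.log(ps + 1e-12))).sum(axis=1).mean()
```

A training step would then combine these, e.g. `loss = weighted_smoothed_ce(...) + lam * distillation_loss(...)`, where `lam` balances plasticity on new classes against stability on old ones; the paper's actual combination, and its contrastive anchor-point terms, are not specified in the abstract.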

Source journal: Brain Informatics (Computer Science – Computer Science Applications)
CiteScore: 9.50
Self-citation rate: 0.00%
Articles per year: 27
Review time: 13 weeks
Journal description: Brain Informatics is an international, peer-reviewed, interdisciplinary open-access journal published under the SpringerOpen brand, providing a unique platform for researchers and practitioners to disseminate original research on computational and informatics technologies related to the brain. The journal addresses the computational, cognitive, physiological, biological, physical, ecological, and social perspectives of brain informatics. It also welcomes emerging information technologies and advanced neuroimaging technologies, such as big data analytics and interactive knowledge discovery related to large-scale brain studies and their applications. The journal publishes high-quality original research papers, brief reports, and critical reviews across the theoretical, technological, clinical, and interdisciplinary studies that make up brain informatics and its applications in brain-machine intelligence, brain-inspired intelligent systems, mental health, brain disorders, etc. Papers fall into five tracks:
Track 1: Cognitive and Computational Foundations of Brain Science
Track 2: Human Information Processing Systems
Track 3: Brain Big Data Analytics, Curation and Management
Track 4: Informatics Paradigms for Brain and Mental Health Research
Track 5: Brain-Machine Intelligence and Brain-Inspired Computing