ALRS: An Attention Loss Function based on Row-Sparsity for Incremental Learning

Yingying Xia, Bo Lu, Jianmin Ji
{"title":"ALRS: An Attention Loss Function based on Row-Sparsity for Incremental Learning","authors":"Yingying Xia, Bo Lu, Jianmin Ji","doi":"10.1109/ICCCS57501.2023.10151382","DOIUrl":null,"url":null,"abstract":"Incremental learning has received significant attention, but the problem of catastrophic forgetting remains a major challenge for existing approaches. This issue hinders models from accumulating knowledge over long stretches. To address this problem, we propose a new approach called Attention Loss function based on Row-Sparsity (ALRS) that mines significant patches by simultaneously learning patch weights and logits (class vectors) using the same parameters. We integrate the attention mechanism with the novel loss function to avoid catastrophic forgetting. This innovative approach enables the model to conflate the newly introduced classes with the existing ones, without the need to store any data or models from the previous steps' base classes. To assess its efficacy, we incorporate ALRS into the distillation loss for validation and conduct a thorough evaluation of the approach's performance on three datasets: CIFAR-100, Caltech-101, and CUBS-200-2011. Compared to LWM, which also does not store data, our method achieves an average improvement of more than 8 percentage points with an absolute advantage on CIFAR-100. Additionally, on Caltech-101 and CUBS-200-2011, our new approach provides comparable accuracy to baseline.","PeriodicalId":266168,"journal":{"name":"2023 8th International Conference on Computer and Communication Systems (ICCCS)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-04-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 8th International Conference on Computer and Communication Systems (ICCCS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCCS57501.2023.10151382","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Incremental learning has received significant attention, but the problem of catastrophic forgetting remains a major challenge for existing approaches. This issue prevents models from accumulating knowledge over long stretches of tasks. To address this problem, we propose a new approach called the Attention Loss function based on Row-Sparsity (ALRS), which mines significant patches by simultaneously learning patch weights and logits (class vectors) with the same parameters. We integrate the attention mechanism with this novel loss function to avoid catastrophic forgetting. The approach enables the model to integrate newly introduced classes with the existing ones without storing any data or models for the base classes from previous steps. To assess its efficacy, we incorporate ALRS into the distillation loss for validation and conduct a thorough evaluation of the approach's performance on three datasets: CIFAR-100, Caltech-101, and CUBS-200-2011. Compared to LWM, which likewise stores no data, our method achieves an average improvement of more than 8 percentage points and a consistent advantage on CIFAR-100. On Caltech-101 and CUBS-200-2011, our approach provides accuracy comparable to the baselines.
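The abstract does not give the exact form of the row-sparsity attention loss, but the core idea it describes, selecting significant patches by jointly learning patch weights and class logits while a distillation term preserves base-class knowledge, can be illustrated with a minimal sketch. The PyTorch code below is an assumption-laden illustration, not the authors' implementation: the L2,1-style row-sparsity penalty, the helper names (`row_sparsity_penalty`, `alrs_objective`), and the weighting hyperparameters (`lambda_sparse`, `lambda_distill`) are hypothetical choices made for clarity.

```python
import torch
import torch.nn.functional as F

def row_sparsity_penalty(patch_weights: torch.Tensor) -> torch.Tensor:
    # Hypothetical L2,1-style penalty: sum of L2 norms over the rows of the
    # patch-weight matrix. Whole rows are pushed toward zero, so only a few
    # "significant" patches keep non-zero attention weight.
    return patch_weights.norm(p=2, dim=1).sum()

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    # Standard soft-target distillation between the old model (teacher) and
    # the updated model (student), used to retain base-class knowledge
    # without storing old data.
    t = temperature
    return F.kl_div(
        F.log_softmax(student_logits / t, dim=1),
        F.softmax(teacher_logits / t, dim=1),
        reduction="batchmean",
    ) * (t * t)

def alrs_objective(student_logits: torch.Tensor,
                   teacher_logits: torch.Tensor,
                   targets: torch.Tensor,
                   patch_weights: torch.Tensor,
                   lambda_sparse: float = 0.01,
                   lambda_distill: float = 1.0) -> torch.Tensor:
    # Assumed combination: cross-entropy on the new classes, distillation on
    # the old-class logits, and the row-sparsity attention penalty.
    ce = F.cross_entropy(student_logits, targets)
    kd = distillation_loss(student_logits[:, :teacher_logits.size(1)],
                           teacher_logits)
    rs = row_sparsity_penalty(patch_weights)
    return ce + lambda_distill * kd + lambda_sparse * rs
```

The row-wise norm is what makes the penalty a "row-sparsity" term: it zeroes out entire rows of the weight matrix, so patches are kept or discarded as units, which is one plausible reading of the abstract's claim that ALRS "mines significant patches" while the same parameters also produce the logits.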