Lightweight Class Incremental Semantic Segmentation Without Catastrophic Forgetting

Impact Factor: 13.7
Wei Cong;Yang Cong;Yu Ren
{"title":"没有灾难性遗忘的轻量级类增量语义分割。","authors":"Wei Cong;Yang Cong;Yu Ren","doi":"10.1109/TIP.2025.3588065","DOIUrl":null,"url":null,"abstract":"Class incremental semantic segmentation (CISS) aims to progressively segment newly introduced classes while preserving the memory of previously learned ones. Traditional CISS methods directly employ advanced semantic segmentation models (e.g., Deeplab-v3) as continual learners. However, these methods require substantial computational and memory resources, limiting their deployment on edge devices. In this paper, we propose a Lightweight Class Incremental Semantic Segmentation (LISS) model tailored for resource-constrained scenarios. Specifically, we design an automatic knowledge-preservation pruning strategy based on the Hilbert-Schmidt Independence Criterion (HSIC) Lasso, which automatically compresses the CISS model by searching for global penalty coefficients. Nonetheless, reducing model parameters exacerbates catastrophic forgetting during incremental learning. To mitigate this challenge, we develop a clustering-based pseudo labels generator to obtain high-quality pseudo labels by considering the feature space structure of old classes. It adjusts predicted probabilities from the old model according to the feature proximity to nearest sub-cluster centers for each class. Additionally, we introduce a customized soft labels module that distills the semantic relationships between classes separately. It decomposes soft labels into target probabilities, background probabilities, and other probabilities, thereby maintaining knowledge of previously learned classes in a fine-grained manner. Extensive experiments on two benchmark datasets demonstrate that our LISS model outperforms state-of-the-art approaches in both effectiveness and efficiency.","PeriodicalId":94032,"journal":{"name":"IEEE transactions on image processing : a publication of the IEEE Signal Processing Society","volume":"34 ","pages":"4566-4579"},"PeriodicalIF":13.7000,"publicationDate":"2025-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Lightweight Class Incremental Semantic Segmentation Without Catastrophic Forgetting\",\"authors\":\"Wei Cong;Yang Cong;Yu Ren\",\"doi\":\"10.1109/TIP.2025.3588065\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Class incremental semantic segmentation (CISS) aims to progressively segment newly introduced classes while preserving the memory of previously learned ones. Traditional CISS methods directly employ advanced semantic segmentation models (e.g., Deeplab-v3) as continual learners. However, these methods require substantial computational and memory resources, limiting their deployment on edge devices. In this paper, we propose a Lightweight Class Incremental Semantic Segmentation (LISS) model tailored for resource-constrained scenarios. Specifically, we design an automatic knowledge-preservation pruning strategy based on the Hilbert-Schmidt Independence Criterion (HSIC) Lasso, which automatically compresses the CISS model by searching for global penalty coefficients. Nonetheless, reducing model parameters exacerbates catastrophic forgetting during incremental learning. To mitigate this challenge, we develop a clustering-based pseudo labels generator to obtain high-quality pseudo labels by considering the feature space structure of old classes. 
It adjusts predicted probabilities from the old model according to the feature proximity to nearest sub-cluster centers for each class. Additionally, we introduce a customized soft labels module that distills the semantic relationships between classes separately. It decomposes soft labels into target probabilities, background probabilities, and other probabilities, thereby maintaining knowledge of previously learned classes in a fine-grained manner. Extensive experiments on two benchmark datasets demonstrate that our LISS model outperforms state-of-the-art approaches in both effectiveness and efficiency.\",\"PeriodicalId\":94032,\"journal\":{\"name\":\"IEEE transactions on image processing : a publication of the IEEE Signal Processing Society\",\"volume\":\"34 \",\"pages\":\"4566-4579\"},\"PeriodicalIF\":13.7000,\"publicationDate\":\"2025-07-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE transactions on image processing : a publication of the IEEE Signal Processing Society\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/11082484/\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on image processing : a publication of the IEEE Signal Processing Society","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/11082484/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Class incremental semantic segmentation (CISS) aims to progressively segment newly introduced classes while preserving the memory of previously learned ones. Traditional CISS methods directly employ advanced semantic segmentation models (e.g., Deeplab-v3) as continual learners. However, these methods require substantial computational and memory resources, limiting their deployment on edge devices. In this paper, we propose a Lightweight Class Incremental Semantic Segmentation (LISS) model tailored for resource-constrained scenarios. Specifically, we design an automatic knowledge-preservation pruning strategy based on the Hilbert-Schmidt Independence Criterion (HSIC) Lasso, which automatically compresses the CISS model by searching for global penalty coefficients. Nonetheless, reducing model parameters exacerbates catastrophic forgetting during incremental learning. To mitigate this challenge, we develop a clustering-based pseudo labels generator to obtain high-quality pseudo labels by considering the feature space structure of old classes. It adjusts predicted probabilities from the old model according to the feature proximity to nearest sub-cluster centers for each class. Additionally, we introduce a customized soft labels module that distills the semantic relationships between classes separately. It decomposes soft labels into target probabilities, background probabilities, and other probabilities, thereby maintaining knowledge of previously learned classes in a fine-grained manner. Extensive experiments on two benchmark datasets demonstrate that our LISS model outperforms state-of-the-art approaches in both effectiveness and efficiency.
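To make the pruning step concrete, here is a minimal HSIC-Lasso channel-scoring sketch in Python (NumPy/scikit-learn). All names, the Gaussian kernel choice, and the fixed penalty `lam` are illustrative assumptions; the paper's strategy additionally searches for global penalty coefficients automatically, which is not reproduced here. Channels whose nonnegative Lasso weight is driven to zero are candidates for pruning.

```python
import numpy as np
from sklearn.linear_model import Lasso

def gaussian_gram(x, sigma=1.0):
    """Centered Gaussian-kernel Gram matrix of a 1-D variable over n samples."""
    d2 = (x[:, None] - x[None, :]) ** 2
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return H @ K @ H

def hsic_lasso_scores(feats, target, lam=0.01):
    """feats: (n_samples, n_channels) pooled channel activations; target: (n_samples,) labels.
    Solves the nonnegative Lasso  1/2 ||Lbar - sum_k beta_k Kbar_k||^2 + lam ||beta||_1
    and returns one nonnegative weight per channel."""
    n, c = feats.shape
    Lbar = gaussian_gram(target.astype(float)).ravel()  # vectorized response kernel
    X = np.stack([gaussian_gram(feats[:, k]).ravel() for k in range(c)], axis=1)
    model = Lasso(alpha=lam, positive=True, fit_intercept=False)
    model.fit(X, Lbar)
    return model.coef_

# Usage: keep only channels with a nonzero HSIC-Lasso weight.
rng = np.random.default_rng(0)
feats = rng.normal(size=(64, 16))      # stand-in for pooled feature maps
labels = rng.integers(0, 3, size=64)   # stand-in class labels
beta = hsic_lasso_scores(feats, labels)
print("channels kept:", np.flatnonzero(beta > 0))
```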
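The clustering-based pseudo-label generator can be sketched in the same spirit: fit several sub-clusters per old class in feature space, then scale the old model's class probabilities by proximity to the nearest sub-cluster center of each class. The exponential proximity rule and all variable names below are our assumptions for illustration, not the paper's exact equation.

```python
import numpy as np
from sklearn.cluster import KMeans

def sub_cluster_centers(features, labels, n_classes, k=3, seed=0):
    """Fit k sub-cluster centers per old class in feature space."""
    centers = {}
    for c in range(n_classes):
        f = features[labels == c]
        if len(f) >= k:  # skip classes with too few samples
            centers[c] = KMeans(n_clusters=k, n_init=10,
                                random_state=seed).fit(f).cluster_centers_
    return centers

def adjust_probs(probs, features, centers, tau=1.0):
    """Re-weight each class probability by an exponential proximity term based on
    the distance to that class's nearest sub-cluster center, then renormalize."""
    n, n_classes = probs.shape
    dist = np.full((n, n_classes), np.inf)
    for c, ctr in centers.items():
        d = np.linalg.norm(features[:, None, :] - ctr[None, :, :], axis=-1)  # (n, k)
        dist[:, c] = d.min(axis=1)  # distance to nearest sub-cluster of class c
    prox = np.exp(-dist / tau)      # larger when the feature is close
    adjusted = probs * prox
    return adjusted / adjusted.sum(axis=1, keepdims=True)

# Usage with random stand-ins for old-model features and probabilities:
rng = np.random.default_rng(0)
feats = rng.normal(size=(200, 8))
old_probs = rng.dirichlet(np.ones(4), size=200)
centers = sub_cluster_centers(feats, old_probs.argmax(axis=1), n_classes=4)
pseudo_labels = adjust_probs(old_probs, feats, centers).argmax(axis=1)
```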
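Finally, the customized soft-labels module can be illustrated with a short PyTorch sketch that splits the teacher distribution into target, background, and remaining-class parts and matches each part separately. The background index, temperature, loss weights, and the particular BCE/KL matching are hypothetical choices; in practice such a term would be added to the usual training loss on new classes.

```python
import torch
import torch.nn.functional as F

def decomposed_soft_label_loss(student_logits, teacher_logits, target, T=2.0,
                               bg_index=0, w_tgt=1.0, w_bg=1.0, w_other=1.0):
    """student_logits, teacher_logits: (N, C); target: (N,) pseudo labels."""
    p_s = F.softmax(student_logits / T, dim=1)
    p_t = F.softmax(teacher_logits.detach() / T, dim=1)  # teacher is frozen
    idx = torch.arange(len(target))

    # Target and background probabilities are matched one-to-one.
    tgt_loss = F.binary_cross_entropy(p_s[idx, target], p_t[idx, target])
    bg_loss = F.binary_cross_entropy(p_s[:, bg_index], p_t[:, bg_index])

    # "Other" probabilities: mask out target and background, renormalize, match with KL.
    mask = torch.ones_like(p_s, dtype=torch.bool)
    mask[idx, target] = False
    mask[:, bg_index] = False
    q_s = (p_s * mask).clamp_min(1e-8)
    q_t = (p_t * mask).clamp_min(1e-8)
    q_s = q_s / q_s.sum(dim=1, keepdim=True)
    q_t = q_t / q_t.sum(dim=1, keepdim=True)
    other_loss = F.kl_div(q_s.log(), q_t, reduction="batchmean")

    return w_tgt * tgt_loss + w_bg * bg_loss + w_other * other_loss

# Usage with random stand-ins:
s = torch.randn(8, 21, requires_grad=True)  # e.g., 20 classes + background
t = torch.randn(8, 21)
y = torch.randint(1, 21, (8,))
decomposed_soft_label_loss(s, t, y).backward()
```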