Boundary-aware Prototype Augmentation and Dual-level Knowledge Distillation for Non-Exemplar Class-Incremental Hashing

IF 7.2 | CAS Region 1 (Computer Science) | JCR Q1 (Computer Science, Artificial Intelligence)
Qinghang Su, Dayan Wu, Bo Li
Knowledge-Based Systems, Volume 318, Article 113520. DOI: 10.1016/j.knosys.2025.113520. Published 2025-04-14.
https://www.sciencedirect.com/science/article/pii/S0950705125005660
Citations: 0

Abstract

Deep hashing methods are extensively applied in image retrieval for their efficiency and low storage demands. Recently, deep incremental hashing methods have addressed the challenge of adapting to new classes in non-stationary environments, while ensuring compatibility with existing classes. However, most of these methods require old-class samples for joint training to resist catastrophic forgetting, which is not always feasible due to privacy concerns. This constraint underscores the need for Non-Exemplar Class-Incremental Hashing (NECIH) approaches, designed to retain knowledge without storing old-class samples. In NECIH methodologies, hash prototypes are commonly employed to maintain the discriminability of hash codes. However, these prototypes often fail to represent old-class distribution accurately, causing confusion between old and new classes. Furthermore, traditional instance-level knowledge distillation techniques are insufficient for efficiently transferring the structural information inherent in the feature space. To tackle these challenges, we introduce a novel deep incremental hashing approach called Boundary-aware Prototype Augmentation and Dual-level Knowledge Distillation for NECIH (PADIH). PADIH comprises three key components: the Prototype-based Code Learning (PCL) module, the Boundary-aware Prototype Augmentation (BPA) module, and the Dual-level Knowledge Distillation (DKD) module. Specifically, the PCL module learns discriminative hash codes for new classes, while the BPA module augments the old-class prototypes into pseudo codes, with an emphasis on the distribution boundaries. Moreover, the DKD module integrates both instance-level and relation-level knowledge distillation to facilitate the transfer of comprehensive information between models. Extensive experiments conducted on four benchmarks across twelve incremental learning situations demonstrate the superior performance of PADIH.
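The two ideas at the core of the abstract — augmenting old-class prototypes into pseudo representations, and distilling knowledge at both the instance level and the relation level — can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the plain Gaussian perturbation (the paper's BPA module additionally emphasizes distribution boundaries), and the `alpha` weighting are all illustrative assumptions.

```python
import numpy as np

def augment_prototype(prototype, class_std, n=4, rng=None):
    """Generate pseudo features around an old-class prototype by Gaussian
    perturbation. This is a common NECIH simplification; the paper's
    boundary-aware scheme further emphasizes samples near the class boundary."""
    rng = rng or np.random.default_rng(0)
    return prototype + class_std * rng.standard_normal((n, prototype.shape[0]))

def instance_level_kd(old_feats, new_feats):
    """Instance-level distillation: mean squared error between each
    sample's embedding under the old and the new model."""
    return float(np.mean((old_feats - new_feats) ** 2))

def relation_level_kd(old_feats, new_feats):
    """Relation-level distillation: match the pairwise cosine-similarity
    matrices of the two models, transferring structural information
    about the feature space rather than individual embeddings."""
    def cos_sim(f):
        f = f / np.linalg.norm(f, axis=1, keepdims=True)
        return f @ f.T
    return float(np.mean((cos_sim(old_feats) - cos_sim(new_feats)) ** 2))

def dual_level_kd(old_feats, new_feats, alpha=0.5):
    # Weighted combination of the two levels (alpha is an assumed hyperparameter).
    return (alpha * instance_level_kd(old_feats, new_feats)
            + (1 - alpha) * relation_level_kd(old_feats, new_feats))

rng = np.random.default_rng(0)
feats = rng.standard_normal((8, 64))
# Identical old/new features give zero distillation loss at both levels.
print(dual_level_kd(feats, feats.copy()))  # 0.0
```

The relation-level term is what distinguishes this from traditional instance-level distillation: even if individual embeddings drift, the loss penalizes changes in how samples relate to one another, which is the structural information the abstract says instance-level distillation alone fails to transfer.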
Source journal: Knowledge-Based Systems (Engineering/Technology – Computer Science: Artificial Intelligence)
CiteScore: 14.80 | Self-citation rate: 12.50% | Articles per year: 1245 | Review time: 7.8 months
Journal description: Knowledge-Based Systems, an international and interdisciplinary journal in artificial intelligence, publishes original, innovative, and creative research results in the field. It focuses on knowledge-based and other artificial intelligence techniques-based systems. The journal aims to support human prediction and decision-making through data science and computation techniques, provide a balanced coverage of theory and practical study, and encourage the development and implementation of knowledge-based intelligence models, methods, systems, and software tools. Applications in business, government, education, engineering, and healthcare are emphasized.