{"title":"A Novel Deep Generative Model via Semantic-Based Knowledge Distillation for Zero-Shot Learning","authors":"Xianglin Bao;Xiaofeng Xu;Ruiheng Zhang;Lei Zhu","doi":"10.1109/LSP.2025.3585822","DOIUrl":null,"url":null,"abstract":"Zero-Shot Learning (ZSL) aims to identify unseen target classes that lack training data. Most existing methods address the ZSL problem by generating samples of unseen classes based on the training data of seen classes and the semantic representations of unseen classes. However, due to the inherent limitations of ZSL, the generated unseen samples tend to be biased towards the data of seen classes, resulting in a label shift problem in the model’s projection domain. To address these issues, we propose a novel generation-based ZSL approach that incorporates semantic-based constraints and knowledge distillation. Specifically, the semantic regularization and preservation constraints are designed to improve the distribution and discriminability of the generated unseen data, respectively. Furthermore, the semantic-based knowledge distillation strategy is introduced to enhance the generative model’s feature encoding ability, thereby improving the quality of the generated unseen data. Extensive experiments on two standard ZSL benchmark datasets demonstrate that the proposed model achieves superior performance on both traditional and generalized ZSL tasks.","PeriodicalId":13154,"journal":{"name":"IEEE Signal Processing Letters","volume":"32 ","pages":"2704-2708"},"PeriodicalIF":3.2000,"publicationDate":"2025-07-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Signal Processing Letters","FirstCategoryId":"5","ListUrlMain":"https://ieeexplore.ieee.org/document/11068117/","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0
Abstract
Zero-Shot Learning (ZSL) aims to identify unseen target classes that lack training data. Most existing methods address the ZSL problem by generating samples of unseen classes based on the training data of seen classes and the semantic representations of unseen classes. However, due to the inherent limitations of ZSL, the generated unseen samples tend to be biased towards the data of seen classes, resulting in a label shift problem in the model’s projection domain. To address these issues, we propose a novel generation-based ZSL approach that incorporates semantic-based constraints and knowledge distillation. Specifically, the semantic regularization and preservation constraints are designed to improve the distribution and discriminability of the generated unseen data, respectively. Furthermore, the semantic-based knowledge distillation strategy is introduced to enhance the generative model’s feature encoding ability, thereby improving the quality of the generated unseen data. Extensive experiments on two standard ZSL benchmark datasets demonstrate that the proposed model achieves superior performance on both traditional and generalized ZSL tasks.
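To make the abstract's components concrete, the sketch below shows one way a semantic-conditioned feature generator, a semantic-regression (preservation-style) constraint, and a teacher-student distillation term could be combined in PyTorch. This is a minimal illustration under assumptions, not the authors' implementation: the class names (Generator, SemanticDecoder), the distillation formulation, and all dimensions are hypothetical.

```python
# Illustrative sketch only: a conditional feature generator for generative ZSL,
# with (i) a semantic-regression constraint on generated features and
# (ii) a KL-based distillation term toward a frozen teacher encoder.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Generator(nn.Module):
    """Maps a noise vector and a class-semantic vector to a visual feature."""
    def __init__(self, noise_dim, sem_dim, feat_dim, hidden=1024):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim + sem_dim, hidden),
            nn.LeakyReLU(0.2),
            nn.Linear(hidden, feat_dim),
            nn.ReLU(),  # CNN features (e.g. ResNet activations) are non-negative
        )

    def forward(self, z, s):
        return self.net(torch.cat([z, s], dim=1))


class SemanticDecoder(nn.Module):
    """Regresses a visual feature back to its semantic vector; penalizing the
    regression error encourages generated features to preserve class semantics."""
    def __init__(self, feat_dim, sem_dim, hidden=512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, hidden),
            nn.LeakyReLU(0.2),
            nn.Linear(hidden, sem_dim),
        )

    def forward(self, x):
        return self.net(x)


def distillation_loss(student_feat, teacher_feat, T=2.0):
    """Soft-target KL divergence between student and frozen teacher encodings."""
    p_teacher = F.softmax(teacher_feat / T, dim=1)
    log_p_student = F.log_softmax(student_feat / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)


# --- toy training step with hypothetical dimensions -------------------------
noise_dim, sem_dim, feat_dim, batch = 64, 85, 2048, 32
G = Generator(noise_dim, sem_dim, feat_dim)
dec = SemanticDecoder(feat_dim, sem_dim)
teacher = nn.Linear(feat_dim, feat_dim).eval()   # stand-in for a pretrained teacher encoder
opt = torch.optim.Adam(list(G.parameters()) + list(dec.parameters()), lr=1e-4)

real_feat = torch.rand(batch, feat_dim)          # seen-class visual features (placeholder data)
sem = torch.rand(batch, sem_dim)                 # their class-semantic vectors (placeholder data)
z = torch.randn(batch, noise_dim)

fake_feat = G(z, sem)
loss_sem = F.mse_loss(dec(fake_feat), sem)       # semantic-preservation-style constraint
with torch.no_grad():
    teacher_enc = teacher(real_feat)             # teacher encoding of real seen-class features
loss_kd = distillation_loss(fake_feat, teacher_enc)  # distillation toward the teacher encoding
loss = loss_sem + loss_kd

opt.zero_grad()
loss.backward()
opt.step()
```

In a full generative ZSL pipeline, terms like these would typically be added to an adversarial or VAE objective on seen-class data, and the trained generator would then synthesize unseen-class features for training a standard classifier.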
Journal description:
The IEEE Signal Processing Letters is a monthly, archival publication designed to provide rapid dissemination of original, cutting-edge ideas and timely, significant contributions in signal, image, speech, language, and audio processing. Papers published in the Letters can be presented within one year of their appearance at signal processing conferences such as ICASSP, GlobalSIP, and ICIP, as well as at several workshops organized by the Signal Processing Society.