{"title":"基于层次知识蒸馏的多任务负载识别与信号去噪","authors":"Jiahao Jiang;Zhelong Wang;Sen Qiu;Xiang Li;Chenming Zhang","doi":"10.1109/TNSE.2025.3542409","DOIUrl":null,"url":null,"abstract":"Complex neural networks with deep structures are beneficial for solving problems such as load classification in Non-intrusive load monitoring (NILM) due to their powerful feature extraction capabilities. Unfortunately, corresponding complex models designed based on deep learning algorithms require high computational and memory resources. Additionally, the external noise interference during practical load identification poses a challenge. To solve these difficulties with practical industrial significance, this paper proposes a multi-task-knowledge distillation (MTL-KD) framework for NILM. The main contributions within this framework include a new feature extraction method that combines variational mode extraction (VME) and mutual information (MI) to extract unique features and filter out noise interference, an attention-based MTL model to simultaneously perform the load identification and signal de-noising tasks, and new KD modules to transfer knowledge from a complex teacher model to a small student model. Experimental evaluations conducted on public datasets such as the plug-load appliance identification dataset (PLAID) and the worldwide household and industry transient energy dataset (WHITED), as well as a private load dataset collected in the lab, demonstrate that the proposed MTL-KD framework surpasses state-of-the-art approaches.","PeriodicalId":54229,"journal":{"name":"IEEE Transactions on Network Science and Engineering","volume":"12 3","pages":"1967-1980"},"PeriodicalIF":6.7000,"publicationDate":"2025-02-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Multi-Task Load Identification and Signal Denoising via Hierarchical Knowledge Distillation\",\"authors\":\"Jiahao Jiang;Zhelong Wang;Sen Qiu;Xiang Li;Chenming Zhang\",\"doi\":\"10.1109/TNSE.2025.3542409\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Complex neural networks with deep structures are beneficial for solving problems such as load classification in Non-intrusive load monitoring (NILM) due to their powerful feature extraction capabilities. Unfortunately, corresponding complex models designed based on deep learning algorithms require high computational and memory resources. Additionally, the external noise interference during practical load identification poses a challenge. To solve these difficulties with practical industrial significance, this paper proposes a multi-task-knowledge distillation (MTL-KD) framework for NILM. The main contributions within this framework include a new feature extraction method that combines variational mode extraction (VME) and mutual information (MI) to extract unique features and filter out noise interference, an attention-based MTL model to simultaneously perform the load identification and signal de-noising tasks, and new KD modules to transfer knowledge from a complex teacher model to a small student model. 
Experimental evaluations conducted on public datasets such as the plug-load appliance identification dataset (PLAID) and the worldwide household and industry transient energy dataset (WHITED), as well as a private load dataset collected in the lab, demonstrate that the proposed MTL-KD framework surpasses state-of-the-art approaches.\",\"PeriodicalId\":54229,\"journal\":{\"name\":\"IEEE Transactions on Network Science and Engineering\",\"volume\":\"12 3\",\"pages\":\"1967-1980\"},\"PeriodicalIF\":6.7000,\"publicationDate\":\"2025-02-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Network Science and Engineering\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10891007/\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Network Science and Engineering","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10891007/","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, MULTIDISCIPLINARY","Score":null,"Total":0}
Multi-Task Load Identification and Signal Denoising via Hierarchical Knowledge Distillation
Abstract: Complex neural networks with deep structures are beneficial for solving problems such as load classification in non-intrusive load monitoring (NILM) because of their powerful feature extraction capabilities. Unfortunately, such complex deep-learning models demand high computational and memory resources. In addition, external noise interference during practical load identification poses a further challenge. To address these difficulties, which carry practical industrial significance, this paper proposes a multi-task learning and knowledge distillation (MTL-KD) framework for NILM. The main contributions within this framework include a new feature extraction method that combines variational mode extraction (VME) and mutual information (MI) to extract unique features and filter out noise interference; an attention-based MTL model that simultaneously performs the load identification and signal denoising tasks; and new KD modules that transfer knowledge from a complex teacher model to a compact student model. Experimental evaluations conducted on public datasets, namely the plug-load appliance identification dataset (PLAID) and the worldwide household and industry transient energy dataset (WHITED), as well as on a private load dataset collected in the lab, demonstrate that the proposed MTL-KD framework surpasses state-of-the-art approaches.
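The abstract combines two ideas that can be illustrated concretely: a multi-task model with a shared encoder serving both load identification and signal denoising, and knowledge distillation from a large teacher to a compact student via softened logits. Below is a minimal, hypothetical PyTorch sketch of such a training objective. The module names (MultiTaskNet, mtl_kd_loss), network widths, temperature, and loss weights are illustrative assumptions and are not taken from the paper; the actual MTL-KD architecture, attention mechanism, and hierarchical KD modules are described in the article itself.

# Hypothetical sketch of a multi-task + knowledge-distillation objective:
# a compact student with a shared 1-D conv encoder, a classification head
# for load identification, and a regression head for signal denoising,
# trained against a larger teacher via temperature-softened logits.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiTaskNet(nn.Module):
    """Shared encoder with two task heads (identification + denoising)."""

    def __init__(self, n_classes: int, width: int):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(1, width, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(width, width, kernel_size=7, padding=3),
            nn.ReLU(),
        )
        self.cls_head = nn.Sequential(
            nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(width, n_classes)
        )
        # Denoising head maps encoder features back to a cleaned waveform.
        self.denoise_head = nn.Conv1d(width, 1, kernel_size=7, padding=3)

    def forward(self, x):
        h = self.encoder(x)  # (B, width, L)
        return self.cls_head(h), self.denoise_head(h)


def mtl_kd_loss(student_logits, teacher_logits, denoised, clean, labels,
                temperature=4.0, alpha=0.5, beta=1.0):
    """Hard-label CE + softened-logit KD (KL divergence) + denoising MSE."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    mse = F.mse_loss(denoised, clean)
    return (1 - alpha) * ce + alpha * kd + beta * mse


if __name__ == "__main__":
    B, L, n_classes = 8, 512, 11  # illustrative batch, window length, class count
    teacher = MultiTaskNet(n_classes, width=64)  # larger "teacher"
    student = MultiTaskNet(n_classes, width=16)  # compact "student"

    noisy = torch.randn(B, 1, L)   # noisy load-signal window (toy data)
    clean = torch.randn(B, 1, L)   # reference clean signal (toy data)
    labels = torch.randint(0, n_classes, (B,))

    with torch.no_grad():
        t_logits, _ = teacher(noisy)
    s_logits, s_denoised = student(noisy)
    loss = mtl_kd_loss(s_logits, t_logits, s_denoised, clean, labels)
    loss.backward()
    print(f"combined loss: {loss.item():.4f}")

The temperature-scaled KL term is the standard softened-logit distillation formulation; scaling by temperature squared keeps its gradient magnitude comparable to the cross-entropy term, and the beta-weighted MSE term couples the denoising task to the same shared encoder.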
Journal description:
The IEEE Transactions on Network Science and Engineering (TNSE) is committed to the timely publication of peer-reviewed technical articles that deal with the theory and applications of network science and the interconnections among the elements in a system that form a network. In particular, the journal publishes articles on understanding, prediction, and control of the structures and behaviors of networks at the fundamental level. The types of networks covered include physical or engineered networks, information networks, biological networks, semantic networks, economic networks, social networks, and ecological networks. The journal aims to discover common principles that govern network structures, functionalities, and behaviors. Another trans-disciplinary focus of TNSE is the interactions between, and co-evolution of, different genres of networks.