{"title":"A Class-Added Continual Learning Method for Motor Fault Diagnosis Based on Knowledge Distillation of Representation Proximity Behavior","authors":"Ao Ding, Yong Qin, Biao Wang, L. Jia","doi":"10.1109/ICPHM57936.2023.10193966","DOIUrl":null,"url":null,"abstract":"Continual learning is promising in intelligent motor fault diagnosis because it enables networks to increase diagnosable fault classes without time-consuming retraining during new fault happening. However, the traditional continual learning based on knowledge distillation keeps the absolute positions of samples in representation spaces to prevent catastrophic forgetting, which limits new fault samples to embedding into representation spaces flexibly. To address this issue, a continual learning method based on a novel knowledge distillation strategy is proposed for motor fault diagnosis. At incremental stages of continual learning, new and old diagnosis networks are first regarded as the teacher and student networks. Then, the improved distillation strategy is designed to guide knowledge transfer from teacher networks to student networks, meanwhile, student networks learn from the new fault samples. Finally, new diagnosis networks are obtained which can diagnose incremental fault classes. For the improved knowledge distillation strategy, knowledge is inherited by maintaining the proximity behavior of samples in the representation spaces, thereby networks can learn to map samples into representation spaces more flexibly. Through a study case of class-added fault diagnosis of motors, it is proved that the proposed method can improve diagnostic accuracy during continual learning.","PeriodicalId":169274,"journal":{"name":"2023 IEEE International Conference on Prognostics and Health Management (ICPHM)","volume":"17 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-06-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 IEEE International Conference on Prognostics and Health Management (ICPHM)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICPHM57936.2023.10193966","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Continual learning is promising for intelligent motor fault diagnosis because it enables networks to expand the set of diagnosable fault classes without time-consuming retraining when new faults occur. However, traditional continual learning based on knowledge distillation preserves the absolute positions of samples in the representation space to prevent catastrophic forgetting, which keeps new fault samples from being embedded into the representation space flexibly. To address this issue, a continual learning method based on a novel knowledge distillation strategy is proposed for motor fault diagnosis. At each incremental stage of continual learning, the old and new diagnosis networks are first regarded as the teacher and student networks, respectively. Then, the improved distillation strategy guides knowledge transfer from the teacher network to the student network while the student network learns from the new fault samples. Finally, a new diagnosis network is obtained that can diagnose the incremental fault classes. In the improved knowledge distillation strategy, knowledge is inherited by maintaining the proximity behavior of samples in the representation space rather than their absolute positions, so the network can learn to map samples into the representation space more flexibly. A case study on class-added motor fault diagnosis demonstrates that the proposed method improves diagnostic accuracy during continual learning.
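The proximity-behavior distillation described above resembles relational knowledge distillation: the student is penalized for distorting the teacher's pairwise sample geometry rather than for moving individual embeddings. The paper's exact loss is not given in this abstract, so the following is a minimal PyTorch sketch under stated assumptions: a generic pairwise-distance matching term is combined with cross-entropy on the new classes, the models are assumed to return `(logits, features)`, and the weighting factor `lam` and all function names are hypothetical.

```python
import torch
import torch.nn.functional as F

def pairwise_distance_matrix(features: torch.Tensor) -> torch.Tensor:
    """Mean-normalized pairwise Euclidean distances within a batch of embeddings.

    Normalizing by the mean distance makes the relational target invariant to
    the overall scale of the representation space.
    """
    d = torch.cdist(features, features, p=2)   # (B, B) distance matrix
    mean = d[d > 0].mean()                     # assumes batch size > 1
    return d / (mean + 1e-8)

def proximity_distillation_loss(student_feats: torch.Tensor,
                                teacher_feats: torch.Tensor) -> torch.Tensor:
    """Match the *relative* geometry (proximity behavior) of the teacher's
    representation space instead of its absolute sample positions."""
    with torch.no_grad():
        t_rel = pairwise_distance_matrix(teacher_feats)
    s_rel = pairwise_distance_matrix(student_feats)
    return F.smooth_l1_loss(s_rel, t_rel)

def incremental_step(student, teacher, x, y, optimizer, lam=1.0):
    """One training step at an incremental stage: the frozen old network acts
    as the teacher, the new network as the student (hypothetical helper)."""
    teacher.eval()
    logits, s_feats = student(x)               # assumed model interface
    with torch.no_grad():
        _, t_feats = teacher(x)
    loss = F.cross_entropy(logits, y) + lam * proximity_distillation_loss(s_feats, t_feats)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because only relative distances are constrained, the student is free to translate, rotate, or rescale the embedding space to make room for new fault classes, which is the flexibility the abstract contrasts with absolute-position distillation.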