Dynamic branch layer fusion: A new continual learning method for rotating machinery fault diagnosis

Impact Factor 7.2 | CAS Zone 1 (Computer Science) | JCR Q1, COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Changqing Shen, Zhenzhong He, Bojian Chen, Weiguo Huang, Lin Li, Dong Wang
{"title":"Dynamic branch layer fusion: A new continual learning method for rotating machinery fault diagnosis","authors":"Changqing Shen ,&nbsp;Zhenzhong He ,&nbsp;Bojian Chen ,&nbsp;Weiguo Huang ,&nbsp;Lin Li ,&nbsp;Dong Wang","doi":"10.1016/j.knosys.2025.113177","DOIUrl":null,"url":null,"abstract":"<div><div>In real-world environments, the critical components of rotating machinery often encounter various new fault types because of complex operating conditions. The replay-based continual learning method in fault diagnosis mitigates catastrophic forgetting associated with the introduction of previous fault samples. However, the retention of previous samples during the training of new tasks creates an imbalance in the distribution of dataset and limits the mitigation of catastrophic forgetting. A new continual learning method based on dynamic branch layer fusion is proposed and applied to the diagnosis scenarios with imbalanced dataset. In particular, the proposed method builds a branch layer for each old task to retain the old knowledge upon the arrival of a new task, then the branch layers fusion structure is designed to solve the problem of model growth. Additionally, a two-stage training process encompassing model adaptation and fusion is proposed. On this basis, integration loss is used to optimize the learning of models for all types across tasks. Finally, the assembly of the old and new models is achieved through distillation loss, enhancing the reliability of models on all tasks. Experimental results indicate that the catastrophic forgetting problem prevalent in imbalanced dataset can be effectively alleviated by the proposed method.</div></div>","PeriodicalId":49939,"journal":{"name":"Knowledge-Based Systems","volume":"313 ","pages":"Article 113177"},"PeriodicalIF":7.2000,"publicationDate":"2025-02-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Knowledge-Based Systems","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0950705125002242","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

In real-world environments, the critical components of rotating machinery often encounter new fault types because of complex operating conditions. Replay-based continual learning methods for fault diagnosis mitigate catastrophic forgetting by reintroducing samples of previously seen faults. However, retaining previous samples while training on new tasks creates an imbalance in the dataset distribution and limits how well catastrophic forgetting can be mitigated. A new continual learning method based on dynamic branch layer fusion is proposed and applied to diagnosis scenarios with imbalanced datasets. In particular, the proposed method builds a branch layer for each old task to retain its knowledge when a new task arrives, and a branch-layer fusion structure is then designed to solve the problem of model growth. Additionally, a two-stage training process encompassing model adaptation and model fusion is proposed. On this basis, an integration loss is used to optimize learning over all fault types across tasks. Finally, the old and new models are assembled through a distillation loss, enhancing the reliability of the model on all tasks. Experimental results indicate that the proposed method effectively alleviates the catastrophic forgetting problem prevalent with imbalanced datasets.
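The abstract does not give implementation details, so the following is only a minimal sketch of the general idea it describes: a shared backbone with one task-specific branch layer per old task, and a second (fusion) training stage in which a distillation term keeps the old-task outputs close to a frozen copy of the previous model. The class and function names (BranchLayerModel, fusion_step), the 1-D CNN backbone, and the loss weighting are illustrative assumptions, not the authors' architecture; the paper's integration loss and branch-layer fusion structure are not reproduced here.

```python
# Illustrative sketch only: backbone, branch-layer form, and loss weighting are assumptions.
import copy

import torch
import torch.nn as nn
import torch.nn.functional as F


class BranchLayerModel(nn.Module):
    """Shared feature extractor plus one task-specific branch layer per learned task."""

    def __init__(self, in_channels: int = 1, feat_dim: int = 64):
        super().__init__()
        self.feat_dim = feat_dim
        # Assumed 1-D CNN backbone for vibration signals.
        self.backbone = nn.Sequential(
            nn.Conv1d(in_channels, 16, kernel_size=15, stride=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=7, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(32, feat_dim), nn.ReLU(),
        )
        self.branches = nn.ModuleList()  # grows by one branch layer per task

    def add_branch(self, num_classes: int) -> None:
        """Add a branch layer for the fault classes of a newly arriving task."""
        self.branches.append(nn.Linear(self.feat_dim, num_classes))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feat = self.backbone(x)
        # Concatenate all branch outputs into one logit vector over every class seen so far.
        return torch.cat([branch(feat) for branch in self.branches], dim=1)


def fusion_step(model, old_model, x, y, temperature=2.0, alpha=0.5):
    """One step of the fusion stage: cross-entropy over all classes plus a distillation
    term that keeps the old-class logits close to the frozen old model's outputs."""
    logits = model(x)
    ce = F.cross_entropy(logits, y)
    with torch.no_grad():
        old_logits = old_model(x)          # frozen snapshot taken before the new task
    n_old = old_logits.size(1)             # number of classes learned in old tasks
    kd = F.kl_div(
        F.log_softmax(logits[:, :n_old] / temperature, dim=1),
        F.softmax(old_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    return (1 - alpha) * ce + alpha * kd


# Usage sketch: freeze a copy before learning task 2, then train with fusion_step.
model = BranchLayerModel()
model.add_branch(num_classes=4)            # task 1: four fault classes
old_model = copy.deepcopy(model).eval()    # frozen old model
model.add_branch(num_classes=3)            # task 2: three new fault classes
x = torch.randn(8, 1, 1024)                # dummy batch of vibration segments
y = torch.randint(0, 7, (8,))              # labels over all seven classes seen so far
loss = fusion_step(model, old_model, x, y)
loss.backward()
```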
Source journal

Knowledge-Based Systems (Engineering & Technology - Computer Science: Artificial Intelligence)
CiteScore: 14.80
Self-citation rate: 12.50%
Articles per year: 1245
Review time: 7.8 months
Journal description: Knowledge-Based Systems, an international and interdisciplinary journal in artificial intelligence, publishes original, innovative, and creative research results in the field. It focuses on knowledge-based and other artificial intelligence techniques-based systems. The journal aims to support human prediction and decision-making through data science and computation techniques, provide a balanced coverage of theory and practical study, and encourage the development and implementation of knowledge-based intelligence models, methods, systems, and software tools. Applications in business, government, education, engineering, and healthcare are emphasized.