Title: A pruning-aware dynamic slimmable network using meta-gradients for high-speed train bogie bearing fault diagnosis

Authors: Jingsong Xie, Sha Cao, Tongyang Pan, Tiantian Wang, Jinsong Yang, Jinglong Chen

Journal: ISA Transactions

Published: 2025-03-04

DOI: 10.1016/j.isatra.2025.02.031
Citations: 0
Abstract
Although intelligent fault diagnosis has achieved remarkable results, computational efficiency is a commonly ignored problem in existing studies. Pruning neural networks enables us to find compact models that not only retain diagnostic accuracy but also consume fewer computational resources for training and inference. However, current studies are inefficient at building a saliency criterion for parameter importance evaluation. In this paper, we propose a pruning-aware dynamic slimmable network that uses meta-gradients to select unnecessary parameters for pruning. The slimmable network is designed with two sub-networks, called the classifier and the evaluator, which together generate the meta-gradients used for parameter pruning. An iterative pruning algorithm is further proposed to improve computational efficiency while retaining diagnostic performance. Our method is verified on a high-precision bogie fault simulation experimental dataset and achieves state-of-the-art performance in terms of accuracy and efficiency compared with existing studies.
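The abstract does not specify how the meta-gradients are computed or how the iterative pruning schedule is set, so the following is only a minimal NumPy sketch of the general idea it describes: score each parameter with a first-order gradient-based saliency (here the common |w · g| Taylor score stands in for the paper's meta-gradient criterion), and remove low-saliency parameters gradually over several iterations rather than in one shot. All function names (`saliency_prune`, `iterative_prune`, `grad_fn`) are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

def saliency_prune(weights, grads, prune_frac):
    """Zero out the prune_frac fraction of weights with the lowest
    saliency |w * g| (a first-order Taylor importance score)."""
    saliency = np.abs(weights * grads)
    k = int(prune_frac * weights.size)
    pruned = weights.copy()
    if k == 0:
        return pruned
    # indices of the k least-salient parameters (unordered k smallest)
    idx = np.argpartition(saliency.ravel(), k)[:k]
    pruned.ravel()[idx] = 0.0
    return pruned

def iterative_prune(weights, grad_fn, target_sparsity, steps):
    """Prune gradually: at each step recompute gradients (a stand-in
    for regenerating meta-gradients) and drop a fixed fraction of the
    *remaining* nonzero weights, reaching target_sparsity overall."""
    w = weights.copy()
    # per-step fraction so that (1 - f)^steps == 1 - target_sparsity
    per_step = 1.0 - (1.0 - target_sparsity) ** (1.0 / steps)
    for _ in range(steps):
        g = grad_fn(w)
        mask = w != 0
        # already-pruned entries get infinite saliency so they are ignored
        sal = np.where(mask, np.abs(w * g), np.inf)
        k = int(per_step * mask.sum())
        if k == 0:
            continue
        idx = np.argpartition(sal.ravel(), k)[:k]
        w.ravel()[idx] = 0.0
    return w
```

The iterative variant mirrors the abstract's motivation: pruning in small steps with fresh importance scores tends to preserve accuracy better than one-shot pruning at the same final sparsity, at the cost of extra score computations.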