{"title":"A Lightweight Transformer Edge Intelligence Model for RUL Prediction Classification.","authors":"Lilu Wang, Yongqi Li, Haiyuan Liu, Taihui Liu","doi":"10.3390/s25134224","DOIUrl":null,"url":null,"abstract":"<p><p>Remaining Useful Life (RUL) prediction is a crucial task in predictive maintenance. Currently, gated recurrent networks, hybrid models, and attention-enhanced models used for predictive maintenance face the challenge of balancing prediction accuracy and model lightweighting when extracting complex degradation features. This limitation hinders their deployment on resource-constrained edge devices. To address this issue, we propose TBiGNet, a lightweight Transformer-based classification network model for RUL prediction. TBiGNet features an encoder-decoder architecture that outperforms traditional Transformer models by achieving over 15% higher accuracy while reducing computational load, memory access, and parameter size by more than 98%. In the encoder, we optimize the attention mechanism by integrating the individual linear mappings of queries, keys, and values into an efficient operation, reducing memory access overhead by 60%. Additionally, an adaptive feature pruning module is introduced to dynamically select critical features based on their importance, reducing redundancy and enhancing model accuracy by 6%. The decoder innovatively fuses two different types of features and leverages BiGRU to compensate for the limitations of the attention mechanism in capturing degradation features, resulting in a 7% accuracy improvement. Extensive experiments on the C-MAPSS dataset demonstrate that TBiGNet surpasses existing methods in terms of computational accuracy, model size, and memory access, showcasing significant technical advantages and application potential. Experiments on the C-MPASS dataset show that TBiGNet is superior to the existing methods in terms of calculation accuracy, model size and throughput, showing significant technical advantages and application potential.</p>","PeriodicalId":21698,"journal":{"name":"Sensors","volume":"25 13","pages":""},"PeriodicalIF":3.4000,"publicationDate":"2025-07-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12252478/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Sensors","FirstCategoryId":"103","ListUrlMain":"https://doi.org/10.3390/s25134224","RegionNum":3,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"CHEMISTRY, ANALYTICAL","Score":null,"Total":0}
Abstract
Remaining Useful Life (RUL) prediction is a crucial task in predictive maintenance. Gated recurrent networks, hybrid models, and attention-enhanced models currently used for predictive maintenance struggle to balance prediction accuracy against model lightweighting when extracting complex degradation features, which hinders their deployment on resource-constrained edge devices. To address this issue, we propose TBiGNet, a lightweight Transformer-based classification network for RUL prediction. TBiGNet features an encoder-decoder architecture that outperforms traditional Transformer models, achieving over 15% higher accuracy while reducing computational load, memory access, and parameter size by more than 98%. In the encoder, we optimize the attention mechanism by fusing the separate linear mappings of queries, keys, and values into a single efficient operation, cutting memory-access overhead by 60%. An adaptive feature pruning module is also introduced to dynamically select critical features by importance, reducing redundancy and improving accuracy by 6%. The decoder fuses two different types of features and leverages a BiGRU to compensate for the attention mechanism's limitations in capturing degradation features, yielding a further 7% accuracy improvement. Extensive experiments on the C-MAPSS dataset demonstrate that TBiGNet surpasses existing methods in accuracy, model size, and memory access, showing significant technical advantages and application potential.
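The abstract names three mechanisms: a fused query/key/value projection in the encoder attention, an importance-based adaptive feature pruning module, and a BiGRU decoder head. The paper's code is not reproduced here, so the following is a minimal PyTorch sketch of how such components are commonly built; all class names, dimensions, and the top-k `keep_ratio` pruning rule are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class FusedQKVAttention(nn.Module):
    """Self-attention with one fused linear projection for Q, K, and V.

    Replacing three separate nn.Linear layers with a single wide layer reads
    the input activation from memory once instead of three times -- the kind
    of memory-access saving the abstract attributes to the encoder.
    """

    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)  # fused Q/K/V projection
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)      # one matmul, three views
        # reshape each to (batch, heads, time, d_head)
        q, k, v = (z.view(b, t, self.n_heads, self.d_head).transpose(1, 2)
                   for z in (q, k, v))
        attn = F.softmax(q @ k.transpose(-2, -1) / self.d_head ** 0.5, dim=-1)
        y = (attn @ v).transpose(1, 2).reshape(b, t, d)
        return self.out(y)


class FeaturePruning(nn.Module):
    """Importance-gated feature selection (illustrative top-k rule)."""

    def __init__(self, d_model: int, keep_ratio: float = 0.5):
        super().__init__()
        self.score = nn.Linear(d_model, 1)          # learned importance score
        self.keep_ratio = keep_ratio

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, d = x.shape
        k = max(1, int(t * self.keep_ratio))
        s = self.score(x).squeeze(-1)               # (batch, time) importances
        idx = s.topk(k, dim=1).indices.sort(dim=1).values
        # keep the k highest-scoring time steps, preserving temporal order
        return x.gather(1, idx.unsqueeze(-1).expand(b, k, d))


class BiGRUDecoder(nn.Module):
    """Bidirectional GRU classification head, per the abstract's decoder."""

    def __init__(self, d_model: int, n_classes: int):
        super().__init__()
        self.gru = nn.GRU(d_model, d_model, batch_first=True,
                          bidirectional=True)
        self.head = nn.Linear(2 * d_model, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h, _ = self.gru(x)
        return self.head(h[:, -1])                  # classify from last step


if __name__ == "__main__":
    x = torch.randn(8, 30, 64)                      # (batch, window, features)
    z = FeaturePruning(64)(FusedQKVAttention(64, 4)(x))
    print(BiGRUDecoder(64, n_classes=5)(z).shape)   # torch.Size([8, 5])
```

Fusing the three projections into one `nn.Linear` trades nothing in expressiveness (the weight matrices are simply concatenated) for fewer memory reads and kernel launches, which is the usual source of the overhead reduction described above.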
Journal Introduction:
Sensors (ISSN 1424-8220) provides an advanced forum for the science and technology of sensors and biosensors. It publishes reviews (including comprehensive reviews of complete sensor products), regular research papers, and short notes. Our aim is to encourage scientists to publish their experimental and theoretical results in as much detail as possible. There is no restriction on the length of the papers. Full experimental details must be provided so that the results can be reproduced.