Liquid-transformer temporal self-supervised network for few-shot class-incremental fault diagnosis of servo mechanisms

IF 5.6 | CAS Zone 2 (Engineering & Technology) | JCR Q1 ENGINEERING, MULTIDISCIPLINARY
Measurement | Pub Date: 2026-05-05 | Epub Date: 2026-03-07 | DOI: 10.1016/j.measurement.2026.121078
Zeming Zhang , Chuanyang Li , Changhua Hu , Jianhui Hu , Meng Zhao , Mingzhe Leng , Zhaoqiang Wang , Xinyi Wan , Ziyang Zheng
Measurement, Volume 272, Article 121078.
Citations: 0

Abstract

Electric servo mechanisms are critical actuators in aerospace equipment, where emerging fault types continuously appear under varying operating conditions. Few-shot class-incremental diagnosis is therefore constrained by two coupled challenges: highly nonstationary vibration responses and extremely limited labelled samples, both of which weaken temporal representation learning and aggravate catastrophic forgetting in conventional models. To address these issues, a Liquid-Transformer Temporal Self-Supervised Network (LTTSNet) is proposed for continual recognition of both base and emerging fault classes. Its backbone integrates a Liquid Neural Network (LNN) with a Lightweight Transformer (LTransformer), in which the LNN captures local transient dynamics through learnable time constants, whereas the LTransformer models long-range cross-cycle dependencies. In the base stage, a simple framework for contrastive learning of visual representations (SimCLR) is employed to learn invariant representations from scarce unlabelled signals by contrasting augmented views. Pseudo-labels are generated via nearest-neighbour clustering under a cosine-similarity threshold and are jointly trained with labelled samples, thereby introducing latent new-class information before incremental updates. In the incremental stage, attention-weighted prototype estimation and one-step gradient prototype distillation are jointly employed to refine new-class prototypes. The backbone is kept frozen, and only the classifier head is updated, enabling rapid adaptation to new classes while preserving old-class discrimination. Experiments on a laboratory electric servo mechanism fault dataset and the Case Western Reserve University bearing dataset demonstrate that LTTSNet significantly improves overall accuracy, new-class recognition, and forgetting suppression under cross-condition few-shot settings with both single and compound faults.
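Two mechanisms the abstract describes — pseudo-labelling by nearest-neighbour matching under a cosine-similarity threshold, and attention-weighted prototype estimation — can be sketched in a few lines of NumPy. This is a hedged illustration, not the paper's implementation: the function names, the 0.8 threshold, and the softmax attention weighting are assumptions made for demonstration.

```python
import numpy as np

def pseudo_label(unlabelled, prototypes, threshold=0.8):
    """Nearest-neighbour pseudo-labelling under a cosine-similarity
    threshold: each unlabelled embedding takes the label of its most
    similar class prototype, or -1 if no prototype is similar enough.
    The 0.8 threshold is an illustrative value, not the paper's."""
    # L2-normalise rows so that dot products equal cosine similarities
    u = unlabelled / np.linalg.norm(unlabelled, axis=1, keepdims=True)
    p = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    sims = u @ p.T                          # (n_samples, n_classes)
    nearest = sims.argmax(axis=1)           # index of nearest prototype
    confident = sims.max(axis=1) >= threshold
    return np.where(confident, nearest, -1)

def attention_weighted_prototype(embeddings):
    """Attention-weighted prototype estimation: few-shot embeddings are
    averaged with softmax weights derived from each sample's cosine
    similarity to the plain mean, so atypical samples contribute less.
    A sketch of the idea; the paper's exact weighting may differ."""
    e = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    mean = e.mean(axis=0)
    mean /= np.linalg.norm(mean)
    scores = e @ mean                       # similarity of each sample to the mean
    weights = np.exp(scores) / np.exp(scores).sum()   # softmax attention
    proto = weights @ e                     # weighted average embedding
    return proto / np.linalg.norm(proto)
```

With the backbone frozen, only such class prototypes (or an equivalent classifier head) need updating when a new fault class arrives, which is what preserves old-class discrimination during incremental adaptation.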
Source journal: Measurement (Engineering, Multidisciplinary)
CiteScore: 10.20
Self-citation rate: 12.50%
Articles per year: 1589
Review time: 12.1 months
Journal description: Contributions are invited on novel achievements in all fields of measurement and instrumentation science and technology. Authors are encouraged to submit novel material whose ultimate goal is an advancement in the state of the art of: measurement and metrology fundamentals, sensors, measurement instruments, measurement and estimation techniques, measurement data processing and fusion algorithms, evaluation procedures and methodologies for plants and industrial processes, performance analysis of systems, processes and algorithms, mathematical models for measurement-oriented purposes, and distributed measurement systems in a connected world.