{"title":"Multi-Domain Time-Frequency Fusion Feature Contrastive Learning for Machinery Fault Diagnosis","authors":"Yang Wei;Kai Wang","doi":"10.1109/LSP.2025.3548466","DOIUrl":null,"url":null,"abstract":"The scarcity of a large amount of labeled data for adequately training of deep learning models, along with their restricted generalization capabilities, persistently hinders the real-world practical application of data-driven deep learning in few-shot fault diagnosis and transfer task fault diagnosis. This paper proposes a self-supervised Wide Kernel Time-Frequency Fusion (WTFF) contrastive learning method that leverages extensive unlabeled signals to extract discriminative time-frequency fusion features, thereby enhancing fault diagnosis performance even with a limited number of labeled samples. Moreover, the WTFF integrates a multi-layer time-frequency wide convolutional neural network (TFCNN) encoder with a novel local and global time-frequency contrastive loss (LGTFCL) to capture time frequency consistency by facilitating the alignment of time-domain and frequency-domain feature embeddings across the shallow and deep network layers. In the fine-tuning phase, time frequency features across various levels learned from transferred pretrained model are fused to extract signal characteristics that exhibit both time and frequency discrimination. The proposed method demonstrates superior diagnostic accuracy and robustness in experiments involving few-shot and transfer learning-based fault diagnosis.","PeriodicalId":13154,"journal":{"name":"IEEE Signal Processing Letters","volume":"32 ","pages":"1116-1120"},"PeriodicalIF":3.2000,"publicationDate":"2025-03-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Signal Processing Letters","FirstCategoryId":"5","ListUrlMain":"https://ieeexplore.ieee.org/document/10910177/","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
引用次数: 0
Abstract
The scarcity of labeled data for adequately training deep learning models, together with their limited generalization capability, continues to hinder the practical application of data-driven deep learning in few-shot and transfer-task fault diagnosis. This paper proposes a self-supervised Wide Kernel Time-Frequency Fusion (WTFF) contrastive learning method that leverages abundant unlabeled signals to extract discriminative time-frequency fusion features, thereby enhancing fault diagnosis performance even with a limited number of labeled samples. Specifically, the WTFF integrates a multi-layer time-frequency wide convolutional neural network (TFCNN) encoder with a novel local and global time-frequency contrastive loss (LGTFCL) that captures time-frequency consistency by aligning time-domain and frequency-domain feature embeddings across the shallow and deep network layers. In the fine-tuning phase, time-frequency features at multiple levels learned by the transferred pretrained model are fused to extract signal characteristics that are discriminative in both the time and frequency domains. The proposed method demonstrates superior diagnostic accuracy and robustness in experiments on few-shot and transfer learning-based fault diagnosis.
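To make the core idea of aligning time-domain and frequency-domain embeddings concrete, the following is a minimal PyTorch sketch, not the authors' implementation: it assumes a wide-kernel 1-D CNN encoder applied to a raw vibration segment and to its magnitude spectrum, with an NT-Xent-style contrastive loss applied at a shallow and a deep layer. All module names (WideKernelEncoder, nt_xent), kernel sizes, and the projection dimension are illustrative assumptions.

```python
# Sketch of time-frequency contrastive alignment (assumed architecture, not the paper's code):
# one encoder for the raw time-domain signal, one for its magnitude spectrum,
# with a contrastive loss pulling the two views of each sample together
# at both a shallow and a deep layer.
import torch
import torch.nn as nn
import torch.nn.functional as F


class WideKernelEncoder(nn.Module):
    """1-D CNN with a wide first kernel; returns shallow and deep embeddings."""

    def __init__(self, in_ch=1, dim=128):
        super().__init__()
        self.shallow = nn.Sequential(
            nn.Conv1d(in_ch, 16, kernel_size=64, stride=8, padding=28),
            nn.BatchNorm1d(16), nn.ReLU(),
        )
        self.deep = nn.Sequential(
            nn.Conv1d(16, 32, kernel_size=3, padding=1), nn.BatchNorm1d(32), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=3, padding=1), nn.BatchNorm1d(64), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.proj_shallow = nn.Linear(16, dim)
        self.proj_deep = nn.Linear(64, dim)

    def forward(self, x):
        s = self.shallow(x)                      # (B, 16, L')
        d = self.deep(s).squeeze(-1)             # (B, 64)
        z_shallow = self.proj_shallow(s.mean(dim=-1))  # shallow embedding (B, dim)
        z_deep = self.proj_deep(d)                     # deep embedding (B, dim)
        return z_shallow, z_deep


def nt_xent(za, zb, temperature=0.2):
    """Contrastive loss: za[i] and zb[i] are positives, all other pairs are negatives."""
    za, zb = F.normalize(za, dim=1), F.normalize(zb, dim=1)
    logits = za @ zb.t() / temperature           # (B, B) cosine-similarity matrix
    targets = torch.arange(za.size(0), device=za.device)
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))


# Usage sketch: align time- and frequency-domain embeddings layer by layer.
time_enc, freq_enc = WideKernelEncoder(), WideKernelEncoder()
x_time = torch.randn(8, 1, 2048)                           # batch of raw vibration segments
x_freq = torch.fft.rfft(x_time, dim=-1).abs()[..., :1024]  # magnitude spectrum as the frequency view

zs_t, zd_t = time_enc(x_time)
zs_f, zd_f = freq_enc(x_freq)
loss = nt_xent(zs_t, zs_f) + nt_xent(zd_t, zd_f)  # shallow ("local") + deep ("global") alignment
loss.backward()
```

In this reading, the shallow-layer term plays the role of a local alignment and the deep-layer term a global one; the paper's LGTFCL may weight or structure these terms differently.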
About the Journal
The IEEE Signal Processing Letters is a monthly, archival publication designed to provide rapid dissemination of original, cutting-edge ideas and timely, significant contributions in signal, image, speech, language and audio processing. Papers published in the Letters can be presented within one year of their appearance at signal processing conferences such as ICASSP, GlobalSIP and ICIP, as well as at several workshops organized by the Signal Processing Society.