A Cooperative Convolutional Neural Network Framework for Multisensor Fault Diagnosis of Rotating Machinery

IF 4.3 | CAS Zone 2 (Comprehensive Journals) | JCR Q1, ENGINEERING, ELECTRICAL & ELECTRONIC
Tianzhuang Yu;Zeyu Jiang;Zhaohui Ren;Yongchao Zhang;Shihua Zhou;Xin Zhou
{"title":"用于旋转机械多传感器故障诊断的合作卷积神经网络框架","authors":"Tianzhuang Yu;Zeyu Jiang;Zhaohui Ren;Yongchao Zhang;Shihua Zhou;Xin Zhou","doi":"10.1109/JSEN.2024.3468631","DOIUrl":null,"url":null,"abstract":"Multisensor data fusion techniques and advanced convolutional neural network (CNN) have contributed significantly to the development of intelligent fault diagnosis. However, few studies consider the information interactions between different sensor data, which limits the performance of diagnosis frameworks. This article introduces the novel convolution concept and the cross attention mechanism, proposing a cross attention fusion CNN (CAFCNN) diagnostic framework to improve the multisensor collaborative diagnostic technique. Specifically, a global correlation matrix is first developed to encode signals as images, highlighting the correlations between different points in the time-series data. Then, an attention mechanism called global spatial (GS) attention is proposed for extracting positional and spatial information in images. Finally, the developed interactive fusion module (IFM) utilizes cross attention to achieve information interaction of features from different sensors. The created gear dataset and the publicly available bearing dataset validate the effectiveness and generalization of the proposed methods. Moreover, the information interaction capability of CAFCNN is explained by visualizing the features.","PeriodicalId":447,"journal":{"name":"IEEE Sensors Journal","volume":"24 22","pages":"38309-38317"},"PeriodicalIF":4.3000,"publicationDate":"2024-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A Cooperative Convolutional Neural Network Framework for Multisensor Fault Diagnosis of Rotating Machinery\",\"authors\":\"Tianzhuang Yu;Zeyu Jiang;Zhaohui Ren;Yongchao Zhang;Shihua Zhou;Xin Zhou\",\"doi\":\"10.1109/JSEN.2024.3468631\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Multisensor data fusion techniques and advanced convolutional neural network (CNN) have contributed significantly to the development of intelligent fault diagnosis. However, few studies consider the information interactions between different sensor data, which limits the performance of diagnosis frameworks. This article introduces the novel convolution concept and the cross attention mechanism, proposing a cross attention fusion CNN (CAFCNN) diagnostic framework to improve the multisensor collaborative diagnostic technique. Specifically, a global correlation matrix is first developed to encode signals as images, highlighting the correlations between different points in the time-series data. Then, an attention mechanism called global spatial (GS) attention is proposed for extracting positional and spatial information in images. Finally, the developed interactive fusion module (IFM) utilizes cross attention to achieve information interaction of features from different sensors. The created gear dataset and the publicly available bearing dataset validate the effectiveness and generalization of the proposed methods. 
Moreover, the information interaction capability of CAFCNN is explained by visualizing the features.\",\"PeriodicalId\":447,\"journal\":{\"name\":\"IEEE Sensors Journal\",\"volume\":\"24 22\",\"pages\":\"38309-38317\"},\"PeriodicalIF\":4.3000,\"publicationDate\":\"2024-10-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Sensors Journal\",\"FirstCategoryId\":\"103\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10704564/\",\"RegionNum\":2,\"RegionCategory\":\"综合性期刊\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Sensors Journal","FirstCategoryId":"103","ListUrlMain":"https://ieeexplore.ieee.org/document/10704564/","RegionNum":2,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0

Abstract

Multisensor data fusion techniques and advanced convolutional neural network (CNN) have contributed significantly to the development of intelligent fault diagnosis. However, few studies consider the information interactions between different sensor data, which limits the performance of diagnosis frameworks. This article introduces the novel convolution concept and the cross attention mechanism, proposing a cross attention fusion CNN (CAFCNN) diagnostic framework to improve the multisensor collaborative diagnostic technique. Specifically, a global correlation matrix is first developed to encode signals as images, highlighting the correlations between different points in the time-series data. Then, an attention mechanism called global spatial (GS) attention is proposed for extracting positional and spatial information in images. Finally, the developed interactive fusion module (IFM) utilizes cross attention to achieve information interaction of features from different sensors. The created gear dataset and the publicly available bearing dataset validate the effectiveness and generalization of the proposed methods. Moreover, the information interaction capability of CAFCNN is explained by visualizing the features.
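To make the pipeline described in the abstract more concrete, here is a minimal Python sketch of one plausible signal-to-image encoding: a 1-D vibration segment is mapped to an N x N matrix whose entries reflect pairwise relations between time points. The abstract does not state the exact definition of the global correlation matrix, so the normalized outer product and the function name correlation_image below are illustrative assumptions, not the paper's formula.

```python
import numpy as np

def correlation_image(segment: np.ndarray) -> np.ndarray:
    """Encode a 1-D signal segment of length N as an N x N "image".

    Illustrative stand-in only: the paper's global correlation matrix is
    not defined in the abstract, so a zero-mean, unit-norm outer product
    is used here as one plausible pairwise-correlation encoding.
    """
    x = segment - segment.mean()         # remove the DC offset
    x = x / (np.linalg.norm(x) + 1e-12)  # guard against an all-zero segment
    return np.outer(x, x)                # entry (i, j) = x[i] * x[j]

# Example: a noisy 50 Hz sine segment becomes a 256 x 256 single-channel image.
t = np.linspace(0.0, 1.0, 256)
segment = np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.randn(256)
image = correlation_image(segment)
print(image.shape)  # (256, 256)
```

The interactive fusion module (IFM) is described only as using cross attention between sensors, so the PyTorch sketch below is likewise a hypothetical stand-in: each sensor's feature sequence queries the other sensor's features, and a residual connection keeps each branch's own information. The class name CrossSensorFusion, the token shapes, and the residual wiring are assumptions, not the published architecture.

```python
import torch
import torch.nn as nn

class CrossSensorFusion(nn.Module):
    """Minimal cross-attention fusion of feature sequences from two sensors."""

    def __init__(self, dim: int = 128, heads: int = 4):
        super().__init__()
        self.attn_a = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.attn_b = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, feat_a: torch.Tensor, feat_b: torch.Tensor):
        # feat_a, feat_b: (batch, tokens, dim) features from two sensors
        a2b, _ = self.attn_a(feat_a, feat_b, feat_b)  # sensor A attends to B
        b2a, _ = self.attn_b(feat_b, feat_a, feat_a)  # sensor B attends to A
        return feat_a + a2b, feat_b + b2a             # residual per branch

fusion = CrossSensorFusion(dim=128, heads=4)
feat_a, feat_b = torch.randn(8, 49, 128), torch.randn(8, 49, 128)
out_a, out_b = fusion(feat_a, feat_b)
print(out_a.shape, out_b.shape)  # torch.Size([8, 49, 128]) twice
```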
Source journal: IEEE Sensors Journal (Engineering Technology, Engineering: Electrical & Electronic)
CiteScore: 7.70
Self-citation rate: 14.00%
Annual articles: 2058
Review time: 5.2 months
Journal description: The fields of interest of the IEEE Sensors Journal are the theory, design, fabrication, manufacturing and applications of devices for sensing and transducing physical, chemical and biological phenomena, with emphasis on the electronics and physics aspects of sensors and integrated sensors-actuators. IEEE Sensors Journal deals with the following:
- Sensor Phenomenology, Modelling, and Evaluation
- Sensor Materials, Processing, and Fabrication
- Chemical and Gas Sensors
- Microfluidics and Biosensors
- Optical Sensors
- Physical Sensors: Temperature, Mechanical, Magnetic, and others
- Acoustic and Ultrasonic Sensors
- Sensor Packaging
- Sensor Networks
- Sensor Applications
- Sensor Systems: Signals, Processing, and Interfaces
- Actuators and Sensor Power Systems
- Sensor Signal Processing for high precision and stability (amplification, filtering, linearization, modulation/demodulation) and under harsh conditions (EMC, radiation, humidity, temperature); energy consumption/harvesting
- Sensor Data Processing (soft computing with sensor data, e.g., pattern recognition, machine learning, evolutionary computation; sensor data fusion; processing of wave, e.g., electromagnetic and acoustic, and non-wave, e.g., chemical, gravity, particle, thermal, radiative and non-radiative, sensor data; detection, estimation and classification based on sensor data)
- Sensors in Industrial Practice