Fault Detection Method based on Local and Global Attention Mechanisms

Lu Jie
{"title":"Fault Detection Method based on Local and Global Attention Mechanisms","authors":"Lu Jie","doi":"10.36347/sjet.2023.v11i08.004","DOIUrl":null,"url":null,"abstract":"In recent years, due to the wide application of Distributed control system, a large number of production process data can be collected and stored, which provides a solid data foundation for process monitoring technology based on deep learning. The Transformer model is a fully connected attention mechanism model that captures the global dependencies of data by calculating the correlation between any two items. This paper proposes a Transformer model based on local and global attention mechanisms. Firstly, after the data is standardized to eliminate the impact of different dimensions, positional encoding is used to mark the position information. Then, the data is divided into two equal parts from the feature dimension. One part enters the standard attention mechanism to capture the global information of the sequence, and the other part enters the local attention mechanism to capture the local information of the sequence, Then, the captured local information and global information are fused to reduce computational complexity and compensate for the shortcomings of the Transformer model in capturing local information. By applying the model proposed in this paper to the penicillin fermentation process for fault detection, it has been experimentally verified that the proposed model has an improved fault detection accuracy compared to the standard Transformer model.","PeriodicalId":379926,"journal":{"name":"Scholars Journal of Engineering and Technology","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-08-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Scholars Journal of Engineering and Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.36347/sjet.2023.v11i08.004","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

In recent years, the wide application of distributed control systems has made it possible to collect and store large volumes of production process data, providing a solid data foundation for process monitoring based on deep learning. The Transformer is a fully connected attention-based model that captures global dependencies in the data by computing the correlation between any two items. This paper proposes a Transformer model based on local and global attention mechanisms. First, the data is standardized to eliminate the effect of differing scales, and positional encoding is applied to mark position information. The data is then split into two equal parts along the feature dimension: one part passes through a standard attention mechanism to capture the global information of the sequence, while the other passes through a local attention mechanism to capture its local information. The captured local and global information is then fused, which reduces computational complexity and compensates for the standard Transformer's weakness in capturing local information. Applying the proposed model to fault detection in the penicillin fermentation process, experiments verify that it achieves higher fault detection accuracy than the standard Transformer model.
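The split-and-fuse step described above can be sketched in PyTorch as follows. This is a minimal illustration, not the paper's implementation: the class name LocalGlobalAttention, the head count, the window size, and the use of a distance mask over nn.MultiheadAttention to realize the local branch are all assumptions made here for concreteness.

```python
import torch
import torch.nn as nn

class LocalGlobalAttention(nn.Module):
    """Sketch of the dual-branch attention described in the abstract:
    the feature dimension is split in half, one half goes through
    standard (global) self-attention, the other through windowed
    (local) self-attention, and the outputs are concatenated (fused)."""

    def __init__(self, d_model: int, n_heads: int = 4, window: int = 5):
        super().__init__()
        assert d_model % 2 == 0, "d_model must be even to split in half"
        half = d_model // 2
        # Global branch: full self-attention over the whole sequence.
        self.global_attn = nn.MultiheadAttention(half, n_heads, batch_first=True)
        # Local branch: same layer type, restricted by a distance mask
        # (one possible way to realize "local attention"; an assumption).
        self.local_attn = nn.MultiheadAttention(half, n_heads, batch_first=True)
        self.window = window

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model); split along the feature dimension.
        xg, xl = x.chunk(2, dim=-1)

        # Global information: every position attends to every other.
        g, _ = self.global_attn(xg, xg, xg)

        # Local information: block attention beyond a fixed window so each
        # position attends only to its neighbours (True = not allowed).
        seq_len = x.size(1)
        idx = torch.arange(seq_len, device=x.device)
        mask = (idx[None, :] - idx[:, None]).abs() > self.window
        l, _ = self.local_attn(xl, xl, xl, attn_mask=mask)

        # Fuse local and global information back to d_model features.
        return torch.cat([g, l], dim=-1)
```

With d_model = 64 and window = 5, for example, each position in the local branch attends to 11 neighbouring time steps while the global branch still sees the full sequence; these hyperparameters would in practice be tuned on the monitored process data, such as the penicillin fermentation batches used in the paper.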