Fault Detection Method based on Local and Global Attention Mechanisms
Lu Jie
Scholars Journal of Engineering and Technology, 2023-08-21. DOI: 10.36347/sjet.2023.v11i08.004
In recent years, the wide adoption of distributed control systems has made it possible to collect and store large volumes of production process data, providing a solid data foundation for process-monitoring techniques based on deep learning. The Transformer is a fully connected attention-mechanism model that captures global dependencies in data by computing the correlation between any two items. This paper proposes a Transformer model based on local and global attention mechanisms. First, the data are standardized to eliminate the influence of differing scales, and positional encoding is applied to mark position information. The data are then split into two equal parts along the feature dimension: one part passes through the standard attention mechanism to capture the sequence's global information, while the other passes through a local attention mechanism to capture its local information. Finally, the captured local and global information are fused, which reduces computational complexity and compensates for the Transformer's weakness in capturing local information. Applying the proposed model to fault detection in the penicillin fermentation process, experiments verify that it achieves higher fault detection accuracy than the standard Transformer model.
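The feature-split scheme described above can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the window size, the sliding-window form of local attention, and fusion by simple concatenation are all assumptions made for the sketch, since the abstract does not specify them.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def global_attention(x):
    # Standard scaled dot-product self-attention over the full sequence:
    # every time step attends to every other, capturing global dependencies.
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)
    return softmax(scores) @ x

def local_attention(x, window=3):
    # Windowed attention (an assumption): each time step attends only to
    # neighbours within +/- `window` steps, capturing local structure.
    d = x.shape[-1]
    out = np.zeros_like(x)
    for i in range(len(x)):
        lo, hi = max(0, i - window), min(len(x), i + window + 1)
        ctx = x[lo:hi]
        scores = ctx @ x[i] / np.sqrt(d)
        out[i] = softmax(scores) @ ctx
    return out

def local_global_block(x, window=3):
    # Split the features into two equal halves: one half goes through
    # global attention, the other through local attention, then fuse.
    half = x.shape[-1] // 2
    g = global_attention(x[:, :half])
    l = local_attention(x[:, half:], window)
    return np.concatenate([g, l], axis=-1)  # fusion by concatenation (assumed)

rng = np.random.default_rng(0)
x = rng.standard_normal((10, 8))  # 10 time steps, 8 process variables
y = local_global_block(x)
print(y.shape)  # (10, 8)
```

Because each half of the features only attends within its own mechanism, the quadratic attention cost applies to half the feature width, and the local half scales linearly in sequence length, which is consistent with the complexity reduction the abstract claims.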