Self-attention based Difference Long Short-Term Memory Network for Industrial Data-driven Modeling

IF 3.8 | CAS Tier 2 (Chemistry) | JCR Q2 (Automation & Control Systems)
Xiaoqing Zheng, Bo Peng, Anke Xue, Ming Ge, Yaguang Kong, Aipeng Jiang
{"title":"Self-attention based Difference Long Short-Term Memory Network for Industrial Data-driven Modeling","authors":"Xiaoqing Zheng,&nbsp;Bo Peng,&nbsp;Anke Xue,&nbsp;Ming Ge,&nbsp;Yaguang Kong,&nbsp;Aipeng Jiang","doi":"10.1016/j.chemolab.2025.105535","DOIUrl":null,"url":null,"abstract":"<div><div>In modern industry, soft sensors provide real-time predictions of quality variables that are difficult to measure directly with physical sensors. However, in industrial processes, changes in material properties, catalyst deactivation, and other factors often lead to shifts in data distribution. Existing soft sensor models often overlook the impact of these distribution changes on performance. To address the issue of performance degradation due to changes in data distribution, this paper proposes a self-attention based Difference Long Short-Term Memory (SA-DLSTM) network for soft sensor modeling. By employing self-attention, industrial raw data is refined to facilitate the extraction of nonlinear features, thereby reducing the difficulty in modeling. A Difference Channel is designed to perform correlation analysis and select significant features from the raw data, followed by extracting the difference information that can reveal changes in the data distribution. The SA-DLSTM soft sensor model is established and validated on two benchmark industrial datasets: Debutanizer Column and Sulfur Recovery Unit. Comparisons with benchmark models, and state-of-the-art models show that SA-DLSTM achieves the best performance across all evaluation metrics, demonstrating the effectiveness of the proposed model.</div></div>","PeriodicalId":9774,"journal":{"name":"Chemometrics and Intelligent Laboratory Systems","volume":"267 ","pages":"Article 105535"},"PeriodicalIF":3.8000,"publicationDate":"2025-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Chemometrics and Intelligent Laboratory Systems","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0169743925002205","RegionNum":2,"RegionCategory":"化学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"AUTOMATION & CONTROL SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

In modern industry, soft sensors provide real-time predictions of quality variables that are difficult to measure directly with physical sensors. However, in industrial processes, changes in material properties, catalyst deactivation, and other factors often lead to shifts in data distribution. Existing soft sensor models often overlook the impact of these distribution changes on performance. To address the performance degradation caused by changes in data distribution, this paper proposes a self-attention based Difference Long Short-Term Memory (SA-DLSTM) network for soft sensor modeling. By employing self-attention, industrial raw data is refined to facilitate the extraction of nonlinear features, thereby reducing the difficulty of modeling. A Difference Channel is designed to perform correlation analysis and select significant features from the raw data, and then to extract the difference information that can reveal changes in the data distribution. The SA-DLSTM soft sensor model is established and validated on two benchmark industrial datasets: Debutanizer Column and Sulfur Recovery Unit. Comparisons with benchmark and state-of-the-art models show that SA-DLSTM achieves the best performance across all evaluation metrics, demonstrating the effectiveness of the proposed model.
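The abstract outlines a three-stage pipeline: self-attention refines the raw process variables, a Difference Channel selects the variables most relevant to the quality target and derives difference information that exposes distribution shifts, and an LSTM maps the resulting sequence to the quality variable. The PyTorch sketch below is only an illustrative reading of that description, not the authors' implementation: the correlation-based selection rule, the single-head attention layer, the use of first differences over time, and all layer sizes are assumptions.

```python
# Hypothetical sketch of an SA-DLSTM-style soft sensor (not the authors' code).
import torch
import torch.nn as nn


def select_correlated(X, y, k):
    """Keep the k input variables with the largest |Pearson correlation| to y.

    X: (samples, variables), y: (samples,). Assumed stand-in for the
    'correlation analysis' step of the Difference Channel.
    """
    Xc = X - X.mean(dim=0)
    yc = y - y.mean()
    corr = (Xc * yc.unsqueeze(1)).sum(dim=0) / (Xc.norm(dim=0) * yc.norm() + 1e-8)
    return torch.topk(corr.abs(), k).indices


class SADLSTM(nn.Module):
    """Self-attention -> difference features -> LSTM -> linear output."""

    def __init__(self, n_selected, hidden=64):
        super().__init__()
        # Single-head self-attention over the selected process variables.
        self.attn = nn.MultiheadAttention(embed_dim=n_selected, num_heads=1,
                                          batch_first=True)
        # The LSTM sees the refined features concatenated with their differences.
        self.lstm = nn.LSTM(input_size=2 * n_selected, hidden_size=hidden,
                            batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, x):
        # x: (batch, time, n_selected), restricted to the selected variables.
        refined, _ = self.attn(x, x, x)          # self-attention refinement
        diff = refined[:, 1:] - refined[:, :-1]  # first differences over time
        diff = torch.cat([torch.zeros_like(diff[:, :1]), diff], dim=1)
        h, _ = self.lstm(torch.cat([refined, diff], dim=-1))
        return self.out(h[:, -1])                # predict the quality variable


# Toy usage on random data shaped like a windowed industrial time series.
if __name__ == "__main__":
    X = torch.randn(500, 7)                      # 7 process variables
    y = torch.randn(500)                         # quality variable
    idx = select_correlated(X, y, k=5)
    windows = X[:, idx].unfold(0, 20, 1).transpose(1, 2)   # (481, 20, 5)
    model = SADLSTM(n_selected=5)
    print(model(windows).shape)                  # torch.Size([481, 1])
```

The first-difference term is what distinguishes this sketch from a plain attention-LSTM: it feeds the network an explicit signal of how the refined features are drifting, which is how the abstract motivates robustness to distribution shift.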
Source Journal: Chemometrics and Intelligent Laboratory Systems
CiteScore: 7.50
Self-citation rate: 7.70%
Articles per year: 169
Average review time: 3.4 months
Journal Description: Chemometrics and Intelligent Laboratory Systems publishes original research papers, short communications, reviews, tutorials and Original Software Publications reporting on the development of novel statistical, mathematical, or computer techniques in Chemistry and related disciplines. Chemometrics is the chemical discipline that uses mathematical and statistical methods to design or select optimal procedures and experiments, and to provide maximum chemical information by analysing chemical data. The journal deals with the following topics:
1) Development of new statistical, mathematical and chemometrical methods for Chemistry and related fields (Environmental Chemistry, Biochemistry, Toxicology, System Biology, -Omics, etc.)
2) Novel applications of chemometrics to all branches of Chemistry and related fields (typical domains of interest are: process data analysis, experimental design, data mining, signal processing, supervised modelling, decision making, robust statistics, mixture analysis, multivariate calibration, etc.). Routine applications of established chemometrical techniques will not be considered.
3) Development of new software that provides novel tools or truly advances the use of chemometrical methods.
4) Well characterized data sets to test performance of the new methods and software.
The journal complies with the International Committee of Medical Journal Editors' Uniform Requirements for Manuscripts.