An Interpretable Light Attention–Convolution–Gate Recurrent Unit Architecture for the Highly Accurate Modeling of Actual Chemical Dynamic Processes

IF 10.1 | CAS Tier 1 (Engineering & Technology) | Q1 ENGINEERING, MULTIDISCIPLINARY
Yue Li, Ning Li, Jingzheng Ren, Weifeng Shen
Journal: Engineering (Q1, ENGINEERING, MULTIDISCIPLINARY; Impact Factor 10.1)
DOI: 10.1016/j.eng.2024.07.009
Publication date: 2024-08-01
Article URL: https://www.sciencedirect.com/science/article/pii/S2095809924003989
Open-access PDF: https://www.sciencedirect.com/science/article/pii/S2095809924003989/pdfft?md5=8a3c739a3b730516d835f58e0c61ebce&pid=1-s2.0-S2095809924003989-main.pdf
Citations: 0

Abstract

To equip data-driven dynamic chemical process models with strong interpretability, we develop a light attention–convolution–gate recurrent unit (LACG) architecture with three sub-modules—a basic module, a novel light attention module, and a residue module—that are specially designed to learn the general dynamic behavior, transient disturbances, and other input factors of chemical processes, respectively. Combined with the hyperparameter optimization framework Optuna, the effectiveness of the proposed LACG is tested by distributed control system data-driven modeling experiments on the discharge flowrate of an actual deethanization process. The LACG model provides significant advantages in prediction accuracy and model generalization compared with other models, including the feedforward neural network, convolutional neural network, long short-term memory (LSTM), and attention-LSTM. Moreover, compared with the simulation results of a deethanization model built using Aspen Plus Dynamics V12.1, the LACG parameters are demonstrated to be interpretable, and more details on the variable interactions can be observed from the model parameters than with the traditional interpretable attention-LSTM model. This contribution enriches interpretable machine learning knowledge and provides a reliable, highly accurate method for actual chemical process modeling, paving a route to intelligent manufacturing.
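The abstract names two recurrent mechanisms—gate recurrent unit (GRU) state updates and an attention-style weighting over the sequence. The paper's actual LACG modules are not reproduced here; the following is a minimal, self-contained sketch of those two generic mechanisms only, with scalar toy weights (`wz`, `uz`, etc.) and a magnitude-based softmax that are illustrative assumptions, not the authors' design.

```python
# Toy 1-D GRU step plus a softmax attention pool over the hidden-state
# history. Illustrative only: weights and the "light" attention scheme
# are assumptions, not the LACG architecture from the paper.
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(x, h, w):
    """One GRU update for scalar input x and scalar prior state h."""
    z = sigmoid(w["wz"] * x + w["uz"] * h)          # update gate
    r = sigmoid(w["wr"] * x + w["ur"] * h)          # reset gate
    n = math.tanh(w["wn"] * x + w["un"] * (r * h))  # candidate state
    return (1.0 - z) * n + z * h                    # blend new and old state

def attention_pool(states):
    """Softmax-weighted summary of the hidden-state history."""
    exps = [math.exp(s) for s in states]
    total = sum(exps)
    return sum((e / total) * s for e, s in zip(exps, states))

w = {"wz": 0.5, "uz": 0.3, "wr": 0.4, "ur": 0.2, "wn": 0.9, "un": 0.6}
inputs = [0.1, 0.5, -0.2, 0.3]   # e.g., a short window of a flowrate signal
h, history = 0.0, []
for x in inputs:
    h = gru_step(x, h, w)
    history.append(h)
summary = attention_pool(history)
```

Because the candidate state passes through `tanh` and the update gate is a convex blend, the hidden state stays in (−1, 1); the attention pool is likewise a convex combination of the history, which is what makes its weights directly inspectable—the property the paper exploits for interpretability.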

Source journal: Engineering
Subject categories: Engineering; Environmental Science – Environmental Engineering
Self-citation rate: 1.60%
Articles per year: 335
Review time: 35 days
About the journal: Engineering, an international open-access journal launched by the Chinese Academy of Engineering (CAE) in 2015, serves as a platform for disseminating cutting-edge advances in engineering R&D, sharing major research outputs, and highlighting key achievements worldwide. The journal reports progress in engineering science; fosters discussion of hot topics, areas of interest, challenges, and prospects in engineering development; and considers human and environmental well-being and ethics in engineering. It aims to inspire breakthroughs and innovations of profound economic and social significance and to advance them to leading international standards, transforming them into a new productive force that benefits humanity.