Online improving NOX concentration prediction at SCR system outlet through continual learning

IF 6.3 · CAS Tier 3 (Engineering & Technology) · JCR Q1 (Engineering, Chemical)
Peng Chen, Baochang Xu, Wei He, Hongtao Hu
{"title":"Online improving NOX concentration prediction at SCR system outlet through continual learning","authors":"Peng Chen ,&nbsp;Baochang Xu ,&nbsp;Wei He ,&nbsp;Hongtao Hu","doi":"10.1016/j.jtice.2025.106417","DOIUrl":null,"url":null,"abstract":"<div><h3>Background</h3><div>Accurately predicting NO<sub>X</sub> concentration at SCR system outlet is crucial for optimizing process parameters and reducing NO<sub>X</sub> emissions. Existing prediction models often struggle to maintain accurate long-term online predictions when operating conditions change and new data arrive. Therefore, frequent model updates are required in practical applications. However, new models often exhibit catastrophic forgetting of learned patterns, leading to a deterioration in NO<sub>X</sub> concentration prediction accuracy.</div></div><div><h3>Methods</h3><div>A novel continual learning algorithm, termed TDIR, is proposed. This algorithm dynamically identifies historical temporal samples that are most susceptible to interference from new data and prioritizes their replay. The TDIR algorithm is integrated with the iTransformer architecture to establish TDIRformer, an online self-updating model for NO<sub>X</sub> emission prediction. The model utilizes variate tokens to capture cross-feature correlations, effectively addressing prediction inaccuracies caused by multivariate coupling. It also employs the TDIR algorithm for online updates, which mitigates catastrophic forgetting and improves the prediction accuracy of NO<sub>X</sub> concentrations.</div></div><div><h3>Significant findings</h3><div>Experimental results show that TDIRformer significantly outperforms LSTM, Informer, PatchTST, and iTransformer, while TDIR also surpasses the continual learning methods EWC and MAS. Additionally, the TDIR algorithm demonstrates strong generalization capability, achieving RMSE reductions of 10.7 %, 9.3 %, and 11.9 % when applied to LSTM, Informer, and PatchTST models, respectively.</div></div>","PeriodicalId":381,"journal":{"name":"Journal of the Taiwan Institute of Chemical Engineers","volume":"179 ","pages":"Article 106417"},"PeriodicalIF":6.3000,"publicationDate":"2025-09-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of the Taiwan Institute of Chemical Engineers","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1876107025004675","RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, CHEMICAL","Score":null,"Total":0}
Citations: 0

Abstract

Background

Accurately predicting the NOX concentration at the SCR system outlet is crucial for optimizing process parameters and reducing NOX emissions. Existing prediction models often struggle to maintain accurate long-term online predictions when operating conditions change and new data arrive, so frequent model updates are required in practical applications. However, updated models often exhibit catastrophic forgetting of previously learned patterns, degrading NOX concentration prediction accuracy.

Methods

A novel continual learning algorithm, termed TDIR, is proposed. This algorithm dynamically identifies historical temporal samples that are most susceptible to interference from new data and prioritizes their replay. The TDIR algorithm is integrated with the iTransformer architecture to establish TDIRformer, an online self-updating model for NOX emission prediction. The model utilizes variate tokens to capture cross-feature correlations, effectively addressing prediction inaccuracies caused by multivariate coupling. It also employs the TDIR algorithm for online updates, which mitigates catastrophic forgetting and improves the prediction accuracy of NOX concentrations.
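The abstract does not spell out how "susceptibility to interference" is scored, but the description suggests a replay buffer in which stored temporal windows are ranked by how much a parameter update on newly arrived data would increase their loss, with the most affected windows replayed first. The sketch below illustrates that idea only; the function name, the one-step SGD probe, and the scoring heuristic are illustrative assumptions, not the authors' implementation.

```python
import copy
import torch

def select_replay_samples(model, loss_fn, buffer, new_batch, k, lr=1e-3):
    """Return the k buffered (x, y) windows whose loss increases most
    after a trial one-step update on the newly arrived batch.
    NOTE: illustrative sketch of interference-prioritized replay, not the
    paper's TDIR code."""
    # Loss of each stored window under the current model parameters.
    with torch.no_grad():
        before = [loss_fn(model(x), y).item() for x, y in buffer]

    # Trial one-step update on the new data, applied to a copy so the
    # deployed model stays untouched until the combined update is performed.
    probe = copy.deepcopy(model)
    optimizer = torch.optim.SGD(probe.parameters(), lr=lr)
    x_new, y_new = new_batch
    optimizer.zero_grad()
    loss_fn(probe(x_new), y_new).backward()
    optimizer.step()

    # Loss of each stored window after the trial update; the increase is
    # treated as an interference score and the top-k are chosen for replay.
    with torch.no_grad():
        after = [loss_fn(probe(x), y).item() for x, y in buffer]
    scores = [a - b for a, b in zip(after, before)]
    top = sorted(range(len(buffer)), key=lambda i: scores[i], reverse=True)[:k]
    return [buffer[i] for i in top]
```

In an online setting, the selected windows would then be mixed with the new batch for the actual model update, so that the historical samples most at risk of being forgotten are revisited first.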

Significant findings

Experimental results show that TDIRformer significantly outperforms LSTM, Informer, PatchTST, and iTransformer, while TDIR also surpasses the continual learning methods EWC and MAS. Additionally, the TDIR algorithm demonstrates strong generalization capability, achieving RMSE reductions of 10.7%, 9.3%, and 11.9% when applied to LSTM, Informer, and PatchTST models, respectively.
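For clarity, the percentage figures are most naturally read as relative RMSE reductions of each backbone updated with TDIR versus the same backbone updated without it; under that reading (an assumption, since the abstract does not define the baseline explicitly):

```latex
\mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(\hat{y}_i - y_i\right)^2},
\qquad
\text{reduction} = \frac{\mathrm{RMSE}_{\text{without TDIR}} - \mathrm{RMSE}_{\text{with TDIR}}}{\mathrm{RMSE}_{\text{without TDIR}}} \times 100\%.
```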

Source journal
CiteScore: 9.10
Self-citation rate: 14.00%
Articles published: 362
Review time: 35 days
Journal description: Journal of the Taiwan Institute of Chemical Engineers (formerly known as Journal of the Chinese Institute of Chemical Engineers) publishes original works, from fundamental principles to practical applications, in the broad field of chemical engineering, with special focus on three aspects: Chemical and Biomolecular Science and Technology, Energy and Environmental Science and Technology, and Materials Science and Technology. Authors should choose an appropriate aspect section and a few related classifications for their manuscript when submitting to the journal online.