Investigation of lifelong learning methods with elastic weight consolidation (EWC) for low-temperature ORC scroll expander modeling

Impact Factor 6.9 · JCR Q2, Energy & Fuels · CAS Tier 2, Engineering & Technology
Haohan Sha, Haiming Yu, Qingxu Ma, Yalong Yang, Yu Feng, Siyi Luo
{"title":"Investigation of lifelong learning methods with elastic weight consolidation (EWC) for low-temperature ORC scroll expander modeling","authors":"Haohan Sha ,&nbsp;Haiming Yu ,&nbsp;Qingxu Ma ,&nbsp;Yalong Yang ,&nbsp;Yu Feng ,&nbsp;Siyi Luo","doi":"10.1016/j.applthermaleng.2025.127359","DOIUrl":null,"url":null,"abstract":"<div><div>This study applied a lifelong learning method, elastic weight consolidation (EWC), to enhance the memory stability and plasticity of deep learning models for organic Rankine cycle (ORC) expander modeling. We conducted multiple experiments using a low-temperature ORC power system with a scroll expander. Three deep-learning models were developed to predict the power output, exhaust temperature, and flow rate. Six task sequences consisting of combinations of three tasks were generated. Joint training (JT), fine-tuning (FT), and EWC were evaluated using these task sequences. The overall performance of the EWC and FT methods on model plasticity was close to that of the JT method across all three predictions (e.g., the largest MSE difference was only 61 for power prediction), but the performance of them on model memory stability was inferior to that of JT. Compared with the FT method, the EWC method showed good improvement, limited improvement, and no improvement in model memory stability for power, exhaust temperature, and flow rate prediction, respectively. In terms of MSE, the average performance increased by 18% for power prediction, but reduced by 76% for flow rate prediction. The effects of dataset sizes on the performance of lifelong learning were also evaluated. The performance of the EWC and FT methods stabilized when the dataset size reached approximately 300, which means that there is a threshold of dataset sizes for EWC practical application. The study results advance the application of deep learning modeling and offer insights into using EWC for scroll expander modeling.</div></div>","PeriodicalId":8201,"journal":{"name":"Applied Thermal Engineering","volume":"278 ","pages":"Article 127359"},"PeriodicalIF":6.9000,"publicationDate":"2025-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Applied Thermal Engineering","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1359431125019519","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENERGY & FUELS","Score":null,"Total":0}
Citations: 0

Abstract

This study applied a lifelong learning method, elastic weight consolidation (EWC), to enhance the memory stability and plasticity of deep learning models for organic Rankine cycle (ORC) expander modeling. We conducted multiple experiments using a low-temperature ORC power system with a scroll expander. Three deep-learning models were developed to predict the power output, exhaust temperature, and flow rate. Six task sequences consisting of combinations of three tasks were generated. Joint training (JT), fine-tuning (FT), and EWC were evaluated using these task sequences. The overall performance of the EWC and FT methods on model plasticity was close to that of the JT method across all three predictions (e.g., the largest MSE difference was only 61 for power prediction), but their performance on model memory stability was inferior to that of JT. Compared with the FT method, the EWC method showed good improvement, limited improvement, and no improvement in model memory stability for power, exhaust temperature, and flow rate prediction, respectively. In terms of MSE, the average performance improved by 18% for power prediction but declined by 76% for flow rate prediction. The effects of dataset size on lifelong learning performance were also evaluated. The performance of the EWC and FT methods stabilized when the dataset size reached approximately 300, indicating a dataset-size threshold for the practical application of EWC. The study results advance the application of deep learning modeling and offer insights into using EWC for scroll expander modeling.
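To make the method concrete, the sketch below illustrates the quadratic consolidation penalty that EWC adds to the task loss, which is what lets a network retain earlier tasks while adapting to new ones. This is a generic PyTorch-style illustration under assumed names (ewc_penalty, fisher, old_params, lam) and an assumed hyperparameter value; it is not the authors' implementation.

```python
import torch

def ewc_penalty(model, fisher, old_params, lam=100.0):
    """EWC consolidation term: penalize drift of parameters that were
    important (high Fisher information) for previously learned tasks.

    fisher:     dict of parameter name -> diagonal Fisher information estimate
    old_params: dict of parameter name -> parameter values after the previous task
    lam:        consolidation strength (hyperparameter; value assumed here)
    """
    penalty = 0.0
    for name, param in model.named_parameters():
        if name in fisher:
            penalty = penalty + (fisher[name] * (param - old_params[name]) ** 2).sum()
    return 0.5 * lam * penalty

# Training on a new task in a sequence would then minimize, for example:
#   loss = torch.nn.functional.mse_loss(prediction, target) \
#          + ewc_penalty(model, fisher, old_params)
# Fine-tuning (FT) corresponds to dropping the penalty term, while joint
# training (JT) retrains on the data of all tasks at once.
```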
Source journal: Applied Thermal Engineering (Engineering, Mechanical)
CiteScore: 11.30 · Self-citation rate: 15.60% · Annual publications: 1474 · Review time: 57 days
Journal description: Applied Thermal Engineering disseminates novel research related to the design, development and demonstration of components, devices, equipment, technologies and systems involving thermal processes for the production, storage, utilization and conservation of energy, with a focus on engineering application. The journal publishes high-quality and high-impact Original Research Articles, Review Articles, Short Communications and Letters to the Editor on cutting-edge innovations in research, and recent advances or issues of interest to the thermal engineering community.