A reinforcement learning-based energy management strategy for fuel cell electric vehicle considering coupled-energy sources degradations

IF 4.8 | CAS Tier 2 (Engineering & Technology) | JCR Q2 (ENERGY & FUELS)
Weiwei Huo, Teng Liu, Bing Lu
DOI: 10.1016/j.segan.2024.101548 | Sustainable Energy Grids & Networks, Vol. 40, Article 101548 | Published 2024-10-11
Citations: 0

Abstract

An effective energy management strategy (EMS) is crucial for fuel cell electric vehicles (FCEVs) to optimize fuel consumption and mitigate fuel cell (FC) aging by efficiently distributing power from multiple energy sources during vehicle operation. The proton exchange membrane fuel cell (PEMFC) is a preferred main power source for fuel cell vehicles due to its high power density, near-zero emissions, and low corrosivity. However, it is expensive, and its lifespan is significantly affected by rapid power fluctuations. To address this issue, the proposed method of minimizing instantaneous cost (MIC) reduces the frequency of abrupt changes in the FC load. Additionally, by analyzing driving-condition characteristics, an Ensemble Bagging Tree (EBT) enables real-time working-condition identification (WCI) of composite conditions, thereby enhancing the EMS's adaptability to various operating conditions. This paper introduces an advanced EMS based on twin-delayed deep deterministic policy gradient (TD3) deep reinforcement learning that considers energy-source degradation, economic efficiency, and driving conditions. Training results indicate that the TD3-based policy, when integrated with WCI and MIC, not only achieves a 32.6% reduction in FC system degradation but also lowers overall operational costs and significantly accelerates algorithm convergence.
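The abstract does not spell out the MIC formulation, so the following Python sketch only illustrates the idea it describes: a per-step cost that adds a fuel-cell degradation penalty on abrupt load changes to the hydrogen cost, negated to serve as the reward a TD3 agent maximizes. The function names, coefficients, and the linear hydrogen-consumption model are assumptions for illustration, not the paper's actual equations.

```python
# Illustrative sketch only (assumed coefficients and H2 model), not the paper's code:
# a per-step "instantaneous cost" that a TD3-based EMS could minimize, combining
# hydrogen cost with a penalty on rapid fuel-cell (FC) load changes.

def instantaneous_cost(p_fc_kw: float,
                       p_fc_prev_kw: float,
                       dt_s: float = 1.0,
                       h2_g_per_kwh: float = 55.0,       # assumed FC consumption rate
                       h2_price_per_kg: float = 4.0,     # assumed hydrogen price
                       deg_weight_per_kw: float = 0.005  # assumed degradation weight
                       ) -> float:
    """One-step operating cost (currency units) for a candidate FC power command."""
    # Hydrogen cost: energy delivered this step times consumption rate and price.
    energy_kwh = p_fc_kw * dt_s / 3600.0
    h2_cost = energy_kwh * h2_g_per_kwh / 1000.0 * h2_price_per_kg
    # Degradation penalty: proportional to the load ramp, discouraging the abrupt
    # power fluctuations that shorten FC lifespan.
    deg_cost = deg_weight_per_kw * abs(p_fc_kw - p_fc_prev_kw)
    return h2_cost + deg_cost


def step_reward(p_fc_kw: float, p_fc_prev_kw: float) -> float:
    """Reward for the RL agent: maximizing reward minimizes instantaneous cost."""
    return -instantaneous_cost(p_fc_kw, p_fc_prev_kw)


if __name__ == "__main__":
    print(step_reward(30.0, 28.0))  # gentle 2 kW ramp
    print(step_reward(30.0, 10.0))  # abrupt 20 kW jump: larger degradation penalty
```

A complete EMS would also fold battery usage, state-of-charge constraints, and battery degradation into this cost; the sketch only shows how penalizing load ramps ties FC aging into the reward signal.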
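Likewise, the abstract only names the Ensemble Bagging Tree used for working-condition identification. Below is a minimal, hypothetical sketch of bagged decision trees classifying driving segments from simple speed statistics with scikit-learn; the features, class labels, and synthetic training data are all assumptions for illustration.

```python
# Illustrative sketch only (assumed features, labels, and synthetic data):
# bagged decision trees classifying driving-condition segments, in the spirit of
# the EBT-based working-condition identification (WCI) the abstract mentions.
import numpy as np
from sklearn.ensemble import BaggingClassifier  # bags decision trees by default

rng = np.random.default_rng(0)

def segment_features(speed_kmh) -> np.ndarray:
    """Summary statistics of one fixed-length speed segment (km/h, 1 Hz samples)."""
    v = np.asarray(speed_kmh, dtype=float)
    accel = np.diff(v, prepend=v[0])
    idle_ratio = float(np.mean(v < 1.0))
    return np.array([v.mean(), v.max(), v.std(), accel.std(), idle_ratio])

# Synthetic 120 s segments standing in for labelled urban/suburban/highway data.
tops = rng.choice([40.0, 80.0, 120.0], size=300)
X = np.vstack([segment_features(rng.uniform(0.0, top, 120)) for top in tops])
y = np.digitize(X[:, 1], bins=[50.0, 90.0])  # 0 = urban, 1 = suburban, 2 = highway

clf = BaggingClassifier(n_estimators=30, random_state=0).fit(X, y)

# At run time the latest speed window is classified the same way, and the
# predicted condition can switch the EMS parameters or reward weights.
new_segment = segment_features(rng.uniform(0.0, 100.0, 120)).reshape(1, -1)
print(clf.predict(new_segment))
```

In the paper, the recognized condition presumably adapts the TD3 policy or its cost weights to the current driving pattern; this sketch covers only the classification step.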
Source journal
Sustainable Energy Grids & Networks (Energy Engineering and Power Technology)
CiteScore: 7.90
Self-citation rate: 13.00%
Annual publications: 206
Review time: 49 days
Journal description: Sustainable Energy, Grids and Networks (SEGAN) is an international peer-reviewed publication for theoretical and applied research dealing with energy, information grids and power networks, including smart grids from super to micro grid scales. SEGAN welcomes papers describing fundamental advances in mathematical, statistical or computational methods with application to power and energy systems, as well as papers on applications, computation and modeling in the areas of electrical and energy systems with coupled information and communication technologies.