Authors: Weiwei Huo, Teng Liu, Bing Lu
DOI: 10.1016/j.segan.2024.101548
Journal: Sustainable Energy, Grids and Networks, Volume 40, Article 101548
Published: 2024-10-11 (Journal Article; Impact Factor 4.8; JCR Q2, Energy & Fuels)
Citations: 0
A reinforcement learning-based energy management strategy for fuel cell electric vehicle considering coupled-energy sources degradations
An effective energy management strategy (EMS) is crucial for fuel cell electric vehicles (FCEVs): by efficiently distributing power among multiple onboard energy sources during vehicle operation, it can optimize fuel consumption and mitigate fuel cell (FC) aging. The proton exchange membrane fuel cell (PEMFC) is the preferred main power source for fuel cell vehicles owing to its high power density, near-zero emissions, and low corrosivity. However, it is expensive, and its lifespan is significantly shortened by rapid power fluctuations. To address this issue, the proposed method of minimizing instantaneous cost (MIC) reduces the frequency of abrupt changes in the FC load. Additionally, by analyzing driving-condition characteristics, an Ensemble Bagging Tree (EBT) enables real-time recognition of composite working conditions (WCI), enhancing the EMS's adaptability across operating conditions. This paper introduces an advanced EMS based on the twin-delayed deep deterministic policy gradient (TD3) deep reinforcement learning algorithm, which jointly considers energy-source degradation, economic efficiency, and driving conditions. Training results indicate that the TD3-based policy, when integrated with WCI and MIC, not only achieves a 32.6% reduction in FC system degradation but also lowers overall operational costs and significantly accelerates algorithm convergence.
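The core idea behind the MIC term can be illustrated with a minimal per-step cost sketch: penalize hydrogen use, FC degradation driven by abrupt load changes, and battery throughput. The function name, coefficients, and functional forms below are illustrative assumptions, not the authors' actual model.

```python
def instantaneous_cost(p_fc, p_fc_prev, p_batt,
                       h2_price=0.05,     # cost per kW-step of FC output (assumed)
                       degr_coeff=0.002,  # cost per kW of abrupt FC load change (assumed)
                       batt_coeff=0.01):  # cost per kW of battery throughput (assumed)
    """Illustrative per-step cost for a two-source (FC + battery) powertrain."""
    fuel_cost = h2_price * max(p_fc, 0.0)             # hydrogen consumed by the FC
    degradation = degr_coeff * abs(p_fc - p_fc_prev)  # rapid fluctuations age the FC
    battery_wear = batt_coeff * abs(p_batt)           # battery cycling cost
    return fuel_cost + degradation + battery_wear

# At equal total FC energy, a smooth FC profile is cheaper than a spiky one,
# which is why minimizing this cost discourages abrupt FC load changes:
smooth = sum(instantaneous_cost(30.0, 30.0, 5.0) for _ in range(10))
spiky = (instantaneous_cost(60.0, 0.0, 5.0) + instantaneous_cost(0.0, 60.0, 5.0)) * 5
```

Under this toy cost, the degradation term is the only one that distinguishes the two profiles, so an RL agent minimizing it learns to let the battery absorb transients while the FC supplies a smooth baseline.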
Journal overview:
Sustainable Energy, Grids and Networks (SEGAN) is an international peer-reviewed publication for theoretical and applied research dealing with energy, information grids and power networks, including smart grids from super to micro grid scales. SEGAN welcomes papers describing fundamental advances in mathematical, statistical or computational methods with application to power and energy systems, as well as papers on applications, computation and modeling in the areas of electrical and energy systems with coupled information and communication technologies.