Authors: Zhewei Zhang, Rémy Rigo-Mariani, Nouredine Hadjsaid
DOI: 10.1016/j.egyai.2025.100522
Journal: Energy and AI, vol. 21, Article 100522 (Impact Factor 9.6, JCR Q1 in Computer Science, Artificial Intelligence)
Published: 2025-05-07
URL: https://www.sciencedirect.com/science/article/pii/S2666546825000540
Comparative of control strategies on electrical vehicle fleet charging management strategies under uncertainties
The growing penetration of Electric Vehicles (EVs) in transportation brings challenges to power distribution systems due to uncertain usage patterns and increased peak loads. Effective EV fleet charging management strategies are needed to minimize network impacts, such as peak charging power. While existing studies have addressed uncertainty in future arrivals, they often overlook the uncertainty in user-provided inputs for EVs already charging, such as estimated departure time and energy demand. This paper analyzes the impact of these uncertainties and evaluates three management strategies: a baseline Model Predictive Control (MPC), a data-hybrid MPC, and a fully data-driven Deep Reinforcement Learning (DRL) approach. For the data-hybrid MPC, we adopted a diffusion model to handle user-input uncertainty and a Gaussian Mixture Model to generate arrival/departure scenarios. Additionally, the DRL method is formulated as a Partially Observable Markov Decision Process (POMDP) to manage uncertainty and employs a Convolutional Neural Network (CNN) for feature extraction. Robustness tests under different levels of user uncertainty show that the data-hybrid MPC outperforms the baseline MPC by 20 %, while the DRL-based method achieves around a 10 % improvement.
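To illustrate the kind of scenario-based pipeline the abstract describes, the sketch below samples EV arrival times from a two-component Gaussian mixture, draws uncertain user inputs (stay duration, energy demand), and spreads each vehicle's demand evenly over its plug-in window as a crude stand-in for the MPC's optimal schedule. All numeric parameters (mixture weights, means, standard deviations, fleet size) are made-up illustrative values, not the paper's fitted models.

```python
# Minimal sketch: GMM arrival sampling + naive peak-limiting dispatch.
# Parameters are hypothetical, not taken from the paper.
import random

random.seed(0)

# (weight, mean_hour, std_hour): morning commuters vs. evening home charging
ARRIVAL_GMM = [(0.6, 8.0, 1.0), (0.4, 18.0, 1.5)]

def sample_gmm(components):
    """Draw one value from a 1-D Gaussian mixture."""
    r, acc = random.random(), 0.0
    for weight, mean, std in components:
        acc += weight
        if r <= acc:
            return random.gauss(mean, std)
    # Numerical safety: fall back to the last component.
    _, mean, std = components[-1]
    return random.gauss(mean, std)

def sample_session():
    """One EV session: (arrival_hour, departure_hour, energy_kwh).
    Stay duration and energy demand mimic uncertain user-provided inputs."""
    arrival = min(max(sample_gmm(ARRIVAL_GMM), 0.0), 23.0)
    stay = max(1.0, random.gauss(6.0, 2.0))
    energy = max(2.0, random.gauss(20.0, 8.0))
    return arrival, min(arrival + stay, 24.0), energy

def dispatch(sessions, steps=24):
    """Spread each session's energy uniformly over its plug-in window
    (a crude proxy for an MPC schedule) and return the hourly load profile."""
    load = [0.0] * steps
    for arr, dep, energy in sessions:
        window = [t for t in range(steps) if arr <= t < dep]
        for t in window:
            load[t] += energy / len(window)
    return load

fleet = [sample_session() for _ in range(50)]
profile = dispatch(fleet)
print(f"peak fleet power: {max(profile):.1f} kW")
```

A real controller would replace the uniform spreading with a receding-horizon optimization re-solved at every step as actual arrivals and corrected user inputs are revealed; the scenario sampler is what feeds that optimization with plausible futures.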