Amin Shojaeighadikolaei, Arman Ghasemi, Alexandru G. Bardas, R. Ahmadi, M. Hashemi
{"title":"使用深度强化学习的天气感知数据驱动的微电网能源管理","authors":"Amin Shojaeighadikolaei, Arman Ghasemi, Alexandru G. Bardas, R. Ahmadi, M. Hashemi","doi":"10.1109/NAPS52732.2021.9654550","DOIUrl":null,"url":null,"abstract":"In this paper, we develop a deep reinforcement learning (DRL) framework to manage distributed energy resources (DER) in a prosumer-centric microgrid under generation uncertainties. The uncertainty stems from varying weather conditions (i.e., sunny versus cloudy days) that impact the power generation of the residential solar photo-voltaic (PV) panels. In our proposed system model, the microgrid consists of traditional power consumers, prosumers with local battery storage, and the distributor. The prosumers and distributor are equipped with artificial intelligence (AI) agents that interact with each other to maximize their long-term reward. We investigate the impact of weather conditions on the energy storage charging/discharging, as well as the amount of power injected into the microgrid by the prosumers. To show the efficacy of the proposed approach, we implement the DRL framework using Deep-Q Network (DQN). Our numerical results demonstrate that the proposed distributed energy management algorithm can efficiently cope with the generation uncertainties, and it is robust to weather prediction errors. Finally, our results show that adopting energy storage systems on the residential side can alleviate the power curtailment during generation surplus.","PeriodicalId":123077,"journal":{"name":"2021 North American Power Symposium (NAPS)","volume":"37 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-11-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":"{\"title\":\"Weather-Aware Data-Driven Microgrid Energy Management Using Deep Reinforcement Learning\",\"authors\":\"Amin Shojaeighadikolaei, Arman Ghasemi, Alexandru G. Bardas, R. Ahmadi, M. 
Hashemi\",\"doi\":\"10.1109/NAPS52732.2021.9654550\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper, we develop a deep reinforcement learning (DRL) framework to manage distributed energy resources (DER) in a prosumer-centric microgrid under generation uncertainties. The uncertainty stems from varying weather conditions (i.e., sunny versus cloudy days) that impact the power generation of the residential solar photo-voltaic (PV) panels. In our proposed system model, the microgrid consists of traditional power consumers, prosumers with local battery storage, and the distributor. The prosumers and distributor are equipped with artificial intelligence (AI) agents that interact with each other to maximize their long-term reward. We investigate the impact of weather conditions on the energy storage charging/discharging, as well as the amount of power injected into the microgrid by the prosumers. To show the efficacy of the proposed approach, we implement the DRL framework using Deep-Q Network (DQN). Our numerical results demonstrate that the proposed distributed energy management algorithm can efficiently cope with the generation uncertainties, and it is robust to weather prediction errors. 
Finally, our results show that adopting energy storage systems on the residential side can alleviate the power curtailment during generation surplus.\",\"PeriodicalId\":123077,\"journal\":{\"name\":\"2021 North American Power Symposium (NAPS)\",\"volume\":\"37 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-11-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"8\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 North American Power Symposium (NAPS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/NAPS52732.2021.9654550\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 North American Power Symposium (NAPS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/NAPS52732.2021.9654550","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Weather-Aware Data-Driven Microgrid Energy Management Using Deep Reinforcement Learning
In this paper, we develop a deep reinforcement learning (DRL) framework to manage distributed energy resources (DER) in a prosumer-centric microgrid under generation uncertainty. The uncertainty stems from varying weather conditions (i.e., sunny versus cloudy days) that affect the power generation of residential solar photovoltaic (PV) panels. In our proposed system model, the microgrid consists of traditional power consumers, prosumers with local battery storage, and the distributor. The prosumers and the distributor are equipped with artificial intelligence (AI) agents that interact with each other to maximize their long-term rewards. We investigate the impact of weather conditions on energy-storage charging/discharging, as well as on the amount of power the prosumers inject into the microgrid. To show the efficacy of the proposed approach, we implement the DRL framework using a Deep Q-Network (DQN). Our numerical results demonstrate that the proposed distributed energy management algorithm copes efficiently with generation uncertainty and is robust to weather prediction errors. Finally, our results show that adopting energy storage systems on the residential side can alleviate power curtailment during generation surplus.
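The abstract describes training DQN agents that learn charging/discharging decisions under uncertain solar generation. As a rough illustration of the general DQN recipe it cites (epsilon-greedy exploration, an experience-replay buffer, and temporal-difference updates to a Q-function), the sketch below trains an agent on a hypothetical prosumer battery toy problem. The state layout, action set, reward, and dynamics are all invented for illustration and do not come from the paper; a linear Q-function stands in for the deep network.

```python
import random
import numpy as np

STATE_DIM = 3   # hypothetical state: [battery level, solar output, price]
N_ACTIONS = 3   # hypothetical actions: 0 = discharge, 1 = idle, 2 = charge

class ReplayBuffer:
    """Fixed-capacity experience-replay buffer, a core DQN ingredient."""
    def __init__(self, capacity=1000):
        self.capacity = capacity
        self.data = []

    def push(self, transition):
        if len(self.data) >= self.capacity:
            self.data.pop(0)
        self.data.append(transition)

    def sample(self, batch_size):
        return random.sample(self.data, min(batch_size, len(self.data)))

class LinearDQN:
    """Q(s, a) = W[a] . s -- a linear stand-in for the deep Q-network."""
    def __init__(self, lr=0.01, gamma=0.95, epsilon=0.1):
        self.W = np.zeros((N_ACTIONS, STATE_DIM))
        self.lr, self.gamma, self.epsilon = lr, gamma, epsilon

    def q_values(self, s):
        return self.W @ s

    def act(self, s):
        # Epsilon-greedy exploration over the learned Q-values.
        if random.random() < self.epsilon:
            return random.randrange(N_ACTIONS)
        return int(np.argmax(self.q_values(s)))

    def update(self, batch):
        # Semi-gradient TD update toward r + gamma * max_a' Q(s', a').
        for s, a, r, s_next, done in batch:
            target = r if done else r + self.gamma * np.max(self.q_values(s_next))
            td_error = target - self.q_values(s)[a]
            self.W[a] += self.lr * td_error * s

def toy_step(s, a):
    """Invented dynamics: reward discharging a non-empty battery when price is high."""
    battery, solar, price = s
    reward = price if (a == 0 and battery > 0.2) else -0.01
    delta = 0.1 if a == 2 else (-0.1 if a == 0 else 0.0)
    battery = float(np.clip(battery + delta, 0.0, 1.0))
    s_next = np.array([battery, random.random(), random.random()])
    return s_next, reward

# Minimal interaction loop: act, store the transition, learn from a replay batch.
agent, buffer = LinearDQN(), ReplayBuffer()
s = np.array([0.5, 0.8, 0.3])
for _ in range(200):
    a = agent.act(s)
    s_next, r = toy_step(s, a)
    buffer.push((s, a, r, s_next, False))
    agent.update(buffer.sample(32))
    s = s_next
```

In the paper's setting, each prosumer and the distributor would run such an agent (with a multi-layer network instead of the linear model), with weather-dependent solar generation entering through the observed state.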