Training A Deep Reinforcement Learning Agent for Microgrid Control using PSCAD Environment
A. Soofi, Reza Bayani, Mehrdad Yazdanibiouki, Saeed D. Manshadi
2023 IEEE PES Grid Edge Technologies Conference & Exposition (Grid Edge), published April 10, 2023. DOI: 10.1109/GridEdge54130.2023.10102740
The accessibility of real-time operational data, along with breakthroughs in processing power, has promoted the use of Machine Learning (ML) applications in today's power systems. Applications of ML in the electricity grid include the prediction of device failures, meteorological conditions, system outages, and demand. In this paper, a Reinforcement Learning (RL) method is utilized to design an efficient energy management system for grid-tied Energy Storage Systems (ESS). We implement a Deep Q-Learning (DQL) approach using Artificial Neural Networks (ANN) to design a microgrid controller simulated in the PSCAD environment. The proposed grid-connected controller coordinates the main grid, aggregated loads, renewable generation, and Advanced Energy Storage (AES). To reduce the cost of operating the AES units, the designed controller takes the hourly energy market price into account in addition to the physical characteristics of the system.
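The abstract gives no implementation details, so the sketch below is only an illustrative, generic Deep Q-Learning agent of the kind described: an ANN maps a storage state (state of charge, net load, hourly market price) to charge/idle/discharge actions and is trained with experience replay. The class names, network sizes, state features, and three-action discretization are assumptions rather than the authors' design, and the coupling to the PSCAD simulation (which would supply states and rewards) is not shown.

```python
# Minimal, assumption-based DQL sketch; not the authors' code or the paper's exact model.
import random
from collections import deque

import torch
import torch.nn as nn
import torch.optim as optim

ACTIONS = [-1.0, 0.0, 1.0]  # assumed per-unit storage power: discharge, idle, charge


class QNetwork(nn.Module):
    """ANN approximating Q(s, a) for the three storage actions."""

    def __init__(self, state_dim=3, n_actions=len(ACTIONS), hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_actions),
        )

    def forward(self, x):
        return self.net(x)


class DQLAgent:
    """Epsilon-greedy DQL agent with a replay buffer (target network omitted for brevity)."""

    def __init__(self, gamma=0.99, lr=1e-3, eps=0.1):
        self.q = QNetwork()
        self.opt = optim.Adam(self.q.parameters(), lr=lr)
        self.buffer = deque(maxlen=10_000)  # experience replay memory
        self.gamma, self.eps = gamma, eps

    def act(self, state):
        # state: [state_of_charge, net_load, hourly_price] (assumed features)
        if random.random() < self.eps:
            return random.randrange(len(ACTIONS))
        with torch.no_grad():
            return int(self.q(torch.tensor(state, dtype=torch.float32)).argmax())

    def remember(self, s, a, r, s_next):
        self.buffer.append((s, a, r, s_next))

    def train_step(self, batch_size=32):
        if len(self.buffer) < batch_size:
            return
        batch = random.sample(self.buffer, batch_size)
        s, a, r, s2 = map(lambda x: torch.tensor(x, dtype=torch.float32), zip(*batch))
        # Q-value of the action actually taken in each sampled transition
        q_sa = self.q(s).gather(1, a.long().unsqueeze(1)).squeeze(1)
        with torch.no_grad():
            target = r + self.gamma * self.q(s2).max(dim=1).values
        loss = nn.functional.mse_loss(q_sa, target)
        self.opt.zero_grad()
        loss.backward()
        self.opt.step()
```

In the setting the abstract describes, the reward at each hour would presumably reflect the cost of energy exchanged with the main grid at the prevailing market price, so minimizing operating cost corresponds to maximizing the agent's discounted return.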