Training A Deep Reinforcement Learning Agent for Microgrid Control using PSCAD Environment

A. Soofi, Reza Bayani, Mehrdad Yazdanibiouki, Saeed D. Manshadi
{"title":"Training A Deep Reinforcement Learning Agent for Microgrid Control using PSCAD Environment","authors":"A. Soofi, Reza Bayani, Mehrdad Yazdanibiouki, Saeed D. Manshadi","doi":"10.1109/GridEdge54130.2023.10102740","DOIUrl":null,"url":null,"abstract":"The accessibility of real-time operational data along with breakthroughs in processing power have promoted the use of Machine Learning (ML) applications in current power systems. Prediction of device failures, meteorological data, system outages, and demand are among the applications of ML in the electricity grid. In this paper, a Reinforcement Learning (RL) method is utilized to design an efficient energy management system for grid-tied Energy Storage Systems (ESS). We implement a Deep Q-Learning (DQL) approach using Artificial Neural Networks (ANN) to design a microgrid controller system simulated in the PSCAD environment. The proposed on-grid controller coordinates the main grid, aggregated loads, renewable generations, and Advanced Energy Storage (AES). To reduce the cost of operating AESs, the designed controller takes the hourly energy market price into account in addition to physical system characteristics.","PeriodicalId":377998,"journal":{"name":"2023 IEEE PES Grid Edge Technologies Conference & Exposition (Grid Edge)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-04-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 IEEE PES Grid Edge Technologies Conference & Exposition (Grid Edge)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/GridEdge54130.2023.10102740","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

The accessibility of real-time operational data, along with breakthroughs in processing power, has promoted the use of Machine Learning (ML) applications in today's power systems. Prediction of device failures, meteorological data, system outages, and demand are among the applications of ML in the electricity grid. In this paper, a Reinforcement Learning (RL) method is utilized to design an efficient energy management system for grid-tied Energy Storage Systems (ESS). We implement a Deep Q-Learning (DQL) approach using Artificial Neural Networks (ANN) to design a microgrid controller simulated in the PSCAD environment. The proposed on-grid controller coordinates the main grid, aggregated loads, renewable generation, and Advanced Energy Storage (AES). To reduce the cost of operating AES units, the designed controller takes the hourly energy market price into account in addition to the physical characteristics of the system.
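
The abstract names the key ingredients of such a controller: a Q-network (an ANN) that maps the microgrid state to action values, an exploration policy, and a reward that penalizes operating cost including the hourly energy market price. The sketch below is a minimal illustration of that Deep Q-Learning loop, not the authors' implementation: the state vector (storage state of charge, net load, hourly price), the three-level charge/idle/discharge action set, the toy environment transition, and all hyperparameters are assumptions made for illustration. In the paper, the agent interacts with a PSCAD microgrid simulation rather than this simplified Python model.

```python
# Minimal Deep Q-Learning sketch for grid-tied storage dispatch (illustrative only).
# The state, action set, reward, and toy environment are assumptions; the paper's agent
# is trained against a PSCAD microgrid simulation instead of this simplified model.
import random
from collections import deque

import torch
import torch.nn as nn
import torch.optim as optim

N_ACTIONS = 3   # 0: charge, 1: idle, 2: discharge  (assumed action set)
STATE_DIM = 3   # [state of charge, net load (kW), hourly price ($/kWh)]
GAMMA, EPS, LR = 0.95, 0.1, 1e-3

q_net = nn.Sequential(nn.Linear(STATE_DIM, 64), nn.ReLU(),
                      nn.Linear(64, 64), nn.ReLU(),
                      nn.Linear(64, N_ACTIONS))
optimizer = optim.Adam(q_net.parameters(), lr=LR)
replay = deque(maxlen=10_000)

def toy_step(state, action):
    """Hypothetical one-hour transition: update SoC and return -cost as the reward."""
    soc, net_load, price = state
    power = {0: 50.0, 1: 0.0, 2: -50.0}[action]      # kW drawn by the storage (assumed rating)
    soc = min(max(soc + power / 500.0, 0.0), 1.0)    # 500 kWh nominal capacity (assumed)
    grid_import = max(net_load + power, 0.0)         # storage shifts the grid exchange
    reward = -price * grid_import                    # pay the hourly market price
    next_state = [soc, random.uniform(0.0, 200.0), random.uniform(0.02, 0.30)]
    return next_state, reward

def select_action(state):
    if random.random() < EPS:                        # epsilon-greedy exploration
        return random.randrange(N_ACTIONS)
    with torch.no_grad():
        return q_net(torch.tensor(state)).argmax().item()

state = [0.5, 100.0, 0.10]
for step in range(5_000):
    action = select_action(state)
    next_state, reward = toy_step(state, action)
    replay.append((state, action, reward, next_state))
    state = next_state

    if len(replay) >= 64:                            # one gradient step per interaction
        batch = random.sample(replay, 64)
        s, a, r, s2 = map(list, zip(*batch))
        s, s2 = torch.tensor(s), torch.tensor(s2)
        a = torch.tensor(a).unsqueeze(1)
        r = torch.tensor(r)
        q_sa = q_net(s).gather(1, a).squeeze(1)
        with torch.no_grad():
            target = r + GAMMA * q_net(s2).max(1).values   # no target network, for brevity
        loss = nn.functional.mse_loss(q_sa, target)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

A production version of this loop would typically add a target network, replay warm-up, and state normalization, and would read the state and reward signals from the PSCAD co-simulation interface at each control interval.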