Data Driven Real-Time Dynamic Voltage Control Using Decentralized Execution Multi-Agent Deep Reinforcement Learning

IF 3.3 Q3 ENERGY & FUELS
Yuling Wang; Vijay Vittal
{"title":"Data Driven Real-Time Dynamic Voltage Control Using Decentralized Execution Multi-Agent Deep Reinforcement Learning","authors":"Yuling Wang;Vijay Vittal","doi":"10.1109/OAJPE.2024.3459002","DOIUrl":null,"url":null,"abstract":"In recent years, there has been an increasing need for effective voltage control methods in power systems due to the growing complexity and dynamic nature of practical power grid operations. To enhance the controller’s resilience in addressing communication failures, a dynamic voltage control method employing distributed execution multi-agent deep reinforcement learning(DRL) is proposed. The proposed method follows a centralized training and decentralized execution based approach. Each agent has independent actor neural networks to output generator control commands and critic neural networks that evaluate command performance. Detailed dynamic models are integrated for agent training to effectively capture the system’s dynamic behavior following disturbances. Subsequent to training, each agent possesses the capability to autonomously generate control commands utilizing only local information. Simulation outcomes underscore the efficacy of the distributed execution multi-agent DRL controller, showcasing its capability in not only providing voltage support but also effectively handling communication failures among agents.","PeriodicalId":56187,"journal":{"name":"IEEE Open Access Journal of Power and Energy","volume":null,"pages":null},"PeriodicalIF":3.3000,"publicationDate":"2024-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10679222","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Open Access Journal of Power and Energy","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10679222/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ENERGY & FUELS","Score":null,"Total":0}
引用次数: 0

Abstract

In recent years, the growing complexity and dynamic nature of practical power grid operations have created an increasing need for effective voltage control methods. To enhance the controller’s resilience to communication failures, a dynamic voltage control method employing decentralized-execution multi-agent deep reinforcement learning (DRL) is proposed. The method follows a centralized-training, decentralized-execution approach: each agent has an independent actor neural network that outputs generator control commands and a critic neural network that evaluates command performance. Detailed dynamic models are integrated into agent training to capture the system’s dynamic behavior following disturbances. After training, each agent can autonomously generate control commands using only local information. Simulation results demonstrate the efficacy of the decentralized-execution multi-agent DRL controller, showing that it not only provides voltage support but also effectively handles communication failures among agents.
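To make the centralized-training, decentralized-execution structure concrete, the sketch below shows one plausible per-agent actor/critic layout in PyTorch. It is an illustrative assumption, not the authors' implementation: the network widths, observation and action dimensions, the bounded Tanh output, and the choice of a critic that sees joint observations and actions during training (MADDPG-style) are all placeholders. Only the overall pattern follows the abstract: actors act from local measurements at execution time, while critics are used only during centralized training.

```python
# Minimal sketch of a centralized-training / decentralized-execution
# actor-critic layout. Dimensions and layer sizes are illustrative assumptions.
import torch
import torch.nn as nn


class Actor(nn.Module):
    """Maps one agent's local observation to a bounded generator control command."""

    def __init__(self, obs_dim: int, act_dim: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, act_dim), nn.Tanh(),  # command in [-1, 1]
        )

    def forward(self, obs: torch.Tensor) -> torch.Tensor:
        return self.net(obs)


class Critic(nn.Module):
    """Scores joint observations and actions; used only during centralized training."""

    def __init__(self, joint_obs_dim: int, joint_act_dim: int, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(joint_obs_dim + joint_act_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, joint_obs: torch.Tensor, joint_act: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([joint_obs, joint_act], dim=-1))


def decentralized_execution(actors: list[Actor],
                            local_obs: list[torch.Tensor]) -> list[torch.Tensor]:
    """After training, each agent acts from its own local observation only."""
    with torch.no_grad():
        return [actor(obs) for actor, obs in zip(actors, local_obs)]
```

In this kind of layout, the critics would be trained on joint transition data (for example with a temporal-difference loss), but are discarded at deployment; only the actors and local measurements are needed online, which is what lets control continue when inter-agent communication fails.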
Source journal: IEEE Open Access Journal of Power and Energy
CiteScore: 7.80
Self-citation rate: 5.30%
Articles published: 45
Review time: 10 weeks