{"title":"基于DRL的网络微电网调度双代理框架","authors":"Sujay A. Kaloti;Badrul H. Chowdhury","doi":"10.1109/TSTE.2025.3576153","DOIUrl":null,"url":null,"abstract":"The widely reported increase in the frequency of high impact, low probability extreme weather events pose significant challenges to the resilient operation of electric power systems. This paper explores strategies to enhance operational resilience that addresses the distribution network’s ability to adapt to changing operating conditions. We introduce a novel Dual Agent-Based framework for optimizing the scheduling of distributed energy resources (DERs) within a networked microgrid (N-MG) using the deep reinforcement learning (DRL) paradigm. This framework focuses on minimizing operational and environmental costs during normal operations while enhancing critical load supply indices (CSI) under emergency conditions. Additionally, we introduce a multi-temporal dynamic reward shaping structure along with the incorporation of an error coefficient to enhance the learning process of the agents. To appropriately manage loads during emergencies, we propose a load flexibility classification system that categorizes loads based on its criticality index. The scalability of the proposed approach is demonstrated through running multiple case-studies on a modified IEEE 123-node benchmark distribution network. Furthermore, validation of the method is provided by means of comparisons with two metaheuristic algorithms namely particle swarm optimization (PSO) and genetic algorithm (GA).","PeriodicalId":452,"journal":{"name":"IEEE Transactions on Sustainable Energy","volume":"16 4","pages":"2989-3002"},"PeriodicalIF":10.0000,"publicationDate":"2025-06-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Dual Agent Framework for Scheduling Networked Microgrids Using DRL to Improve Resilience\",\"authors\":\"Sujay A. Kaloti;Badrul H. 
Chowdhury\",\"doi\":\"10.1109/TSTE.2025.3576153\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The widely reported increase in the frequency of high impact, low probability extreme weather events pose significant challenges to the resilient operation of electric power systems. This paper explores strategies to enhance operational resilience that addresses the distribution network’s ability to adapt to changing operating conditions. We introduce a novel Dual Agent-Based framework for optimizing the scheduling of distributed energy resources (DERs) within a networked microgrid (N-MG) using the deep reinforcement learning (DRL) paradigm. This framework focuses on minimizing operational and environmental costs during normal operations while enhancing critical load supply indices (CSI) under emergency conditions. Additionally, we introduce a multi-temporal dynamic reward shaping structure along with the incorporation of an error coefficient to enhance the learning process of the agents. To appropriately manage loads during emergencies, we propose a load flexibility classification system that categorizes loads based on its criticality index. The scalability of the proposed approach is demonstrated through running multiple case-studies on a modified IEEE 123-node benchmark distribution network. 
Furthermore, validation of the method is provided by means of comparisons with two metaheuristic algorithms namely particle swarm optimization (PSO) and genetic algorithm (GA).\",\"PeriodicalId\":452,\"journal\":{\"name\":\"IEEE Transactions on Sustainable Energy\",\"volume\":\"16 4\",\"pages\":\"2989-3002\"},\"PeriodicalIF\":10.0000,\"publicationDate\":\"2025-06-03\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Sustainable Energy\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/11021619/\",\"RegionNum\":1,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENERGY & FUELS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Sustainable Energy","FirstCategoryId":"5","ListUrlMain":"https://ieeexplore.ieee.org/document/11021619/","RegionNum":1,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENERGY & FUELS","Score":null,"Total":0}
Dual Agent Framework for Scheduling Networked Microgrids Using DRL to Improve Resilience
The widely reported increase in the frequency of high-impact, low-probability extreme weather events poses significant challenges to the resilient operation of electric power systems. This paper explores strategies to enhance operational resilience by addressing the distribution network's ability to adapt to changing operating conditions. We introduce a novel dual agent-based framework for optimizing the scheduling of distributed energy resources (DERs) within a networked microgrid (N-MG) using the deep reinforcement learning (DRL) paradigm. This framework focuses on minimizing operational and environmental costs during normal operations while enhancing critical load supply indices (CSI) under emergency conditions. Additionally, we introduce a multi-temporal dynamic reward shaping structure, along with an error coefficient, to enhance the agents' learning process. To appropriately manage loads during emergencies, we propose a load flexibility classification system that categorizes loads based on their criticality indices. The scalability of the proposed approach is demonstrated by running multiple case studies on a modified IEEE 123-node benchmark distribution network. Furthermore, the method is validated through comparisons with two metaheuristic algorithms, namely particle swarm optimization (PSO) and the genetic algorithm (GA).
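The abstract's load flexibility classification, which groups loads by criticality so that the least critical are shed first during emergencies, can be sketched as follows. This is a minimal illustrative sketch only: the paper's actual criticality index definition, tier names, and thresholds are not given in the abstract, so the `Load` class, the three tiers, and the cutoff values below are all assumptions.

```python
# Illustrative sketch of criticality-based load flexibility classification.
# The index scale, tier names, and thresholds are assumed, not taken from
# the paper.

from dataclasses import dataclass

@dataclass
class Load:
    name: str
    demand_kw: float
    criticality_index: float  # assumed normalized to [0, 1]

def classify_load(load: Load) -> str:
    """Map a load to a flexibility tier by its criticality index."""
    if load.criticality_index >= 0.8:
        return "critical"        # must remain supplied during emergencies
    if load.criticality_index >= 0.4:
        return "semi-flexible"   # curtailable when capacity is short
    return "flexible"            # first candidates for shedding

def shed_order(loads: list[Load]) -> list[Load]:
    """Order loads for emergency shedding: least critical first."""
    return sorted(loads, key=lambda l: l.criticality_index)
```

Under such a scheme, an emergency scheduler would walk `shed_order` from the front, dropping flexible loads before semi-flexible ones and preserving critical loads, which is what maximizing a critical load supply index during emergencies implies.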
Journal introduction:
The IEEE Transactions on Sustainable Energy serves as a pivotal platform for sharing groundbreaking research findings on sustainable energy systems, with a focus on their seamless integration into power transmission and/or distribution grids. The journal showcases original research spanning the design, implementation, grid-integration, and control of sustainable energy technologies and systems. Additionally, the Transactions warmly welcomes manuscripts addressing the design, implementation, and evaluation of power systems influenced by sustainable energy systems and devices.