Julen Cestero, Carmine Delle Femine, Kenji S. Muro, Marco Quartulli, Marcello Restelli
Optimizing energy management of smart grid using reinforcement learning aided by surrogate models built using physics-informed neural networks

Applied Energy, Volume 401, Article 126750. Published 2025-09-25. DOI: 10.1016/j.apenergy.2025.126750
URL: https://www.sciencedirect.com/science/article/pii/S0306261925014801
Citations: 0
Abstract
Optimizing energy management in a smart grid setting presents significant challenges, primarily due to the complexity of real-world systems and the intricate interactions among their components. Reinforcement Learning (RL) is gaining prominence as a solution to the Optimal Power Flow (OPF) problem in smart grids. However, RL must interact extensively with a given environment to obtain the optimal policy. This means drawing samples from what is most likely a costly simulator, which can lead to a sample-efficiency problem. In this work, we address this problem by substituting costly smart grid simulators with surrogate models built using Physics-Informed Neural Networks (PINNs), speeding up RL policy training by reaching convergent results in a fraction of the time required by the original environment. Specifically, we tested the performance of our PINN surrogate against other state-of-the-art data-driven surrogates and found that its encoding of the underlying physics makes the PINN surrogate the only method we studied capable of learning a good RL policy, while also requiring no samples from the real simulator. Our work shows that, by employing PINN surrogates, we can improve training speed by 50% compared to training the RL policy without any surrogate model, achieving scores on par with the original simulator more rapidly.
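As a rough illustration of the surrogate-as-environment idea described above (a minimal sketch, not the paper's actual implementation; the environment class, state layout, dynamics, and reward below are all hypothetical), an RL agent can step a learned model in place of the costly simulator:

```python
import numpy as np

class SurrogateGridEnv:
    """Gym-style environment that steps a learned surrogate model
    instead of a costly power-flow simulator. In the paper's setting the
    surrogate would be a trained PINN; here a toy linear model stands in."""

    def __init__(self, surrogate, horizon=24):
        self.surrogate = surrogate  # callable: (state, action) -> next_state
        self.horizon = horizon
        self.t = 0
        self.state = np.zeros(2)

    def reset(self):
        self.t = 0
        self.state = np.array([1.0, 0.0])  # e.g. normalized demand, storage level
        return self.state

    def step(self, action):
        # One cheap surrogate evaluation replaces a full simulator call.
        self.state = self.surrogate(self.state, action)
        self.t += 1
        reward = -abs(self.state[0] - action)  # toy cost: supply/demand mismatch
        done = self.t >= self.horizon
        return self.state, reward, done

def toy_surrogate(state, action):
    """Stand-in dynamics; a real PINN surrogate would be trained on
    simulator data plus a physics-residual loss (e.g. power-balance)."""
    demand, storage = state
    return np.array([0.9 * demand + 0.1, storage + action - demand])

# Roll out a fixed policy against the surrogate environment.
env = SurrogateGridEnv(toy_surrogate, horizon=3)
state = env.reset()
total_reward, done = 0.0, False
while not done:
    state, reward, done = env.step(0.5)
    total_reward += reward
```

Because every `step` is a single forward pass through the learned model, the RL training loop avoids the expensive simulator entirely, which is the source of the reported speed-up.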
About the journal:
Applied Energy serves as a platform for sharing innovations, research, development, and demonstrations in energy conversion, conservation, and sustainable energy systems. The journal covers topics such as optimal energy resource use, environmental pollutant mitigation, and energy process analysis. It welcomes original papers, review articles, technical notes, and letters to the editor. Authors are encouraged to submit manuscripts that bridge the gap between research, development, and implementation. The journal addresses a wide spectrum of topics, including fossil and renewable energy technologies, energy economics, and environmental impacts. Applied Energy also explores modeling and forecasting, conservation strategies, and the social and economic implications of energy policies, including climate change mitigation. It is complemented by the open-access journal Advances in Applied Energy.