{"title":"Adaptive reinforcement learning for energy management – A progressive approach to boost climate resilience and energy flexibility","authors":"Vahid M. Nik , Kavan Javanroodi","doi":"10.1016/j.adapen.2025.100213","DOIUrl":null,"url":null,"abstract":"<div><div>Energy management in urban areas is challenging due to diverse energy users, dynamics environmental conditions, and the added complexity and instability of extreme weather events. We incorporate adaptive reinforcement learning (ARL) into energy management (EM) and introduce a novel approach, called ARLEM. An online, value-based, model-free ARL engine is designed that updates its policy periodically and partially by replacing less favorable actions with those better adapted to evolving environmental conditions. Multiple policy update mechanisms are assessed, varying based on the frequency and length of updates and the action selection criteria. ARLEM is tested to control the energy performance of typical urban blocks in Madrid and Stockholm considering 17 future climate scenarios for 2040–2069. Each block contains 24 buildings of different types and ages. In Madrid, ARLEM is tested for a summer with two heatwaves and in Stockholm for a winter with two cold waves. Three performance indicators are defined to evaluate the effectiveness and resilience of different control approaches during extreme weather events. ARLEM demonstrates an ability to increase climate resilience in the studied blocks by increasing energy flexibility in the network and reducing both average and peak energy demands while affecting indoor thermal comfort marginally. Since the approach does not require any information about the system dynamics, it is easy to cope with the complexities of building systems and technologies, making it an affordable technology to control large urban areas with diverse types of buildings.</div></div>","PeriodicalId":34615,"journal":{"name":"Advances in Applied Energy","volume":"17 ","pages":"Article 100213"},"PeriodicalIF":13.0000,"publicationDate":"2025-01-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Advances in Applied Energy","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2666792425000071","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENERGY & FUELS","Score":null,"Total":0}
Abstract
Energy management in urban areas is challenging due to diverse energy users, dynamic environmental conditions, and the added complexity and instability of extreme weather events. We incorporate adaptive reinforcement learning (ARL) into energy management (EM) and introduce a novel approach, called ARLEM. An online, value-based, model-free ARL engine is designed that updates its policy periodically and partially, replacing less favorable actions with those better adapted to evolving environmental conditions. Multiple policy update mechanisms are assessed, differing in the frequency and length of updates and in the action selection criteria. ARLEM is tested to control the energy performance of typical urban blocks in Madrid and Stockholm under 17 future climate scenarios for 2040–2069. Each block contains 24 buildings of different types and ages. In Madrid, ARLEM is tested for a summer with two heatwaves and in Stockholm for a winter with two cold waves. Three performance indicators are defined to evaluate the effectiveness and resilience of the different control approaches during extreme weather events. ARLEM demonstrates an ability to increase climate resilience in the studied blocks by increasing energy flexibility in the network and reducing both average and peak energy demands, while only marginally affecting indoor thermal comfort. Since the approach does not require any information about the system dynamics, it can easily cope with the complexities of building systems and technologies, making it an affordable technology for controlling large urban areas with diverse building types.
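The abstract describes the ARL engine only at a high level: value-based, model-free, with periodic and partial policy updates that replace the least favorable actions. A minimal illustrative sketch of that idea is given below, assuming a tabular Q-learning agent; the class name, update period, and replacement criterion (`PartialUpdateQAgent`, `update_period`, `replace_fraction`) are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of a value-based, model-free agent with periodic,
# partial policy updates, loosely following the ARLEM description above.
# All names and parameter values are illustrative assumptions.
import numpy as np


class PartialUpdateQAgent:
    def __init__(self, n_states, n_actions, alpha=0.1, gamma=0.95,
                 epsilon=0.1, update_period=168, replace_fraction=0.2):
        self.q = np.zeros((n_states, n_actions))      # action-value table
        self.policy = np.zeros(n_states, dtype=int)   # current policy (one action per state)
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon
        self.update_period = update_period            # e.g. one week of hourly control steps
        self.replace_fraction = replace_fraction      # share of policy actions replaced per update
        self.t = 0

    def act(self, state):
        # epsilon-greedy exploration around the (partially updated) policy
        if np.random.rand() < self.epsilon:
            return np.random.randint(self.q.shape[1])
        return self.policy[state]

    def learn(self, state, action, reward, next_state):
        # standard model-free Q-learning backup: no knowledge of system dynamics required
        td_target = reward + self.gamma * self.q[next_state].max()
        self.q[state, action] += self.alpha * (td_target - self.q[state, action])
        self.t += 1
        if self.t % self.update_period == 0:
            self._partial_policy_update()

    def _partial_policy_update(self):
        # Replace only the least favorable actions: the states where the current
        # policy action lags furthest behind the best available Q-value.
        regret = self.q.max(axis=1) - self.q[np.arange(len(self.policy)), self.policy]
        k = max(1, int(self.replace_fraction * len(self.policy)))
        worst_states = np.argsort(regret)[-k:]
        self.policy[worst_states] = self.q[worst_states].argmax(axis=1)
```

In this reading, "periodic and partial" means the control policy is not re-derived everywhere at every step; only the actions that have become clearly unfavorable under the evolving conditions are swapped out, which keeps the controller stable between updates.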