Battery Scheduling in a Residential Multi-Carrier Energy System Using Reinforcement Learning
Brida V. Mbuwir, M. Kaffash, Geert Deconinck
2018 IEEE International Conference on Communications, Control, and Computing Technologies for Smart Grids (SmartGridComm), October 2018
DOI: 10.1109/SmartGridComm.2018.8587412
Citations: 20
Abstract
Motivated by recent developments in machine learning and artificial intelligence, this work contributes to the application of reinforcement learning in Multi-Carrier Energy Systems (MCESs) to provide flexibility at the residential level. The work addresses the problem of providing flexibility through the operation of a storage device, and flexibility of supply by considering several infrastructures to meet the residential thermal and electrical demand in an MCES with a photovoltaic (PV) installation. The problem of providing flexibility using a battery is formulated as a sequential decision-making problem under uncertainty where, at every time step, the uncertainty is due to the lack of knowledge about future electricity demand and weather-dependent PV production. This paper proposes to address this problem using fitted Q-iteration, a batch Reinforcement Learning (RL) algorithm. The proposed method is tested using data from a typical Belgian residential household. Simulation results show that an optimal interaction of the different energy carriers in the system can be obtained using RL, without providing a detailed model of the MCES.
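For readers unfamiliar with the algorithm named in the abstract, fitted Q-iteration repeatedly regresses a Q-function on Bellman-backup targets computed over a fixed batch of observed transitions. The sketch below is a minimal NumPy-only illustration, not the authors' implementation: the linear feature map, the three-level charge/discharge action set, and the state layout (e.g. state of charge, PV output, demand) are illustrative assumptions; classic fitted Q-iteration typically uses a tree-based regressor instead of linear least squares.

```python
import numpy as np

# Illustrative discrete battery actions: discharge, idle, charge (kW).
ACTIONS = np.array([-1.0, 0.0, 1.0])
GAMMA = 0.95  # discount factor (assumed value)


def _features(S, A):
    # Linear-in-parameters features [1, s, a, s*a]; a simplifying
    # assumption so the Q-function can be fitted with least squares.
    A = A.reshape(-1, 1)
    return np.hstack([np.ones((len(S), 1)), S, A, S * A])


def fitted_q_iteration(transitions, n_iterations=20):
    """transitions: list of (state, action, reward, next_state) tuples,
    where state is a feature vector (e.g. SoC, PV production, demand)."""
    S = np.array([t[0] for t in transitions])
    A = np.array([t[1] for t in transitions])
    R = np.array([t[2] for t in transitions])
    S_next = np.array([t[3] for t in transitions])
    X = _features(S, A)

    theta = None
    for _ in range(n_iterations):
        if theta is None:
            y = R  # first iteration: Q_1 approximates the immediate reward
        else:
            # Bellman backup target: r + gamma * max_a' Q_{k-1}(s', a')
            q_next = np.column_stack([
                _features(S_next, np.full(len(S_next), a)) @ theta
                for a in ACTIONS
            ])
            y = R + GAMMA * q_next.max(axis=1)
        # Refit the Q-function on the whole batch (the "fitted" step).
        theta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return theta


def greedy_action(theta, state):
    # Act greedily with respect to the learned Q-function.
    s = state.reshape(1, -1)
    qs = [(_features(s, np.array([a])) @ theta)[0] for a in ACTIONS]
    return ACTIONS[int(np.argmax(qs))]
```

Because the algorithm only needs a batch of logged transitions, it fits the model-free setting the abstract describes: no explicit MCES model is required, only historical (state, action, reward, next state) data.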