Title: Reinforcement learning for optimal energy management of a solar microgrid
Authors: R. Leo, R. S. Milton, S. Sibi
DOI: 10.1109/GHTC-SAS.2014.6967580
Published in: 2014 IEEE Global Humanitarian Technology Conference - South Asia Satellite (GHTC-SAS)
Publication date: 2014-12-01
Citations: 37
Abstract
In this optimization-based control approach to solar microgrid energy management, the consumer acts as an agent that continuously interacts with the environment and learns to take optimal actions autonomously to reduce the power drawn from the grid. Learning is built directly into the consumer's behaviour, so the consumer can decide and act in its own interest to achieve optimal scheduling, evolving through interaction with the influencing variables of the environment. We consider a grid-connected solar microgrid comprising a local consumer, a renewable generator (a solar photovoltaic system), and a storage facility (a battery). A model-free reinforcement learning algorithm, namely three-step-ahead Q-learning, is used to optimize the battery schedule in a dynamic environment of load and available solar power; the measured solar power and load are the inputs to the learning algorithm. By increasing the utilization of the battery and the solar generator, optimal performance of the solar microgrid is achieved. Simulation results using real numerical data are presented as a reliability test of the system. The uncertainties in the solar power and the load are taken into account in the proposed control framework.
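The abstract does not specify the details of the authors' three-step-ahead variant, so the following is only a rough illustrative sketch of the general idea: tabular one-step Q-learning applied to a toy battery-scheduling problem. The solar and load profiles, the 1 kW charge/discharge step, the battery's five discrete state-of-charge levels, and the reward (negative grid draw) are all hypothetical stand-ins, not the paper's model or data.

```python
import random

random.seed(0)

# Hypothetical toy setup: battery state of charge discretised into 5 levels;
# actions are charge (divert solar into the battery), discharge (offset load), idle.
LEVELS = 5
ACTIONS = ["charge", "discharge", "idle"]

# Synthetic 12-hour solar and load profiles (kW) standing in for real data.
solar = [0, 0, 1, 3, 4, 4, 3, 1, 0, 0, 0, 0]
load  = [2, 2, 2, 3, 3, 2, 2, 3, 4, 4, 3, 2]

# One Q-table entry per (state of charge, hour) pair.
Q = {(s, t): {a: 0.0 for a in ACTIONS}
     for s in range(LEVELS) for t in range(len(load))}
alpha, gamma, eps = 0.1, 0.95, 0.1

def step(soc, t, action):
    """Apply an action; the reward is the negative power drawn from the grid."""
    s, d = solar[t], load[t]
    if action == "charge" and soc < LEVELS - 1 and s >= 1:
        soc += 1
        grid = max(0, d - (s - 1))   # 1 kW of solar diverted into the battery
    elif action == "discharge" and soc > 0:
        soc -= 1
        grid = max(0, d - s - 1)     # battery contributes 1 kW toward the load
    else:
        grid = max(0, d - s)
    return soc, -grid

# Epsilon-greedy Q-learning over repeated days.
for episode in range(2000):
    soc = 2
    for t in range(len(load)):
        if random.random() < eps:
            a = random.choice(ACTIONS)
        else:
            a = max(Q[(soc, t)], key=Q[(soc, t)].get)
        soc2, r = step(soc, t, a)
        nxt = 0.0 if t == len(load) - 1 else max(Q[(soc2, t + 1)].values())
        Q[(soc, t)][a] += alpha * (r + gamma * nxt - Q[(soc, t)][a])
        soc = soc2

# Evaluate the greedy learned policy: total energy drawn from the grid.
soc, total_grid = 2, 0
for t in range(len(load)):
    a = max(Q[(soc, t)], key=Q[(soc, t)].get)
    soc, r = step(soc, t, a)
    total_grid += -r
print(total_grid)
```

With no battery at all, this toy scenario would draw 20 kWh from the grid over the day; the learned policy shifts solar surplus into the evening hours and draws less, mirroring the paper's goal of reducing grid consumption through learned battery scheduling.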