Prior-Information Enhanced Reinforcement Learning for Energy Management Systems

Théo Zangato, A. Osmani, Pegah Alizadeh
*AI, Machine Learning and Applications*, vol. 83, no. 1-2, published 2024-01-27. DOI: 10.5121/csit.2024.140207

Abstract

Amidst increasing energy demands and growing environmental concerns, the promotion of sustainable and energy-efficient practices has become imperative. This paper introduces a reinforcement learning-based technique for optimizing energy consumption and its associated costs, with a focus on energy management systems. A three-step approach, combining RL with prior knowledge, is presented for the efficient management of charging cycles in energy storage units within buildings. A unique strategy is adopted: clustering building load curves to discern typical energy consumption patterns, embedding domain knowledge into the learning algorithm to refine the agent's action space, and predicting future observations to make real-time decisions. We showcase the effectiveness of our method using real-world data. It enables controlled exploration and efficient training of Energy Management System (EMS) agents. Compared to the benchmark, our model reduces energy costs by up to 15%, cuts down consumption during peak periods, and demonstrates adaptability across various building consumption profiles.
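Two of the three steps described above can be sketched in simplified form: clustering daily load curves to recover typical consumption profiles, and using prior knowledge to restrict the agent's action space (for example, forbidding charging during peak hours). This is an illustrative sketch, not the paper's implementation; the helper names, the synthetic load curves, and the specific masking rule are assumptions for demonstration only.

```python
import numpy as np

def kmeans(curves, k, iters=50, seed=0):
    """Minimal k-means over daily load curves (rows = days, cols = hours)."""
    rng = np.random.default_rng(seed)
    centers = curves[rng.choice(len(curves), k, replace=False)]
    for _ in range(iters):
        # Assign each day to its nearest cluster center.
        dists = ((curves[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = np.argmin(dists, axis=1)
        # Recompute centers as the mean curve of each cluster.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = curves[labels == j].mean(axis=0)
    return labels, centers

def allowed_actions(hour, peak_hours,
                    actions=("charge", "idle", "discharge")):
    """Prior knowledge as an action mask: never charge during peak hours."""
    if hour in peak_hours:
        return tuple(a for a in actions if a != "charge")
    return actions

# Synthetic daily load curves with two patterns (morning vs. evening peak),
# standing in for the real building data used in the paper.
rng = np.random.default_rng(1)
hours = np.arange(24)
morning = 1.0 + np.exp(-0.5 * ((hours - 8) / 2.0) ** 2)
evening = 1.0 + np.exp(-0.5 * ((hours - 19) / 2.0) ** 2)
curves = np.vstack(
    [morning + 0.05 * rng.standard_normal(24) for _ in range(10)]
    + [evening + 0.05 * rng.standard_normal(24) for _ in range(10)])

labels, centers = kmeans(curves, k=2)
print(allowed_actions(hour=19, peak_hours={17, 18, 19, 20}))
print(allowed_actions(hour=3, peak_hours={17, 18, 19, 20}))
```

In a full EMS agent, the cluster assignment would condition the policy (or select a per-profile policy), and the action mask would be applied at every decision step before action selection, which is what keeps exploration controlled.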