Optimizing business strategies for carbon energy management in buildings: a machine learning approach in economics and management

IF 5.5 · CAS Tier 3 (Materials Science) · JCR Q2 (Chemistry, Multidisciplinary)
Hong Zhang, Teeb Basim Abbas, Yousef Zandi, Alireza Sadighi Agdas, Zahra Sadighi Agdas, Meldi Suhatril, Emad Toghroli, Awad A. Ibraheem, Anas A. Salameh, Hakim AL Garalleh, Hamid Assilzadeh
{"title":"优化建筑碳能源管理的商业策略:经济学和管理学中的机器学习方法","authors":"Hong Zhang,&nbsp;Teeb Basim Abbas,&nbsp;Yousef Zandi,&nbsp;Alireza Sadighi Agdas,&nbsp;Zahra Sadighi Agdas,&nbsp;Meldi Suhatril,&nbsp;Emad Toghroli,&nbsp;Awad A. Ibraheem,&nbsp;Anas A. Salameh,&nbsp;Hakim AL Garalleh,&nbsp;Hamid Assilzadeh","doi":"10.1007/s42823-024-00801-6","DOIUrl":null,"url":null,"abstract":"<div><p>Optimizing business strategies for energy through machine learning involves using predictive analytics for accurate energy demand and price forecasting, enhancing operational efficiency through resource optimization and predictive maintenance, and optimizing renewable energy integration into the energy grid. This approach maximizes production, reduces costs, and ensures stability in energy supply. The novelty of integrating deep reinforcement learning (DRL) in energy management lies in its ability to adapt and optimize operational strategies in real-time, autonomously leveraging advanced machine learning techniques to handle dynamic and complex energy environments. The study’s outcomes demonstrate the effectiveness of DRL in optimizing energy management strategies. Statistical validity tests revealed shallow error values [MAE: 1.056 × 10<sup>(−13)</sup> and RMSE: 1.253 × 10<sup>(−13)</sup>], indicating strong predictive accuracy and model robustness. Sensitivity analysis showed that heating and cooling energy consumption variations significantly impact total energy consumption, with predicted changes ranging from 734.66 to 835.46 units. Monte Carlo simulations revealed a mean total energy consumption of 850 units with a standard deviation of 50 units, underscoring the model’s robustness under various stochastic scenarios. Another significant result of the economic impact analysis was the comparison of different operational strategies. The analysis indicated that scenario 1 (high operational costs) and scenario 2 (lower operational costs) both resulted in profits of $70,000, despite differences in operational costs and revenues. However, scenario 3 (optimized strategy) demonstrated superior financial performance with a profit of $78,500. This highlights the importance of strategic operational improvements and suggests that efficiency optimization can significantly enhance profitability. In addition, the DRL-enhanced strategies showed a marked improvement in forecasting and managing demand fluctuations, leading to better resource allocation and reduced energy wastage. Integrating DRL improves operational efficiency and supports long-term financial viability, positioning energy systems for a more sustainable future.</p></div>","PeriodicalId":506,"journal":{"name":"Carbon Letters","volume":"35 2","pages":"607 - 621"},"PeriodicalIF":5.5000,"publicationDate":"2024-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Optimizing business strategies for carbon energy management in buildings: a machine learning approach in economics and management\",\"authors\":\"Hong Zhang,&nbsp;Teeb Basim Abbas,&nbsp;Yousef Zandi,&nbsp;Alireza Sadighi Agdas,&nbsp;Zahra Sadighi Agdas,&nbsp;Meldi Suhatril,&nbsp;Emad Toghroli,&nbsp;Awad A. Ibraheem,&nbsp;Anas A. 
Salameh,&nbsp;Hakim AL Garalleh,&nbsp;Hamid Assilzadeh\",\"doi\":\"10.1007/s42823-024-00801-6\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Optimizing business strategies for energy through machine learning involves using predictive analytics for accurate energy demand and price forecasting, enhancing operational efficiency through resource optimization and predictive maintenance, and optimizing renewable energy integration into the energy grid. This approach maximizes production, reduces costs, and ensures stability in energy supply. The novelty of integrating deep reinforcement learning (DRL) in energy management lies in its ability to adapt and optimize operational strategies in real-time, autonomously leveraging advanced machine learning techniques to handle dynamic and complex energy environments. The study’s outcomes demonstrate the effectiveness of DRL in optimizing energy management strategies. Statistical validity tests revealed shallow error values [MAE: 1.056 × 10<sup>(−13)</sup> and RMSE: 1.253 × 10<sup>(−13)</sup>], indicating strong predictive accuracy and model robustness. Sensitivity analysis showed that heating and cooling energy consumption variations significantly impact total energy consumption, with predicted changes ranging from 734.66 to 835.46 units. Monte Carlo simulations revealed a mean total energy consumption of 850 units with a standard deviation of 50 units, underscoring the model’s robustness under various stochastic scenarios. Another significant result of the economic impact analysis was the comparison of different operational strategies. The analysis indicated that scenario 1 (high operational costs) and scenario 2 (lower operational costs) both resulted in profits of $70,000, despite differences in operational costs and revenues. However, scenario 3 (optimized strategy) demonstrated superior financial performance with a profit of $78,500. This highlights the importance of strategic operational improvements and suggests that efficiency optimization can significantly enhance profitability. In addition, the DRL-enhanced strategies showed a marked improvement in forecasting and managing demand fluctuations, leading to better resource allocation and reduced energy wastage. 
Integrating DRL improves operational efficiency and supports long-term financial viability, positioning energy systems for a more sustainable future.</p></div>\",\"PeriodicalId\":506,\"journal\":{\"name\":\"Carbon Letters\",\"volume\":\"35 2\",\"pages\":\"607 - 621\"},\"PeriodicalIF\":5.5000,\"publicationDate\":\"2024-10-27\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Carbon Letters\",\"FirstCategoryId\":\"88\",\"ListUrlMain\":\"https://link.springer.com/article/10.1007/s42823-024-00801-6\",\"RegionNum\":3,\"RegionCategory\":\"材料科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"CHEMISTRY, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Carbon Letters","FirstCategoryId":"88","ListUrlMain":"https://link.springer.com/article/10.1007/s42823-024-00801-6","RegionNum":3,"RegionCategory":"材料科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"CHEMISTRY, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0

Abstract

Optimizing business strategies for energy through machine learning involves using predictive analytics for accurate energy demand and price forecasting, enhancing operational efficiency through resource optimization and predictive maintenance, and optimizing the integration of renewable energy into the grid. This approach maximizes production, reduces costs, and ensures stability in energy supply. The novelty of integrating deep reinforcement learning (DRL) in energy management lies in its ability to adapt and optimize operational strategies in real time, autonomously leveraging advanced machine learning techniques to handle dynamic and complex energy environments. The study’s outcomes demonstrate the effectiveness of DRL in optimizing energy management strategies. Statistical validity tests revealed very low error values (MAE: 1.056 × 10^(−13); RMSE: 1.253 × 10^(−13)), indicating strong predictive accuracy and model robustness. Sensitivity analysis showed that variations in heating and cooling energy consumption significantly impact total energy consumption, with predicted changes ranging from 734.66 to 835.46 units. Monte Carlo simulations revealed a mean total energy consumption of 850 units with a standard deviation of 50 units, underscoring the model’s robustness under various stochastic scenarios. Another significant result of the economic impact analysis was the comparison of different operational strategies: scenario 1 (high operational costs) and scenario 2 (lower operational costs) both resulted in profits of $70,000, despite differences in operational costs and revenues, whereas scenario 3 (optimized strategy) demonstrated superior financial performance with a profit of $78,500. This highlights the importance of strategic operational improvements and suggests that efficiency optimization can significantly enhance profitability. In addition, the DRL-enhanced strategies showed a marked improvement in forecasting and managing demand fluctuations, leading to better resource allocation and reduced energy wastage. Integrating DRL improves operational efficiency and supports long-term financial viability, positioning energy systems for a more sustainable future.
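The abstract reports MAE and RMSE as its validity metrics. As a minimal sketch of how such figures are typically computed (this is not the authors' code, and the arrays below are purely hypothetical placeholders for energy-consumption data):

```python
import numpy as np

def mae(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean absolute error: average magnitude of prediction errors."""
    return float(np.mean(np.abs(y_true - y_pred)))

def rmse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Root mean squared error: penalizes large deviations more heavily."""
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

# Hypothetical hourly energy-consumption values and model predictions.
y_true = np.array([120.0, 135.5, 150.2, 142.8, 128.4])
y_pred = np.array([119.7, 136.0, 149.8, 143.1, 128.9])

print(f"MAE:  {mae(y_true, y_pred):.4f}")
print(f"RMSE: {rmse(y_true, y_pred):.4f}")
```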

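The Monte Carlo summary (mean total consumption of 850 units, standard deviation of 50) and the three-scenario profit comparison can likewise be reproduced in outline. The sketch below is illustrative only: the sampling distribution and the revenue/cost pairs are invented, chosen solely so that the printed profits match the figures quoted in the abstract ($70,000 for scenarios 1 and 2, $78,500 for the optimized scenario); the actual inputs are not given on this page.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Monte Carlo sketch: sample total energy consumption under stochastic inputs.
# The normal distribution and its parameters are assumptions for illustration.
samples = rng.normal(loc=850.0, scale=50.0, size=10_000)
print(f"Mean consumption: {samples.mean():.1f} units, std: {samples.std(ddof=1):.1f} units")

# Scenario comparison: profit = revenue - operational cost.
# Revenue/cost pairs are hypothetical, chosen to reproduce the quoted profits.
scenarios = {
    "Scenario 1 (high operational costs)":  {"revenue": 170_000, "cost": 100_000},
    "Scenario 2 (lower operational costs)": {"revenue": 150_000, "cost": 80_000},
    "Scenario 3 (optimized strategy)":      {"revenue": 158_500, "cost": 80_000},
}
for name, s in scenarios.items():
    print(f"{name}: profit = ${s['revenue'] - s['cost']:,}")
```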

Source journal
Carbon Letters (Chemistry, Multidisciplinary; Materials Science, Multidisciplinary)
CiteScore: 7.30
Self-citation rate: 20.00%
Annual articles: 118
Aims and scope: Carbon Letters aims to be a comprehensive journal with complete coverage of carbon materials and carbon-rich molecules. These materials range from, but are not limited to, diamond and graphite through chars, semicokes, mesophase substances, carbon fibers, carbon nanotubes, graphenes, carbon blacks, activated carbons, pyrolytic carbons, glass-like carbons, etc. Papers on the secondary production of new carbon and composite materials from the above-mentioned various carbons are within the scope of the journal. Papers on organic substances, including coals, will be considered only if the research has close relation to the resulting carbon materials. Carbon Letters also seeks to keep abreast of new developments in its specialist fields and to unite in finding alternative energy solutions to current issues such as the greenhouse effect and the depletion of the ozone layer. Renewable energy basics, energy storage and conversion, solar energy, wind energy, water energy, nuclear energy, biomass energy, hydrogen production technology, and other clean energy technologies are also within the scope of the journal. Carbon Letters invites original reports of fundamental research in all branches of the theory and practice of carbon science and technology.