Transformer-Based Forecasting for Sustainable Energy Consumption Toward Improving Socioeconomic Living: AI-Enabled Energy Consumption Forecasting

Gautham Sreekumar, John Paul Martin, S. Raghavan, Christina Terese Joseph, S. P. Raja
Journal: IEEE Systems, Man, and Cybernetics Magazine, vol. 1105, pp. 52-60
DOI: 10.1109/MSMC.2023.3334483
Publication date: 2024-04-01
Publication type: Journal Article
Citations: 0

Abstract

Smart energy management encompasses energy consumption prediction and energy data analytics. Energy consumption prediction, or electric load forecasting, has traditionally leveraged autoregressive and moving-average models; recently, data-driven models for energy consumption prediction have attracted considerable interest. In this article, a self-attention-based Transformer model is proposed. The deep-learning model captures long-term dependencies in the data sequence and can be used for long-term prediction. The proposed model is compared with autoregressive integrated moving average (ARIMA) and long short-term memory network (LSTM) models. The models were applied to load consumption data from a house located in Sceaux, Paris, France, with prediction windows of 24, 100, and 200 h. Performance was evaluated using the mean absolute percentage error (MAPE) and root-mean-square error (RMSE) as metrics. The Transformer and LSTM models performed significantly better than ARIMA. Although Transformer and LSTM performed on par, the load forecast produced by the Transformer was closer to the preceding data in the dataset, indicating the model's better fidelity. Since the Transformer model supports transfer learning, it can serve as a pretrained model for other time-series datasets and can therefore potentially be applied in other prediction scenarios involving time-series data.
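The self-attention mechanism the abstract refers to lets each time step in the load sequence attend to every other step, which is how the Transformer captures long-term dependencies. A minimal pure-Python sketch of scaled dot-product attention (the core operation; the toy query/key/value vectors are hypothetical, not from the article):

```python
import math

def scaled_dot_product_attention(queries, keys, values):
    """Scaled dot-product attention over plain Python lists.

    queries, keys, values: lists of d-dimensional vectors (lists of floats).
    Each query attends to all keys; outputs are softmax-weighted sums of values.
    """
    d = len(keys[0])
    outputs = []
    for q in queries:
        # Similarity of this query to every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in keys]
        # Numerically stable softmax over the scores.
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        total = sum(exps)
        weights = [e / total for e in exps]
        # Weighted combination of the value vectors.
        outputs.append([
            sum(w * v[j] for w, v in zip(weights, values))
            for j in range(len(values[0]))
        ])
    return outputs

# A query aligned with the first key draws most of its output from the first value.
out = scaled_dot_product_attention(
    queries=[[1.0, 0.0]],
    keys=[[1.0, 0.0], [0.0, 1.0]],
    values=[[10.0, 0.0], [0.0, 10.0]],
)
```

In a full Transformer this operation is applied with learned query/key/value projections and multiple heads; the sketch shows only the attention computation itself.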
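The two evaluation metrics the article uses, MAPE and RMSE, can be sketched directly from their standard definitions (the sample load values below are hypothetical, purely for illustration):

```python
import math

def mape(actual, predicted):
    """Mean absolute percentage error, in percent.

    Assumes no actual value is zero, since each error is divided by it.
    """
    return 100.0 / len(actual) * sum(
        abs((a - p) / a) for a, p in zip(actual, predicted)
    )

def rmse(actual, predicted):
    """Root-mean-square error, in the units of the data."""
    return math.sqrt(
        sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)
    )

# Hypothetical hourly household load (kW) and forecasts.
actual = [1.2, 1.5, 1.1, 1.8]
predicted = [1.0, 1.6, 1.2, 1.7]
print(f"MAPE: {mape(actual, predicted):.2f}%")
print(f"RMSE: {rmse(actual, predicted):.3f} kW")
```

MAPE is scale-free, which makes it convenient for comparing forecasts across households of different sizes, while RMSE penalizes large errors more heavily.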