Gautham Sreekumar, John Paul Martin, S. Raghavan, Christina Terese Joseph, S. P. Raja
IEEE Systems, Man, and Cybernetics Magazine, pp. 52-60, April 2024. DOI: 10.1109/MSMC.2023.3334483
Transformer-Based Forecasting for Sustainable Energy Consumption Toward Improving Socioeconomic Living: AI-Enabled Energy Consumption Forecasting
Smart energy management encompasses energy consumption prediction and energy data analytics. Energy consumption prediction, or electric load forecasting, has traditionally relied on autoregressive and moving-average models, but data-driven models have recently gained considerable traction. In this article, a self-attention-based Transformer model is proposed. The deep-learning model captures long-term dependencies in the data sequence and can therefore be used for long-term prediction. The proposed model is compared with autoregressive integrated moving average (ARIMA) and long short-term memory (LSTM) models on load consumption data from a house located in Sceaux, Paris, France, over prediction windows of 24, 100, and 200 h. Performance was evaluated using the mean absolute percentage error (MAPE) and root-mean-square error (RMSE) as metrics. The Transformer and LSTM models performed significantly better than ARIMA. Although the Transformer and LSTM performed on par, the load forecast produced by the Transformer tracked the preceding data in the dataset more closely, indicating better model efficiency. Since the Transformer model supports transfer learning, it can serve as a pretrained model for other time-series datasets and can therefore potentially be applied to other prediction scenarios involving time-series data.
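The two evaluation metrics named in the abstract can be sketched as follows. This is a minimal illustration of the standard MAPE and RMSE definitions, not the paper's own implementation, which is not given here; the example load values are hypothetical.

```python
import math

def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    n = len(actual)
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / n

def rmse(actual, forecast):
    """Root-mean-square error, in the units of the load series."""
    n = len(actual)
    return math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / n)

# Hypothetical hourly household load (kW) over a short window
actual = [1.2, 1.5, 1.1, 1.8]
forecast = [1.0, 1.6, 1.2, 1.7]
print(f"MAPE = {mape(actual, forecast):.2f}%")
print(f"RMSE = {rmse(actual, forecast):.3f} kW")
```

Lower values of both metrics indicate a forecast closer to the observed load; MAPE is scale-free, while RMSE penalizes large errors more heavily.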