Multi-Granularity Autoformer for long-term deterministic and probabilistic power load forecasting

Yang Yang, Yuchao Gao, Hu Zhou, Jinran Wu, Shangce Gao, You-Gan Wang

Neural Networks, Volume 188, Article 107493. Published 2025-04-24. DOI: 10.1016/j.neunet.2025.107493
Long-term power load forecasting is critical for power system planning but is constrained by intricate temporal patterns. Transformer-based models emphasize modeling long- and short-term dependencies yet encounter limitations from complexity and parameter overhead. This paper introduces a novel Multi-Granularity Autoformer (MG-Autoformer) for long-term load forecasting. The model leverages a Multi-Granularity Auto-Correlation Attention Mechanism (MG-ACAM) to effectively capture fine-grained and coarse-grained temporal dependencies, enabling accurate modeling of short-term fluctuations and long-term trends. To enhance efficiency, a shared query–key (Q–K) mechanism is utilized to identify key temporal patterns across multiple resolutions and reduce model complexity. To address uncertainty in power load forecasting, the model incorporates a quantile loss function, enabling probabilistic predictions while quantifying uncertainty. Extensive experiments on benchmark datasets from Portugal, Australia, the United States, and ISO New England demonstrate the superior performance of the proposed MG-Autoformer in both point and probabilistic long-term power load forecasting tasks.
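To make the core ideas concrete, the sketch below shows an Autoformer-style auto-correlation attention block with a shared Q–K projection, plus one plausible way to run it at several temporal granularities. This is a minimal illustration based only on the abstract: the function names (`autocorrelation_attention`, `multi_granularity`), the pooling-based coarsening, and the averaging fusion are assumptions, not the paper's actual MG-ACAM design.

```python
# Minimal NumPy sketch of auto-correlation attention (Autoformer-style) with a
# shared Q-K projection. The multi-granularity wiring is an assumption drawn
# from the abstract; w_qk, w_v, top_k, and pools are illustrative names/values.
import numpy as np

def autocorrelation_attention(x, w_qk, w_v, top_k=3):
    """x: (L, d_in) series; w_qk, w_v: (d_in, d) projection matrices."""
    L = x.shape[0]
    qk = x @ w_qk   # shared projection: queries and keys coincide, halving parameters
    v = x @ w_v
    # Wiener-Khinchin: per-channel autocorrelation via FFT of the series
    f = np.fft.rfft(qk, axis=0)
    corr = np.fft.irfft(f * np.conj(f), n=L, axis=0)   # (L, d), lags 0..L-1
    score = corr.mean(axis=1)                          # average over channels
    lags = np.argsort(score)[-top_k:]                  # most correlated lags
    w = np.exp(score[lags] - score[lags].max())
    w /= w.sum()                                       # softmax over top-k lags
    # time-delay aggregation: roll values by each selected lag, weight, and sum
    return sum(wi * np.roll(v, -int(tau), axis=0) for wi, tau in zip(w, lags))

def multi_granularity(x, params, pools=(1, 4, 24)):
    """Run the block at several resolutions (e.g., hourly, 4-hourly, daily)."""
    outs = []
    for p, (w_qk, w_v) in zip(pools, params):
        # coarse-grained view: average pooling over non-overlapping windows of p steps
        L = (x.shape[0] // p) * p
        xp = x[:L].reshape(-1, p, x.shape[1]).mean(axis=1)
        o = autocorrelation_attention(xp, w_qk, w_v)
        outs.append(np.repeat(o, p, axis=0))           # upsample back to fine grid
    n = min(o.shape[0] for o in outs)
    return np.mean([o[:n] for o in outs], axis=0)      # simple fusion across granularities
```

The coarse views expose long-term trends (daily or weekly periodicity) to the correlation search, while the fine view retains short-term fluctuations; the shared Q–K projection reduces the attention block to a genuine autocorrelation of one projected series.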
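The probabilistic output described in the abstract rests on the standard quantile (pinball) loss. Below is a minimal sketch of that loss and how training several quantile levels yields prediction intervals; the quantile levels and load values are illustrative, not taken from the paper.

```python
# Minimal sketch of the pinball (quantile) loss used for probabilistic
# forecasting. Predicting several quantiles at once yields prediction
# intervals instead of a single point forecast.
import numpy as np

def pinball_loss(y_true, y_pred, q):
    """Pinball loss for quantile level q in (0, 1).
    y_true: (N,) observations; y_pred: (N,) predictions of the q-th quantile."""
    e = y_true - y_pred
    # Under-prediction is penalized by q, over-prediction by (1 - q)
    return np.mean(np.maximum(q * e, (q - 1.0) * e))

# Example: the 0.1/0.9 pair bounds an 80% prediction interval around the
# 0.5 (median) point forecast. Values below are made up for illustration.
quantiles = [0.1, 0.5, 0.9]
y = np.array([310.0, 305.0, 298.0])            # observed load (MW)
preds = {0.1: np.array([290.0, 288.0, 280.0]),
         0.5: np.array([308.0, 300.0, 295.0]),
         0.9: np.array([325.0, 318.0, 310.0])}
total = sum(pinball_loss(y, preds[q], q) for q in quantiles)
```

Minimizing the summed loss across levels pushes each output toward its target quantile, which is what "enabling probabilistic predictions while quantifying uncertainty" amounts to in practice.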
Journal introduction:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.