A Transformer based approach to electricity load forecasting

Q1 Social Sciences
Jun Wei Chan, Chai Kiat Yeo
{"title":"基于变压器的电力负荷预测方法","authors":"Jun Wei Chan ,&nbsp;Chai Kiat Yeo","doi":"10.1016/j.tej.2024.107370","DOIUrl":null,"url":null,"abstract":"<div><p>In natural language processing (NLP), transformer based models have surpassed recurrent neural networks (RNN) as state of the art, being introduced specifically to address the limitations of RNNs originating from its sequential nature. As a similar sequence modeling problem, transformer methods can be readily adapted for deep learning time series prediction. This paper proposes a sparse transformer based approach for electricity load prediction. The layers of a transformer addresses the shortcomings of RNNs and CNNs by applying the attention mechanism on the entire time series, allowing any data point in the input to influence any location in the output of the layer. This allows transformers to incorporate information from the entire sequence in a single layer. Attention computations can also be parallelized. Thus, transformers can achieve faster speeds, or trade this speed for more layers and increased complexity. In experiments on public datasets, the sparse transformer attained comparable accuracy to an RNN-based SOTA method (Liu et al., 2022) while being up to 5× faster during inference. Moreover, the proposed model is general enough to forecast the load from individual households to city levels as shown in the extensive experiments conducted.</p></div>","PeriodicalId":35642,"journal":{"name":"Electricity Journal","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-02-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A Transformer based approach to electricity load forecasting\",\"authors\":\"Jun Wei Chan ,&nbsp;Chai Kiat Yeo\",\"doi\":\"10.1016/j.tej.2024.107370\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>In natural language processing (NLP), transformer based models have surpassed recurrent neural networks (RNN) as state of the art, being introduced specifically to address the limitations of RNNs originating from its sequential nature. As a similar sequence modeling problem, transformer methods can be readily adapted for deep learning time series prediction. This paper proposes a sparse transformer based approach for electricity load prediction. The layers of a transformer addresses the shortcomings of RNNs and CNNs by applying the attention mechanism on the entire time series, allowing any data point in the input to influence any location in the output of the layer. This allows transformers to incorporate information from the entire sequence in a single layer. Attention computations can also be parallelized. Thus, transformers can achieve faster speeds, or trade this speed for more layers and increased complexity. In experiments on public datasets, the sparse transformer attained comparable accuracy to an RNN-based SOTA method (Liu et al., 2022) while being up to 5× faster during inference. 
Moreover, the proposed model is general enough to forecast the load from individual households to city levels as shown in the extensive experiments conducted.</p></div>\",\"PeriodicalId\":35642,\"journal\":{\"name\":\"Electricity Journal\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-02-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Electricity Journal\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S1040619024000058\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"Social Sciences\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Electricity Journal","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1040619024000058","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"Social Sciences","Score":null,"Total":0}
Citations: 0

Abstract

In natural language processing (NLP), transformer-based models have surpassed recurrent neural networks (RNNs) as the state of the art; they were introduced specifically to address the limitations that arise from the sequential nature of RNNs. Since time series forecasting is a similar sequence modeling problem, transformer methods can be readily adapted to deep learning time series prediction. This paper proposes a sparse-transformer-based approach to electricity load prediction. The layers of a transformer address the shortcomings of RNNs and CNNs by applying the attention mechanism to the entire time series, allowing any data point in the input to influence any location in the layer's output. Transformers can therefore incorporate information from the entire sequence in a single layer, and the attention computations can be parallelized. Thus, transformers can run faster, or trade this speed for more layers and increased complexity. In experiments on public datasets, the sparse transformer attained accuracy comparable to an RNN-based state-of-the-art method (Liu et al., 2022) while being up to 5× faster during inference. Moreover, as shown in the extensive experiments conducted, the proposed model is general enough to forecast load at scales ranging from individual households to entire cities.
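To make the mechanism described above concrete, the sketch below shows a single self-attention block applied to an embedded load series in PyTorch. It is an illustrative stand-in rather than the paper's implementation: it uses standard dense multi-head attention, and the class name, model width and 96-step sequence length are assumptions chosen for the example; the paper's sparse attention would additionally restrict which time steps attend to one another.

```python
# Illustrative sketch only: a dense self-attention block over an embedded load
# series, standing in for the paper's sparse-transformer layers. Names and
# dimensions (d_model, n_steps) are hypothetical, not taken from the paper.
import torch
import torch.nn as nn


class AttentionBlock(nn.Module):
    def __init__(self, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        # Dense multi-head self-attention; a sparse variant would restrict which
        # positions attend to which to reduce cost on long series.
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.ff = nn.Sequential(
            nn.Linear(d_model, 4 * d_model), nn.ReLU(), nn.Linear(4 * d_model, d_model)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_steps, d_model). Every output step is a weighted sum over
        # all input steps, so a single layer already sees the whole sequence, and
        # the per-step attention computations run in parallel, unlike an RNN's
        # step-by-step recurrence.
        attn_out, _ = self.attn(x, x, x)
        x = self.norm1(x + attn_out)
        return self.norm2(x + self.ff(x))


if __name__ == "__main__":
    batch, n_steps, d_model = 8, 96, 64  # e.g. 96 quarter-hourly load readings
    embedded_load = torch.randn(batch, n_steps, d_model)
    print(AttentionBlock(d_model)(embedded_load).shape)  # torch.Size([8, 96, 64])
```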

Source journal
Electricity Journal (Business, Management and Accounting: Business and International Management)
CiteScore: 5.80
Self-citation rate: 0.00%
Articles published: 95
Review time: 31 days
Journal description: The Electricity Journal is the leading journal in electric power policy. The journal deals primarily with fuel diversity and the energy mix needed for optimal energy market performance, and therefore covers the full spectrum of energy, from coal, nuclear, natural gas and oil to renewable energy sources including hydro, solar, geothermal and wind power. Recently, the journal has been publishing in emerging areas including energy storage, microgrid strategies, dynamic pricing, cyber security, climate change, cap and trade, distributed generation, net metering, and transmission and generation market dynamics. The Electricity Journal aims to bring together the most thoughtful and influential thinkers globally from across industry, practitioners, government, policymakers and academia. The Editorial Advisory Board comprises electric industry thought leaders who have served as regulators, consultants, litigators, and market advocates. Their collective experience helps ensure that the most relevant and thought-provoking issues are presented to our readers, and helps navigate the emerging shape and design of the electricity/energy industry.