Transformer training strategies for forecasting multiple load time series

JCR quartile: Q2 (Energy)
Matthias Hertel, Maximilian Beichter, Benedikt Heidrich, Oliver Neumann, Benjamin Schäfer, Ralf Mikut, Veit Hagenmeyer
DOI: 10.1186/s42162-023-00278-z
Published: 2023-10-19 (Journal Article)
Journal: Energy Informatics
Full text: https://link.springer.com/article/10.1186/s42162-023-00278-z
Open-access PDF: https://energyinformatics.springeropen.com/counter/pdf/10.1186/s42162-023-00278-z
Citations: 0

Abstract

In the smart grid of the future, accurate load forecasts on the level of individual clients can help to balance supply and demand locally and to prevent grid outages. While the number of monitored clients will increase with the ongoing smart meter rollout, the amount of data per client will always be limited. We evaluate whether a Transformer load forecasting model benefits from a transfer learning strategy, where a global univariate model is trained on the load time series from multiple clients. In experiments with two datasets containing load time series from several hundred clients, we find that the global training strategy is superior to the multivariate and local training strategies used in related work. On average, the global training strategy results in 21.8% and 12.8% lower forecasting errors than the two other strategies, measured across forecasting horizons from one day to one month into the future. A comparison to linear models, multi-layer perceptrons and LSTMs shows that Transformers are effective for load forecasting when they are trained with the global training strategy.
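The abstract contrasts a global univariate training strategy (one model trained on windows pooled from all clients' load series) with the local strategy (one model per client). The following is a minimal illustrative sketch of that data-pooling difference only — the variable names, window lengths, and synthetic data are assumptions, not the paper's code, and no Transformer is trained here:

```python
import numpy as np

def make_windows(series, context_len, horizon):
    """Slice one client's load series into (input, target) pairs."""
    X, y = [], []
    for t in range(len(series) - context_len - horizon + 1):
        X.append(series[t:t + context_len])
        y.append(series[t + context_len:t + context_len + horizon])
    return np.array(X), np.array(y)

# Hypothetical data: 3 clients, 200 hourly load readings each
rng = np.random.default_rng(0)
clients = [rng.random(200) for _ in range(3)]

context_len, horizon = 24, 24  # one day of history in, one day out

# Local strategy: a separate window set (and model) per client
local_sets = [make_windows(s, context_len, horizon) for s in clients]

# Global strategy: pool all clients' windows into ONE training set,
# so a single univariate model sees every client's data
X_global = np.concatenate([X for X, _ in local_sets])
y_global = np.concatenate([y for _, y in local_sets])
```

The pooled set is simply the concatenation of the per-client window sets; the transfer-learning benefit reported in the paper comes from training one forecaster on this larger, more diverse sample rather than limiting each model to a single client's limited history.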

Source journal: Energy Informatics (Computer Science – Computer Networks and Communications)
CiteScore: 5.50
Self-citation rate: 0.00%
Articles per year: 34
Review time: 5 weeks