Improved Attention-based AE-LSTM for Short-term Power Load Forecasting

Li Qian, Zhao Yuelei, Zheng Haiqing, Sun Xiaoyun, Han Guang, Zhang Jun, Liu Baoan, Guo Kang
{"title":"Improved Attention-based AE-LSTM for Short-term Power Load Forecasting","authors":"Li Qian, Zhao Yuelei, Zheng Haiqing, Sun Xiaoyun, Han Guang, Zhang Jun, Liu Baoan, Guo Kang","doi":"10.1109/YAC57282.2022.10023759","DOIUrl":null,"url":null,"abstract":"Power load forecasting, which can increase system stability, is a more crucial link in the planning of the power system and the construction of sectors associated to power. Random feature changes in short-term load forecasting make the forecasting process more difficult. A more advanced attention-based AE-LSTM (Auto Encoder-Long Short-Term Memory) is a multivariate prediction model that aims to address the drawbacks of feature extraction of various information in numerous time steps and the substantial amount of computing in the training process. AE can obtain different kinds of temporal features to obtain related feature vectors. Since AE-LSTM has limitations in paying dynamic degrees of attention to sub-window characteristics over multi-time step. Through the idea of competitive random search, an improved attention learning algorithm is introduced to optimize AE-LSTM. Using attention sampling to mine relevant information in temporal features, and striving for random search, the model can reduce the phenomenon of local optimization. The experimental results verified the effectiveness of the proposed prediction model.","PeriodicalId":272227,"journal":{"name":"2022 37th Youth Academic Annual Conference of Chinese Association of Automation (YAC)","volume":"8 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-11-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 37th Youth Academic Annual Conference of Chinese Association of Automation (YAC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/YAC57282.2022.10023759","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Power load forecasting, which can improve system stability, is a crucial link in power system planning and in the construction of power-related sectors. Random feature changes make short-term load forecasting particularly difficult. The improved attention-based AE-LSTM (Auto-Encoder Long Short-Term Memory) is a multivariate prediction model that addresses two drawbacks: the difficulty of extracting features from diverse information across many time steps, and the substantial amount of computation in the training process. The auto-encoder extracts different kinds of temporal features and produces the related feature vectors. However, AE-LSTM is limited in its ability to pay dynamic degrees of attention to sub-window characteristics across multiple time steps. Drawing on the idea of competitive random search, an improved attention learning algorithm is therefore introduced to optimize AE-LSTM. By using attention sampling to mine the relevant information in the temporal features and random search to explore the parameter space, the model reduces the tendency to converge to local optima. The experimental results verify the effectiveness of the proposed prediction model.
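The paper does not publish an implementation, so the following is a minimal sketch of the architecture the abstract describes, assuming a PyTorch realization. The layer sizes, the linear attention-scoring form, the single-step load output, and the class name `AttentionAELSTM` are all illustrative assumptions, not the authors' code.

```python
# Hedged sketch: an attention-based AE-LSTM forecaster in PyTorch.
# Layer sizes, the attention form, and the output head are assumptions.
import torch
import torch.nn as nn

class AttentionAELSTM(nn.Module):
    def __init__(self, n_features: int, latent_dim: int = 16, hidden_dim: int = 64):
        super().__init__()
        # Auto-encoder stage: compress each time step's multivariate
        # features into a related feature vector ("AE").
        self.encoder = nn.Sequential(nn.Linear(n_features, latent_dim), nn.ReLU())
        self.decoder = nn.Linear(latent_dim, n_features)
        # LSTM models the temporal dynamics of the encoded features.
        self.lstm = nn.LSTM(latent_dim, hidden_dim, batch_first=True)
        # Attention scores each time step (sub-window) so the model can
        # weight steps dynamically instead of uniformly.
        self.attn = nn.Linear(hidden_dim, 1)
        self.head = nn.Linear(hidden_dim, 1)  # next-step load forecast

    def forward(self, x):
        # x: (batch, time_steps, n_features)
        z = self.encoder(x)                     # (B, T, latent_dim)
        recon = self.decoder(z)                 # reconstruction for the AE loss
        h, _ = self.lstm(z)                     # (B, T, hidden_dim)
        w = torch.softmax(self.attn(h), dim=1)  # (B, T, 1) attention weights
        context = (w * h).sum(dim=1)            # attention-weighted context
        return self.head(context).squeeze(-1), recon

model = AttentionAELSTM(n_features=5)
x = torch.randn(8, 24, 5)                       # 8 windows of 24 time steps
y_hat, recon = model(x)
ae_loss = nn.functional.mse_loss(recon, x)      # add a forecast MSE term in training
```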
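The abstract names competitive random search as the idea behind the improved attention learning algorithm but gives no details. Below is a hedged sketch of the generic competitive form: candidates are sampled around the incumbent each round, and the round's winner replaces the incumbent only if it scores better, which is what helps escape local optima. The population size, noise scale, and stand-in objective are illustrative assumptions.

```python
# Hedged sketch of a generic competitive random search; the paper's exact
# variant is not specified, so all hyperparameters here are assumptions.
import numpy as np

def competitive_random_search(objective, dim, rounds=100, pop=8, sigma=0.1, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    best = rng.standard_normal(dim)          # incumbent parameter vector
    best_score = objective(best)
    for _ in range(rounds):
        # Sample a population of competing candidates around the incumbent.
        candidates = best + sigma * rng.standard_normal((pop, dim))
        scores = np.array([objective(c) for c in candidates])
        winner = scores.argmin()
        if scores[winner] < best_score:      # competition: winner takes over
            best, best_score = candidates[winner], scores[winner]
    return best, best_score

# Example: minimize a multimodal toy objective standing in for validation loss.
theta, loss = competitive_random_search(
    lambda w: float(np.sum(w ** 2) + np.sin(5 * w).sum()), dim=4)
```

In the paper's setting the objective would be the forecasting loss of the AE-LSTM under a given attention parameterization rather than this toy function.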