Li Qian, Zhao Yuelei, Zheng Haiqing, Sun Xiaoyun, Han Guang, Zhang Jun, Liu Baoan, Guo Kang
{"title":"改进的基于注意力的AE-LSTM短期电力负荷预测","authors":"Li Qian, Zhao Yuelei, Zheng Haiqing, Sun Xiaoyun, Han Guang, Zhang Jun, Liu Baoan, Guo Kang","doi":"10.1109/YAC57282.2022.10023759","DOIUrl":null,"url":null,"abstract":"Power load forecasting, which can increase system stability, is a more crucial link in the planning of the power system and the construction of sectors associated to power. Random feature changes in short-term load forecasting make the forecasting process more difficult. A more advanced attention-based AE-LSTM (Auto Encoder-Long Short-Term Memory) is a multivariate prediction model that aims to address the drawbacks of feature extraction of various information in numerous time steps and the substantial amount of computing in the training process. AE can obtain different kinds of temporal features to obtain related feature vectors. Since AE-LSTM has limitations in paying dynamic degrees of attention to sub-window characteristics over multi-time step. Through the idea of competitive random search, an improved attention learning algorithm is introduced to optimize AE-LSTM. Using attention sampling to mine relevant information in temporal features, and striving for random search, the model can reduce the phenomenon of local optimization. 
The experimental results verified the effectiveness of the proposed prediction model.","PeriodicalId":272227,"journal":{"name":"2022 37th Youth Academic Annual Conference of Chinese Association of Automation (YAC)","volume":"8 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-11-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Improved Attention-based AE-LSTM for Short-term Power Load Forecasting\",\"authors\":\"Li Qian, Zhao Yuelei, Zheng Haiqing, Sun Xiaoyun, Han Guang, Zhang Jun, Liu Baoan, Guo Kang\",\"doi\":\"10.1109/YAC57282.2022.10023759\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Power load forecasting, which can increase system stability, is a more crucial link in the planning of the power system and the construction of sectors associated to power. Random feature changes in short-term load forecasting make the forecasting process more difficult. A more advanced attention-based AE-LSTM (Auto Encoder-Long Short-Term Memory) is a multivariate prediction model that aims to address the drawbacks of feature extraction of various information in numerous time steps and the substantial amount of computing in the training process. AE can obtain different kinds of temporal features to obtain related feature vectors. Since AE-LSTM has limitations in paying dynamic degrees of attention to sub-window characteristics over multi-time step. Through the idea of competitive random search, an improved attention learning algorithm is introduced to optimize AE-LSTM. Using attention sampling to mine relevant information in temporal features, and striving for random search, the model can reduce the phenomenon of local optimization. 
The experimental results verified the effectiveness of the proposed prediction model.\",\"PeriodicalId\":272227,\"journal\":{\"name\":\"2022 37th Youth Academic Annual Conference of Chinese Association of Automation (YAC)\",\"volume\":\"8 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-11-19\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 37th Youth Academic Annual Conference of Chinese Association of Automation (YAC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/YAC57282.2022.10023759\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 37th Youth Academic Annual Conference of Chinese Association of Automation (YAC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/YAC57282.2022.10023759","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Improved Attention-based AE-LSTM for Short-term Power Load Forecasting
Power load forecasting, which helps maintain system stability, is a crucial link in power system planning and in the construction of power-related sectors. Random feature changes make short-term load forecasting more difficult. The improved attention-based AE-LSTM (Auto Encoder-Long Short-Term Memory) is a multivariate prediction model that aims to address two drawbacks: extracting features from diverse information across numerous time steps, and the substantial computational cost of training. The AE extracts different kinds of temporal features and produces the related feature vectors. Because AE-LSTM is limited in its ability to pay dynamic degrees of attention to sub-window characteristics over multiple time steps, an improved attention-learning algorithm based on the idea of competitive random search is introduced to optimize AE-LSTM. By using attention sampling to mine relevant information from the temporal features, and by relying on random search, the model reduces the tendency to converge to local optima. The experimental results verified the effectiveness of the proposed prediction model.
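Since only the abstract is reproduced here, the pipeline it outlines (AE-style feature extraction over load sub-windows, soft attention across the resulting feature vectors, and a random-search step to escape local optima) can be sketched very roughly as below. Every concrete choice in this snippet, including the tanh encoder, the softmax attention, the layer sizes, and the search loop, is an illustrative assumption and not the authors' implementation.

```python
import math
import random

random.seed(0)

def encode(window, enc_w):
    # Toy stand-in for the AE's encoder: a single tanh layer that
    # compresses one load sub-window into a short feature vector.
    return [math.tanh(sum(w * x for w, x in zip(row, window))) for row in enc_w]

def attention_pool(features, query):
    # Soft attention over sub-window features: score each feature vector
    # against a query, softmax-normalize the scores, and return the
    # attention-weighted sum as the context vector.
    scores = [sum(q * f for q, f in zip(query, feat)) for feat in features]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    attn = [e / total for e in exps]
    dim = len(features[0])
    context = [sum(a * feat[d] for a, feat in zip(attn, features))
               for d in range(dim)]
    return context, attn

def random_search(features, target, dim=8, trials=200):
    # Crude stand-in for competitive random search: sample candidate query
    # vectors and keep the one whose attended context is closest to a
    # target, rather than following a single gradient path.
    best_q, best_err = None, float("inf")
    for _ in range(trials):
        q = [random.gauss(0, 1) for _ in range(dim)]
        ctx, _ = attention_pool(features, q)
        err = sum((c - t) ** 2 for c, t in zip(ctx, target))
        if err < best_err:
            best_q, best_err = q, err
    return best_q, best_err

# Toy data: 6 sub-windows of 24 hourly load readings, encoded to 8 dims.
windows = [[random.gauss(0, 1) for _ in range(24)] for _ in range(6)]
enc_w = [[random.gauss(0, 0.1) for _ in range(24)] for _ in range(8)]
query = [random.gauss(0, 1) for _ in range(8)]

features = [encode(w, enc_w) for w in windows]
context, attn = attention_pool(features, query)

# Search for a query whose attended context matches an arbitrary target.
best_q, best_err = random_search(features, features[0])
```

The softmax weights let each sub-window contribute a dynamic, data-dependent share of the pooled context, which is the limitation of plain AE-LSTM that the abstract says the attention mechanism addresses; the sampling loop only gestures at how randomized search can avoid committing to a single local optimum.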