Improved Attention-based AE-LSTM for Short-term Power Load Forecasting
Li Qian, Zhao Yuelei, Zheng Haiqing, Sun Xiaoyun, Han Guang, Zhang Jun, Liu Baoan, Guo Kang
2022 37th Youth Academic Annual Conference of Chinese Association of Automation (YAC), 19 November 2022
DOI: 10.1109/YAC57282.2022.10023759
Abstract
Power load forecasting, which helps maintain system stability, is a crucial link in power-system planning and in the development of power-related sectors. Random feature changes make short-term load forecasting particularly difficult. The improved attention-based AE-LSTM (Auto Encoder-Long Short-Term Memory) is a multivariate prediction model that aims to address two drawbacks: extracting features from heterogeneous information across many time steps, and the substantial computational cost of training. The AE extracts different kinds of temporal features and produces the corresponding feature vectors. Because AE-LSTM is limited in its ability to pay dynamically varying attention to sub-window characteristics across multiple time steps, an improved attention-learning algorithm based on the idea of competitive random search is introduced to optimize AE-LSTM. By using attention sampling to mine relevant information from the temporal features and applying competitive random search, the model reduces the tendency to converge to local optima. Experimental results verify the effectiveness of the proposed prediction model.
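To make the described architecture concrete, below is a minimal PyTorch sketch of an attention-based AE-LSTM forecaster: an autoencoder compresses each time step's raw features, an LSTM runs over the latent sequence, and additive attention weights the per-step hidden states before a one-step-ahead load prediction. All layer sizes, the attention form, and the combined reconstruction-plus-forecast loss are illustrative assumptions, not the authors' published implementation; the competitive random search the paper uses to train the attention module is likewise omitted here.

```python
# A minimal sketch of an attention-based AE-LSTM, assuming PyTorch.
# Dimensions and the additive-attention form are hypothetical choices;
# the paper does not publish its exact architecture or code.
import torch
import torch.nn as nn


class AELSTMAttention(nn.Module):
    def __init__(self, n_features=8, latent_dim=4, hidden_dim=32):
        super().__init__()
        # Autoencoder: compress each time step's raw features into a
        # lower-dimensional latent vector (feature-extraction stage).
        self.encoder = nn.Sequential(nn.Linear(n_features, latent_dim), nn.ReLU())
        self.decoder = nn.Linear(latent_dim, n_features)
        # LSTM over the latent sequence.
        self.lstm = nn.LSTM(latent_dim, hidden_dim, batch_first=True)
        # Additive attention scoring over the LSTM outputs at all time steps.
        self.attn = nn.Linear(hidden_dim, 1)
        self.head = nn.Linear(hidden_dim, 1)  # one-step-ahead load value

    def forward(self, x):
        # x: (batch, time, n_features)
        z = self.encoder(x)                      # latent temporal features
        x_rec = self.decoder(z)                  # for the AE reconstruction loss
        h, _ = self.lstm(z)                      # (batch, time, hidden_dim)
        w = torch.softmax(self.attn(h), dim=1)   # attention weight per time step
        context = (w * h).sum(dim=1)             # attention-weighted summary
        return self.head(context), x_rec


model = AELSTMAttention()
x = torch.randn(16, 24, 8)                       # 16 windows of 24 time steps
y_true = torch.randn(16, 1)                      # next-step load targets
y_hat, x_rec = model(x)
# Joint objective: forecast error plus AE reconstruction error.
loss = nn.functional.mse_loss(y_hat, y_true) + nn.functional.mse_loss(x_rec, x)
```

In this sketch the softmax over the time axis lets the model weight sub-windows of the input history unequally, which is the limitation of a plain AE-LSTM that the paper's attention mechanism is meant to address.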