G. Ribeiro, Marcos Cesar Gritti, H. V. Ayala, V. Mariani, L. Coelho
Title: Short-term load forecasting using wavenet ensemble approaches
DOI: 10.1109/IJCNN.2016.7727272
Published in: 2016 International Joint Conference on Neural Networks (IJCNN)
Publication date: 2016-07-24
Citations: 12
Abstract
Time series forecasting plays a key role in many areas of science, finance, and engineering, mainly in estimating the trend or seasonality of an observed variable, serving as a basis for future purchase decisions, choice of design parameters, or maintenance schedules. Artificial Neural Networks (ANNs) have proven suitable for mapping linear or nonlinear functions. However, ANNs implemented in their most simplistic form tend to suffer a loss in overall performance. This work aims to obtain a prediction model for a short-term load problem through a wavenet ensemble, an ANN approach capable of combining the best characteristics of each ensemble component to achieve higher overall performance. We adopted bootstrapping, cross-validation, and input decimation approaches for ensemble construction. For component selection, `constructive' and `no selection' methods were applied. Finally, the combination is performed through simple average, mode, or stacked generalization. The results show that it is possible to improve generalization ability through effective committees, depending on the methods used to construct the ensemble. The total relative improvement achieved with respect to the naive model was over 95%, regardless of the number of sub-wavenets, and for the best single component the relative improvement was 93.91% using five wavenets. We conclude that the most frequent and effective configuration, though not always the one with the lowest MSE (Mean Squared Error), was constructive bagging with simple average.
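The bootstrapping (bagging) construction and simple-average combination described in the abstract can be illustrated with a minimal sketch. This is not the authors' code: ordinary least-squares fits on lagged inputs stand in for the wavenet components, and the toy sinusoidal series, the lag count, and the number of ensemble members (five, matching the best case reported) are all assumptions for illustration only.

```python
import numpy as np

def bootstrap_samples(X, y, n_models, rng):
    """Draw one bootstrap resample (sampling rows with replacement) per member."""
    n = len(X)
    for _ in range(n_models):
        idx = rng.integers(0, n, size=n)
        yield X[idx], y[idx]

def fit_linear(X, y):
    """Least-squares fit on lagged inputs; a stand-in for one wavenet component."""
    A = np.column_stack([X, np.ones(len(X))])  # add bias column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, X):
    return np.column_stack([X, np.ones(len(X))]) @ coef

def ensemble_predict(models, X):
    """Simple-average combination of the component predictions."""
    return np.mean([predict(m, X) for m in models], axis=0)

# Toy "load" series with daily-like periodicity plus noise (hypothetical data)
rng = np.random.default_rng(0)
t = np.arange(200)
series = np.sin(2 * np.pi * t / 24) + 0.1 * rng.standard_normal(200)

# Autoregressive framing: predict y[t] from the two previous lags
X = np.column_stack([series[:-2], series[1:-1]])
y = series[2:]

# Bagging: five components, each trained on its own bootstrap resample
models = [fit_linear(Xb, yb) for Xb, yb in bootstrap_samples(X, y, 5, rng)]
pred = ensemble_predict(models, X)
mse = float(np.mean((pred - y) ** 2))
```

The same skeleton accommodates the paper's other variants: replacing the bootstrap resampler with cross-validation folds or input-column decimation changes only how each member's training set is built, and replacing `np.mean` with a second-stage model trained on the component outputs gives stacked generalization.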