{"title":"A new hybrid prediction model of PM<sub>2.5</sub> concentration based on secondary decomposition and optimized extreme learning machine.","authors":"Hong Yang, Junlin Zhao, Guohui Li","doi":"10.1007/s11356-022-20375-y","DOIUrl":null,"url":null,"abstract":"<p><p>As air pollution worsens, the prediction of PM<sub>2.5</sub> concentration becomes increasingly important for public health. This paper proposes a new hybrid prediction model of PM<sub>2.5</sub> concentration based on complete ensemble empirical mode decomposition with adaptive noise (CEEMDAN), amplitude-aware permutation entropy (AAPE), variational mode decomposition improved by marine predators algorithm (MPA-VMD), and extreme learning machine optimized by chimp optimization algorithm (ChOA-ELM), named CEEMDAN-AAPE-MPA-VMD-ChOA-ELM. Firstly, CEEMDAN is used to decompose the original data, and AAPE is used to quantify the complexity of all IMF components. Secondly, MPA-VMD is used to decompose the IMF component with the maximum AAPE. Lastly, ChOA-ELM is used to predict all IMF components, and all prediction results are reconstructed to obtain the final prediction results. The proposed model combines the advantages of secondary decomposition technique, feature analysis, and optimization algorithm, which can predict PM<sub>2.5</sub> concentration accurately. PM<sub>2.5</sub> concentrations at hourly intervals collected from March 1, 2021, to March 31, 2021, in Shanghai and Shenyang, China, are used for experimental study and DM test. The experimental results in Shanghai show that the RMSE, MAE, MAPE, and R<sup>2</sup> of the proposed model are 1.0676, 0.7685, 0.0181, and 0.9980 respectively, which is better than all comparison models at 90% confidence level. In Shenyang, the RMSE, MAE, MAPE, and R<sup>2</sup> of the proposed model are 1.4399, 1.1258, 0.0389, and 0.9976, respectively, which is better than all comparison models at 95% confidence level.</p>","PeriodicalId":16565,"journal":{"name":"东北农业大学学报","volume":"19 1","pages":"67214-67241"},"PeriodicalIF":0.0000,"publicationDate":"2022-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"东北农业大学学报","FirstCategoryId":"93","ListUrlMain":"https://doi.org/10.1007/s11356-022-20375-y","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2022/5/6 0:00:00","PubModel":"Epub","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
As air pollution worsens, predicting PM2.5 concentration becomes increasingly important for public health. This paper proposes a new hybrid prediction model of PM2.5 concentration based on complete ensemble empirical mode decomposition with adaptive noise (CEEMDAN), amplitude-aware permutation entropy (AAPE), variational mode decomposition improved by the marine predators algorithm (MPA-VMD), and an extreme learning machine optimized by the chimp optimization algorithm (ChOA-ELM), named CEEMDAN-AAPE-MPA-VMD-ChOA-ELM. First, CEEMDAN decomposes the original data, and AAPE quantifies the complexity of each intrinsic mode function (IMF) component. Second, MPA-VMD further decomposes the IMF component with the maximum AAPE. Finally, ChOA-ELM predicts all components, and the component forecasts are reconstructed to obtain the final prediction. The proposed model combines the advantages of secondary decomposition, feature analysis, and optimization algorithms, enabling accurate prediction of PM2.5 concentration. Hourly PM2.5 concentrations collected from March 1 to March 31, 2021, in Shanghai and Shenyang, China, are used for the experimental study and the Diebold-Mariano (DM) test. The experimental results in Shanghai show that the RMSE, MAE, MAPE, and R2 of the proposed model are 1.0676, 0.7685, 0.0181, and 0.9980, respectively, outperforming all comparison models at the 90% confidence level. In Shenyang, the RMSE, MAE, MAPE, and R2 of the proposed model are 1.4399, 1.1258, 0.0389, and 0.9976, respectively, outperforming all comparison models at the 95% confidence level.
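To make the decompose-rank-predict-reconstruct structure concrete, the sketch below is a minimal Python illustration, not the paper's implementation: a plain extreme learning machine stands in for ChOA-ELM, ordinary permutation entropy stands in for AAPE, and a simple moving-average split replaces the CEEMDAN and MPA-VMD decompositions, which are not reproduced here. All function names, parameters, and the synthetic series are assumptions made for illustration.

```python
import numpy as np
from math import factorial

def elm_fit(X, y, n_hidden=64, seed=0):
    """Train a basic extreme learning machine: random hidden weights,
    output weights solved by least squares (no ChOA tuning here)."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
    b = rng.normal(size=n_hidden)                 # random hidden biases
    H = np.tanh(X @ W + b)                        # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ y                  # Moore-Penrose least-squares solution
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

def permutation_entropy(x, m=4, tau=1):
    """Normalized ordinal (permutation) entropy; a simplified stand-in
    for the amplitude-aware variant (AAPE) used in the paper."""
    counts = {}
    n = len(x) - (m - 1) * tau
    for i in range(n):
        pattern = tuple(np.argsort(x[i:i + m * tau:tau]))
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), dtype=float) / n
    return float(-np.sum(p * np.log(p)) / np.log(factorial(m)))

def make_lagged(series, n_lags=6):
    """Turn one component into a supervised (X, y) set of lagged values."""
    X = np.column_stack([series[i:len(series) - n_lags + i] for i in range(n_lags)])
    return X, series[n_lags:]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic hourly PM2.5-like series (744 h, roughly one month); not real data.
    pm25 = 40 + 10 * np.sin(np.linspace(0, 20, 744)) + rng.normal(0, 2, 744)

    # Step 1 (stand-in): the paper uses CEEMDAN; here a moving-average split
    # yields a smooth component and a residual high-frequency component.
    trend = np.convolve(pm25, np.ones(24) / 24, mode="same")
    components = [pm25 - trend, trend]

    # Step 2: rank components by complexity; in the full model the most
    # complex component is re-decomposed with MPA-tuned VMD (omitted here).
    print("component entropies:", [round(permutation_entropy(c), 3) for c in components])

    # Step 3: predict each component with an ELM and sum the forecasts.
    forecast = np.zeros(len(pm25) - 6)
    for c in components:
        X, y = make_lagged(c, n_lags=6)
        W, b, beta = elm_fit(X, y)
        forecast += elm_predict(X, W, b, beta)

    rmse = np.sqrt(np.mean((forecast - pm25[6:]) ** 2))
    print("in-sample RMSE on synthetic data:", round(rmse, 3))
```

In the full model, the moving-average split would be replaced by CEEMDAN, the plain permutation entropy by AAPE, the VMD parameters of the secondary decomposition would be tuned by MPA, and the ELM's random hidden parameters would be tuned by ChOA before the component forecasts are reconstructed.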