Heshan Wang;Zhepeng Wang;Mingyuan Yu;Jing Liang;Jinzhu Peng;Yaonan Wang
{"title":"基于随机分布嵌入的多步超前预测分组向量自回归油藏计算","authors":"Heshan Wang;Zhepeng Wang;Mingyuan Yu;Jing Liang;Jinzhu Peng;Yaonan Wang","doi":"10.1109/TNNLS.2025.3553060","DOIUrl":null,"url":null,"abstract":"As an efficient recurrent neural network (RNN), reservoir computing (RC) has achieved various applications in time-series forecasting. Nevertheless, a poorly explained phenomenon remains as to why the RC and deep RCs succeed in handling time-series prediction despite completely randomized weights. This study tries to generate a grouped vector autoregressive RC (GVARC) time-series forecasting model based on the randomly distributed embedding (RDE) theory. In RDE-GVARC, the deep structures are constructed by multiple GVARCs, which makes the established RDE-GVARC evolve into a deterministic deep RC model with few hyperparameters. Then, the spatial output information of the GVARC is mapped into the future temporal states of an output variable based on RDE equations. The main advantages of the RDE-GVARC can be summarized as follows: 1) RDE-GVARC solves the problems of uncertainty in the weight matrix and difficulty in large-scale parameter selection in the input and hidden layers of deep RCs; 2) the GVARC can avoid massive deep RC hyperparameter design and make the design of deep RC more straightforward and effective; and 3) the proposed RDE-GVARC shows good performance, strong stability, and robustness in several chaotic and real-world sequences for multistep-ahead prediction. 
The simulating results confirm that the RDE-GVARC not only outperforms some recently deep RCs and RNNs, but also maintains the rapidity of RC with an interpretable structure.","PeriodicalId":13303,"journal":{"name":"IEEE transactions on neural networks and learning systems","volume":"36 9","pages":"17265-17279"},"PeriodicalIF":8.9000,"publicationDate":"2025-04-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Grouped Vector Autoregression Reservoir Computing Based on Randomly Distributed Embedding for Multistep-Ahead Prediction\",\"authors\":\"Heshan Wang;Zhepeng Wang;Mingyuan Yu;Jing Liang;Jinzhu Peng;Yaonan Wang\",\"doi\":\"10.1109/TNNLS.2025.3553060\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"As an efficient recurrent neural network (RNN), reservoir computing (RC) has achieved various applications in time-series forecasting. Nevertheless, a poorly explained phenomenon remains as to why the RC and deep RCs succeed in handling time-series prediction despite completely randomized weights. This study tries to generate a grouped vector autoregressive RC (GVARC) time-series forecasting model based on the randomly distributed embedding (RDE) theory. In RDE-GVARC, the deep structures are constructed by multiple GVARCs, which makes the established RDE-GVARC evolve into a deterministic deep RC model with few hyperparameters. Then, the spatial output information of the GVARC is mapped into the future temporal states of an output variable based on RDE equations. 
The main advantages of the RDE-GVARC can be summarized as follows: 1) RDE-GVARC solves the problems of uncertainty in the weight matrix and difficulty in large-scale parameter selection in the input and hidden layers of deep RCs; 2) the GVARC can avoid massive deep RC hyperparameter design and make the design of deep RC more straightforward and effective; and 3) the proposed RDE-GVARC shows good performance, strong stability, and robustness in several chaotic and real-world sequences for multistep-ahead prediction. The simulating results confirm that the RDE-GVARC not only outperforms some recently deep RCs and RNNs, but also maintains the rapidity of RC with an interpretable structure.\",\"PeriodicalId\":13303,\"journal\":{\"name\":\"IEEE transactions on neural networks and learning systems\",\"volume\":\"36 9\",\"pages\":\"17265-17279\"},\"PeriodicalIF\":8.9000,\"publicationDate\":\"2025-04-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE transactions on neural networks and learning systems\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10965819/\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on neural networks and learning systems","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10965819/","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Grouped Vector Autoregression Reservoir Computing Based on Randomly Distributed Embedding for Multistep-Ahead Prediction
As an efficient recurrent neural network (RNN), reservoir computing (RC) has found wide application in time-series forecasting. Nevertheless, it remains poorly explained why RC and deep RCs succeed at time-series prediction despite completely randomized weights. This study develops a grouped vector autoregressive RC (GVARC) time-series forecasting model based on randomly distributed embedding (RDE) theory. In RDE-GVARC, the deep structures are built from multiple GVARCs, so that the resulting model evolves into a deterministic deep RC with few hyperparameters. The spatial output information of the GVARC is then mapped to the future temporal states of an output variable via the RDE equations. The main advantages of RDE-GVARC can be summarized as follows: 1) it resolves the uncertainty of the random weight matrices and the difficulty of large-scale parameter selection in the input and hidden layers of deep RCs; 2) the GVARC avoids extensive deep-RC hyperparameter design, making deep RCs more straightforward and effective to build; and 3) the proposed RDE-GVARC shows good performance, strong stability, and robustness on several chaotic and real-world sequences for multistep-ahead prediction. The simulation results confirm that RDE-GVARC not only outperforms several recent deep RCs and RNNs but also retains the speed of RC while having an interpretable structure.
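The building block the abstract starts from, a reservoir with fixed random weights and a single trained linear readout, can be sketched in plain NumPy. This is an illustrative echo-state-style example of RC with closed-loop multistep-ahead prediction, not the authors' GVARC or RDE method; the toy series, reservoir size, and all hyperparameters here are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy series: a clean sine wave (stand-in for a chaotic benchmark)
t = np.arange(400)
series = np.sin(0.1 * t)

# Reservoir with fixed random weights: these are never trained
n_res = 100
W_in = rng.uniform(-0.5, 0.5, size=n_res)
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # rescale to spectral radius 0.9

def run_reservoir(u):
    """Drive the reservoir with input sequence u; return all internal states."""
    states = np.zeros((len(u), n_res))
    x = np.zeros(n_res)
    for k, uk in enumerate(u):
        x = np.tanh(W_in * uk + W @ x)
        states[k] = x
    return states

# Train only the linear readout by ridge regression (one-step-ahead target)
washout = 50                       # discard initial transient states
X = run_reservoir(series[:-1])[washout:]
y = series[washout + 1:]
lam = 1e-6
W_out = np.linalg.solve(X.T @ X + lam * np.eye(n_res), X.T @ y)

rmse = np.sqrt(np.mean((X @ W_out - y) ** 2))  # one-step training error

# Multistep-ahead prediction: close the loop, feeding forecasts back as input
x_state = run_reservoir(series)[-1]
u_pred = x_state @ W_out           # forecast of the value at t = 400
preds = [u_pred]
for _ in range(19):
    x_state = np.tanh(W_in * u_pred + W @ x_state)
    u_pred = x_state @ W_out
    preds.append(u_pred)
true_future = np.sin(0.1 * np.arange(400, 420))
```

Only `W_out` is learned; the randomness of `W_in` and `W` is exactly the property whose opaque success the paper targets, replacing it with a deterministic grouped-VAR construction.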
About the journal:
The focus of IEEE Transactions on Neural Networks and Learning Systems is to present scholarly articles discussing the theory, design, and applications of neural networks as well as other learning systems. The journal primarily highlights technical and scientific research in this domain.