Grouped Vector Autoregression Reservoir Computing Based on Randomly Distributed Embedding for Multistep-Ahead Prediction

IF 8.9 | Region 1, Computer Science | Q1 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Heshan Wang;Zhepeng Wang;Mingyuan Yu;Jing Liang;Jinzhu Peng;Yaonan Wang
DOI: 10.1109/TNNLS.2025.3553060
Journal: IEEE Transactions on Neural Networks and Learning Systems, vol. 36, no. 9, pp. 17265-17279
Publication date: 2025-04-15 (Journal Article)
URL: https://ieeexplore.ieee.org/document/10965819/
Citations: 0

Abstract

Grouped Vector Autoregression Reservoir Computing Based on Randomly Distributed Embedding for Multistep-Ahead Prediction
As an efficient recurrent neural network (RNN), reservoir computing (RC) has found wide application in time-series forecasting. Nevertheless, it remains poorly explained why RC and deep RCs succeed at time-series prediction despite completely randomized weights. This study develops a grouped vector autoregressive RC (GVARC) time-series forecasting model based on randomly distributed embedding (RDE) theory. In RDE-GVARC, the deep structures are constructed from multiple GVARCs, which makes the established RDE-GVARC evolve into a deterministic deep RC model with few hyperparameters. Then, the spatial output information of the GVARC is mapped into the future temporal states of an output variable via the RDE equations. The main advantages of RDE-GVARC can be summarized as follows: 1) RDE-GVARC resolves the uncertainty of the weight matrices and the difficulty of large-scale parameter selection in the input and hidden layers of deep RCs; 2) GVARC avoids massive deep-RC hyperparameter design and makes the design of deep RCs more straightforward and effective; and 3) the proposed RDE-GVARC shows good performance, strong stability, and robustness on several chaotic and real-world sequences for multistep-ahead prediction. The simulation results confirm that RDE-GVARC not only outperforms some recent deep RCs and RNNs, but also retains the speed of RC while having an interpretable structure.
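The paper's RDE-GVARC implementation is not reproduced here. As background for the abstract's central question (why fully randomized weights work at all), the classical RC baseline it builds on, an echo state network with random, untrained input and recurrent weights and only a trained linear readout, can be sketched as follows. All sizes, the spectral radius, the ridge penalty, and the toy series are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal echo state network (ESN) sketch: classical reservoir computing with
# fixed randomized weights and a ridge-regression readout. This is NOT the
# paper's RDE-GVARC; it only illustrates the baseline the abstract discusses.
rng = np.random.default_rng(0)
n_res, washout = 200, 100

# Random fixed weights (the "completely randomized" part the paper questions).
W_in = rng.uniform(-0.5, 0.5, size=(n_res, 1))
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # scale spectral radius below 1

# Toy task: one-step-ahead prediction of a noisy sine wave.
t = np.arange(1200)
u = np.sin(0.1 * t) + 0.01 * rng.standard_normal(t.size)

# Drive the reservoir and collect its states.
x = np.zeros(n_res)
states = []
for v in u[:-1]:
    x = np.tanh(W_in[:, 0] * v + W @ x)
    states.append(x.copy())
X = np.array(states)[washout:]   # drop the initial transient
y = u[washout + 1:]              # one-step-ahead targets

# Only the linear readout is trained (ridge regression in closed form).
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
pred = X @ W_out
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(round(float(rmse), 4))
```

The readout is the only trained component, so fitting reduces to a single linear solve; everything upstream stays random, which is exactly the design choice whose success RDE-GVARC seeks to replace with a deterministic, few-hyperparameter construction.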
Source journal
IEEE Transactions on Neural Networks and Learning Systems
Categories: COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE; COMPUTER SCIENCE, HARDWARE & ARCHITECTURE
CiteScore: 23.80
Self-citation rate: 9.60%
Annual articles: 2102
Review time: 3-8 weeks
Journal description: The focus of IEEE Transactions on Neural Networks and Learning Systems is to present scholarly articles discussing the theory, design, and applications of neural networks as well as other learning systems. The journal primarily highlights technical and scientific research in this domain.