Title: On memory-augmented gated recurrent unit network
Authors: Maolin Yang, Muyi Li, Guodong Li
Journal: International Journal of Forecasting (JCR Q1, Economics; Impact Factor 6.9)
DOI: 10.1016/j.ijforecast.2024.07.008
Publication date: 2024-08-31
Publication type: Journal Article

Abstract: This paper addresses the challenge of forecasting multivariate long-memory time series. While statistical models such as the autoregressive fractionally integrated moving average (ARFIMA) and hyperbolic generalized autoregressive conditional heteroscedasticity (HYGARCH) models can capture long-memory effects in time series data, they are often limited by dimensionality and parametric specification. Alternatively, recurrent neural networks (RNNs) are popular tools for approximating complex structures in sequential data; however, it has been shown from a statistical perspective that these networks lack a long-memory effect. In this paper, we propose a new network process, the memory-augmented gated recurrent unit (MGRU), which incorporates a fractionally integrated filter into the original GRU structure. We investigate the long-memory properties of the MGRU process and demonstrate its effectiveness at capturing long-range dependence in real applications. Our findings illustrate that the proposed MGRU network outperforms existing models, indicating its potential as a promising tool for long-memory time series forecasting.
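The abstract does not spell out the MGRU equations, but the "fractionally integrated filter" it borrows from the ARFIMA literature has a standard form: the fractional difference operator (1 - L)^d, expanded as a binomial series whose weights decay hyperbolically rather than geometrically. A minimal sketch of that filter (the function names `frac_diff_weights` and `frac_diff` are illustrative, not from the paper, and this is not the authors' MGRU construction):

```python
import numpy as np

def frac_diff_weights(d, n):
    """Coefficients of (1 - L)^d truncated at lag n, via the recursion
    pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def frac_diff(x, d):
    """Apply the fractional-difference filter (1 - L)^d to a 1-D series,
    using all available lags at each time point."""
    x = np.asarray(x, dtype=float)
    w = frac_diff_weights(d, len(x))
    return np.array([w[:t + 1][::-1] @ x[:t + 1] for t in range(len(x))])
```

For d = 1 this reduces to the ordinary first difference; for fractional 0 < d < 0.5 the weights decay like k^(-1-d), so distant lags retain non-negligible influence. This hyperbolic decay is precisely the long-memory behavior that the geometrically decaying gates of a plain GRU cannot reproduce, which motivates injecting such a filter into the GRU recurrence.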
About the journal:
The International Journal of Forecasting is a leading journal in its field that publishes high quality refereed papers. It aims to bridge the gap between theory and practice, making forecasting useful and relevant for decision and policy makers. The journal places strong emphasis on empirical studies, evaluation activities, implementation research, and improving the practice of forecasting. It welcomes various points of view and encourages debate to find solutions to field-related problems. The journal is the official publication of the International Institute of Forecasters (IIF) and is indexed in Sociological Abstracts, Journal of Economic Literature, Statistical Theory and Method Abstracts, INSPEC, Current Contents, UMI Data Courier, RePEc, Academic Journal Guide, CIS, IAOR, and Social Sciences Citation Index.