CDTDNet: A neural network for capturing deep temporal dependencies in time series
Congbing He, Zhenhong Jia, Jie Hu, Fei Shi, Xiaohui Huang
Information Sciences, Volume 706, Article 121995, published 2025-02-20. DOI: 10.1016/j.ins.2025.121995
Citations: 0
Abstract
Current research in time series forecasting remains deficient in extracting temporal dependencies in depth. To address this, this paper proposes a novel deep learning framework that extracts deep temporal dependencies from time series data and effectively fuses them with other time series features. The Cell State Capture Recurrent Unit, a novel recurrent neural network, is used together with a Temporal Convolutional Network to capture the deep temporal dependencies of the data. Historical statistical information is constructed to introduce linearly correlated variables for the model. A novel temporal attention mechanism coordinates the importance of individual time steps, and a coupled attention mechanism improves the decoder's ability to interpret the encoded information. Finally, an autoencoder is employed as a prediction calibrator to improve the accuracy and robustness of the network. Comparisons with baseline methods and state-of-the-art strategies on datasets from four different domains confirm both the effectiveness and the robustness of the proposed predictive network. In addition, the Cell State Capture Recurrent Unit can serve as a benchmark recurrent unit for time series forecasting, rather than forecasting being limited to the Long Short-Term Memory or Gated Recurrent Unit.
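The paper's implementation is not included in this abstract, so the sketch below is only a rough, hypothetical illustration of the architectural pattern the abstract describes: a recurrent unit whose per-step cell states are exposed as features, fused with dilated-convolution (TCN-style) features, and pooled by a simple temporal attention. All class names (CellStateCaptureUnit, TCNBlock, SketchForecaster) and hyperparameters are assumptions, not the authors' CDTDNet; the historical-statistics inputs, coupled attention, and autoencoder calibrator mentioned in the abstract are omitted for brevity.

```python
# Illustrative sketch only; every class and hyperparameter here is an
# assumption made for exposition, not the authors' implementation.
import torch
import torch.nn as nn

class CellStateCaptureUnit(nn.Module):
    """Hypothetical recurrent cell: an LSTM cell whose per-step cell states
    are returned so downstream layers can use them as temporal features."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.cell = nn.LSTMCell(input_size, hidden_size)

    def forward(self, x_seq):                       # x_seq: (batch, time, features)
        b, t, _ = x_seq.shape
        h = x_seq.new_zeros(b, self.cell.hidden_size)
        c = x_seq.new_zeros(b, self.cell.hidden_size)
        cell_states = []
        for step in range(t):
            h, c = self.cell(x_seq[:, step], (h, c))
            cell_states.append(c)
        return torch.stack(cell_states, dim=1)      # (batch, time, hidden)

class TCNBlock(nn.Module):
    """Single dilated causal convolution block, a common TCN building block."""
    def __init__(self, channels, kernel_size=3, dilation=2):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)

    def forward(self, x):                            # x: (batch, channels, time)
        out = self.conv(nn.functional.pad(x, (self.pad, 0)))  # causal left-padding
        return torch.relu(out) + x                   # residual connection

class SketchForecaster(nn.Module):
    """Minimal forecaster: captured cell states and TCN features are fused,
    weighted by a simple temporal attention, then projected to the forecast."""
    def __init__(self, n_features, hidden=64, horizon=1):
        super().__init__()
        self.rnn = CellStateCaptureUnit(n_features, hidden)
        self.proj = nn.Linear(n_features, hidden)
        self.tcn = TCNBlock(hidden)
        self.attn = nn.Linear(hidden * 2, 1)         # per-step attention scores
        self.head = nn.Linear(hidden * 2, horizon)

    def forward(self, x):                            # x: (batch, time, n_features)
        states = self.rnn(x)                         # (batch, time, hidden)
        conv = self.tcn(self.proj(x).transpose(1, 2)).transpose(1, 2)
        fused = torch.cat([states, conv], dim=-1)    # feature fusion
        w = torch.softmax(self.attn(fused), dim=1)   # weight each time step
        context = (w * fused).sum(dim=1)             # attention-pooled context
        return self.head(context)                    # point forecast(s)
```

Under these assumptions, SketchForecaster(n_features=4)(torch.randn(8, 24, 4)) returns an (8, 1) tensor of one-step-ahead predictions from 24-step input windows.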
Journal Introduction:
Information Sciences (Informatics and Computer Science, Intelligent Systems Applications) is an esteemed international journal that focuses on publishing original and creative research findings in the field of information sciences. We also feature a limited number of timely tutorial and survey contributions.
Our journal aims to cater to a diverse audience, including researchers, developers, managers, strategic planners, graduate students, and anyone interested in staying up-to-date with cutting-edge research in information science, knowledge engineering, and intelligent systems. While readers are expected to share a common interest in information science, they come from varying backgrounds such as engineering, mathematics, statistics, physics, computer science, cell biology, molecular biology, management science, cognitive science, neurobiology, behavioral sciences, and biochemistry.