Graph-Patchformer: Patch interaction transformer with adaptive graph learning for multivariate time series forecasting
Chunyi Hou, Yongchuan Yu, Jinquan Ji, Siyao Zhang, Xumeng Shen, Jianzhuo Yan
Neural Networks, Volume 194, Article 108140. Published 2025-09-25. DOI: 10.1016/j.neunet.2025.108140
Citations: 0
Abstract
Multivariate time series (MTS) forecasting plays a pivotal role in the digitalization and intelligent development of modern society. However, previous deep-learning-based MTS forecasting methods often rely on capturing intra-series dependencies for modeling, neglecting the structural information within MTS and failing to consider inter-series local dynamic dependencies. Although some approaches utilize multi-scale representation learning to capture inter-series dynamic dependencies at different time scales, they still require additional multi-scale feature fusion modules to combine the multi-scale representations into the final forecasting results. In this paper, we propose a novel deep learning framework called Graph-Patchformer, which leverages structural encodings to reflect the structural information within MTS while capturing intra-series dependencies and inter-series local dynamic dependencies using our proposed Patch Interaction Blocks. Specifically, Graph-Patchformer embeds structural encodings into the MTS to reflect inter-series relationships and temporal variations. The embedded data is then fed into the Patch Interaction Blocks through a patching operation. Within the Patch Interaction Blocks, a multi-head self-attention mechanism and an adaptive graph learning module are employed to capture intra-series dependencies and inter-series local dynamic dependencies, respectively. In this way, Graph-Patchformer not only facilitates interactions between different patches within a single series but also enables cross-time-window interactions between patches of different series. Experimental results show that Graph-Patchformer exhibits significant forecasting performance, outperforming several state-of-the-art methods across various real-world benchmark datasets. The code will be available at this repository: https://github.com/houchunyiPhd/Graph-Patchformer/tree/main
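To make the abstract's pipeline concrete, the following is a minimal PyTorch sketch of one Patch Interaction Block: patching a multivariate series, multi-head self-attention across a series' patches for intra-series dependencies, and an adaptive graph learning step that mixes patch tokens across series for inter-series dependencies. All module names, dimensions, and the softmax(ReLU(E1·E2ᵀ)) adjacency construction (a common choice in adaptive-graph forecasting models such as Graph WaveNet/MTGNN) are illustrative assumptions, not the paper's exact design; see the authors' repository for the official implementation.

# Minimal sketch under the assumptions stated above; not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PatchInteractionBlock(nn.Module):
    def __init__(self, n_series, patch_len, stride, d_model, n_heads=4):
        super().__init__()
        self.patch_len, self.stride = patch_len, stride
        self.embed = nn.Linear(patch_len, d_model)  # map each patch to a token
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Learnable node embeddings used to build the adaptive adjacency matrix.
        self.node_emb1 = nn.Parameter(torch.randn(n_series, d_model))
        self.node_emb2 = nn.Parameter(torch.randn(n_series, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x):
        # x: (batch, n_series, seq_len)
        B, N, L = x.shape
        # 1) Patching: split each series into overlapping patches.
        patches = x.unfold(-1, self.patch_len, self.stride)  # (B, N, P, patch_len)
        P = patches.size(2)
        tokens = self.embed(patches)                         # (B, N, P, d_model)

        # 2) Intra-series dependencies: self-attention over each series' patches.
        t = tokens.reshape(B * N, P, -1)
        t = self.norm1(t + self.attn(t, t, t, need_weights=False)[0])
        tokens = t.reshape(B, N, P, -1)

        # 3) Inter-series local dynamics: mix patch tokens across series with a
        #    learned adjacency, applied independently at each patch position.
        adj = F.softmax(F.relu(self.node_emb1 @ self.node_emb2.T), dim=-1)  # (N, N)
        mixed = torch.einsum("mn,bnpd->bmpd", adj, tokens)
        return self.norm2(tokens + mixed)                    # (B, N, P, d_model)

if __name__ == "__main__":
    block = PatchInteractionBlock(n_series=7, patch_len=16, stride=8, d_model=64)
    out = block(torch.randn(2, 7, 96))  # e.g. a 7-variable series of length 96
    print(out.shape)  # torch.Size([2, 7, 11, 64])

Because the graph mixing is applied per patch position, patches of different series interact within aligned local time windows, which matches the abstract's description of cross-time-window, inter-series interactions; stacking several such blocks and adding a forecasting head would complete the model.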
Journal description:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.