Graph-patchformer: Patch interaction transformer with adaptive graph learning for multivariate time series forecasting

IF 6.3 · CAS Region 1 (Computer Science) · Q1 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Chunyi Hou, Yongchuan Yu, Jinquan Ji, Siyao Zhang, Xumeng Shen, Jianzhuo Yan
{"title":"Graph-patchformer: Patch interaction transformer with adaptive graph learning for multivariate time series forecasting","authors":"Chunyi Hou ,&nbsp;Yongchuan Yu ,&nbsp;Jinquan Ji ,&nbsp;Siyao Zhang ,&nbsp;Xumeng Shen ,&nbsp;Jianzhuo Yan","doi":"10.1016/j.neunet.2025.108140","DOIUrl":null,"url":null,"abstract":"<div><div>Multivariate time series (MTS) forecasting plays a pivotal role in the digitalization and intelligent development of modern society, while previous MTS forecasting methods based on deep learning often rely on capturing intra-series dependencies for modeling, neglecting the structural information within MTS and failing to consider inter-series local dynamic dependencies. Although some approaches utilize multi-scale representation learning to capture inter-series dynamic dependencies at different time scales, they still require additional multi-scale feature fusion modules to output the multi-scale representation of final forecasting results. In this paper, we propose a novel deep learning framework called Graph-Patchformer, which leverages structural encodings to reflect the structural information within MTS while capturing intra-series dependencies and inter-series local dynamic dependencies using the Patch Interaction Blocks we proposed. Specifically, Graph-Patchformer embeds structural encodings into MTS to reflect the inter-series relationships and temporal variations within the MTS. The embedded data is subsequently fed into the Patch Interaction Blocks through a patching operation. Within the Patch Interaction Blocks, the multi-head self-attention mechanism and adaptive graph learning module are employed to capture intra-series dependencies and inter-series local dynamic dependencies. In this way, Graph-Patchformer not only facilitates interactions between different patches within a single series but also enables cross-time-window interactions between patches of different series. The experimental results show that the Graph-Patchformer outperforms the state-of-the-art approaches and exhitits significant forecasting performance compared to several state-of-the-art methods across various real-world benchmark datasets. The code will be available at this repository: <span><span>https://github.com/houchunyiPhd/Graph-Patchformer/tree/main</span><svg><path></path></svg></span></div></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"194 ","pages":"Article 108140"},"PeriodicalIF":6.3000,"publicationDate":"2025-09-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608025010202","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Multivariate time series (MTS) forecasting plays a pivotal role in the digitalization and intelligent development of modern society, yet previous deep-learning-based MTS forecasting methods often rely solely on capturing intra-series dependencies, neglecting the structural information within MTS and failing to consider inter-series local dynamic dependencies. Although some approaches utilize multi-scale representation learning to capture inter-series dynamic dependencies at different time scales, they still require additional multi-scale feature fusion modules to fuse the multi-scale representations into the final forecasting results. In this paper, we propose a novel deep learning framework called Graph-Patchformer, which leverages structural encodings to reflect the structural information within MTS while capturing intra-series dependencies and inter-series local dynamic dependencies using our proposed Patch Interaction Blocks. Specifically, Graph-Patchformer embeds structural encodings into MTS to reflect the inter-series relationships and temporal variations within the MTS. The embedded data is subsequently fed into the Patch Interaction Blocks through a patching operation. Within the Patch Interaction Blocks, a multi-head self-attention mechanism and an adaptive graph learning module are employed to capture intra-series dependencies and inter-series local dynamic dependencies, respectively. In this way, Graph-Patchformer not only facilitates interactions between different patches within a single series but also enables cross-time-window interactions between patches of different series. Experimental results show that Graph-Patchformer outperforms several state-of-the-art methods, exhibiting significant forecasting performance across various real-world benchmark datasets. The code will be available at this repository: https://github.com/houchunyiPhd/Graph-Patchformer/tree/main
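The abstract describes the architecture only at a high level. As a concrete illustration, below is a minimal PyTorch sketch of how a block of this kind could combine patch-wise multi-head self-attention (intra-series) with an adaptively learned inter-series graph. Everything here is an assumption rather than the paper's implementation: the class names `PatchInteractionBlock` and `AdaptiveGraphLearning`, the softmax(relu(E1·E2ᵀ)) adjacency construction (a common adaptive-graph design, e.g. in MTGNN), and all patch sizes and dimensions are illustrative; structural encodings and the forecasting head are omitted. The authors' repository linked above contains the actual code.

```python
import torch
import torch.nn as nn

class AdaptiveGraphLearning(nn.Module):
    """Learns an inter-series adjacency matrix from trainable node embeddings.
    Uses the common softmax(relu(E1 @ E2.T)) construction; the paper's exact
    formulation may differ."""
    def __init__(self, num_series: int, emb_dim: int = 16):
        super().__init__()
        self.e1 = nn.Parameter(torch.randn(num_series, emb_dim))
        self.e2 = nn.Parameter(torch.randn(num_series, emb_dim))

    def forward(self) -> torch.Tensor:
        # (N, N) adjacency: row-normalized similarity between series embeddings
        return torch.softmax(torch.relu(self.e1 @ self.e2.t()), dim=-1)

class PatchInteractionBlock(nn.Module):
    """Hypothetical sketch of one block: self-attention mixes patches within
    a series; propagation over the learned graph mixes the same patch
    position across series."""
    def __init__(self, num_series: int, d_model: int, n_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.graph = AdaptiveGraphLearning(num_series)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.ff = nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                                nn.Linear(4 * d_model, d_model))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_series, num_patches, d_model)
        b, n, p, d = x.shape
        # intra-series dependencies: attention over the patch axis, per series
        h = x.reshape(b * n, p, d)
        h = self.norm1(h + self.attn(h, h, h, need_weights=False)[0])
        h = h.reshape(b, n, p, d)
        # inter-series dependencies: propagate each patch position over the
        # adaptively learned graph
        adj = self.graph()                              # (n, n)
        h = h + torch.einsum('ij,bjpd->bipd', adj, h)
        return self.norm2(h + self.ff(h))

# toy usage: 7 series, lookback 96 split into non-overlapping patches of 16
batch, num_series, lookback, patch_len, d_model = 8, 7, 96, 16, 64
x = torch.randn(batch, num_series, lookback)
patches = x.unfold(-1, patch_len, patch_len)            # (8, 7, 6, 16)
embed = nn.Linear(patch_len, d_model)
block = PatchInteractionBlock(num_series, d_model)
out = block(embed(patches))                             # (8, 7, 6, 64)
print(out.shape)
```

Note that this simplified sketch only mixes the same patch position across series along the learned graph; the abstract additionally claims cross-time-window interactions between patches of different series, which would require a richer propagation scheme than the single einsum here.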
Source journal: Neural Networks (Engineering & Technology, Computer Science: Artificial Intelligence)
CiteScore: 13.90
Self-citation rate: 7.70%
Annual article count: 425
Review time: 67 days
About the journal: Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.