Dynamic meta-graph convolutional recurrent network for heterogeneous spatiotemporal graph forecasting

Impact Factor 6.0 · CAS Tier 1 (Computer Science) · JCR Q1, Computer Science, Artificial Intelligence
Xianwei Guo, Zhiyong Yu, Fangwan Huang, Xing Chen, Dingqi Yang, Jiangtao Wang
{"title":"Dynamic meta-graph convolutional recurrent network for heterogeneous spatiotemporal graph forecasting","authors":"Xianwei Guo ,&nbsp;Zhiyong Yu ,&nbsp;Fangwan Huang ,&nbsp;Xing Chen ,&nbsp;Dingqi Yang ,&nbsp;Jiangtao Wang","doi":"10.1016/j.neunet.2024.106805","DOIUrl":null,"url":null,"abstract":"<div><div>Spatiotemporal Graph (STG) forecasting is an essential task within the realm of spatiotemporal data mining and urban computing. Over the past few years, Spatiotemporal Graph Neural Networks (STGNNs) have gained significant attention as promising solutions for STG forecasting. However, existing methods often overlook two issues: the dynamic spatial dependencies of urban networks and the heterogeneity of urban spatiotemporal data. In this paper, we propose a novel framework for STG learning called Dynamic Meta-Graph Convolutional Recurrent Network (DMetaGCRN), which effectively tackles both challenges. Specifically, we first build a meta-graph generator to dynamically generate graph structures, which integrates various dynamic features, including input sensor signals and their historical trends, periodic information (timestamp embeddings), and meta-node embeddings. Among them, a memory network is used to guide the learning of meta-node embeddings. The meta-graph generation process enables the model to simulate the dynamic spatial dependencies of urban networks and capture data heterogeneity. Then, we design a Dynamic Meta-Graph Convolutional Recurrent Unit (DMetaGCRU) to simultaneously model spatial and temporal dependencies. Finally, we formulate the proposed DMetaGCRN in an encoder–decoder architecture built upon DMetaGCRU and meta-graph generator components. Extensive experiments on four real-world urban spatiotemporal datasets validate that the proposed DMetaGCRN framework outperforms state-of-the-art approaches.</div></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"181 ","pages":"Article 106805"},"PeriodicalIF":6.0000,"publicationDate":"2024-10-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608024007299","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Spatiotemporal Graph (STG) forecasting is an essential task within the realm of spatiotemporal data mining and urban computing. Over the past few years, Spatiotemporal Graph Neural Networks (STGNNs) have gained significant attention as promising solutions for STG forecasting. However, existing methods often overlook two issues: the dynamic spatial dependencies of urban networks and the heterogeneity of urban spatiotemporal data. In this paper, we propose a novel framework for STG learning called the Dynamic Meta-Graph Convolutional Recurrent Network (DMetaGCRN), which effectively tackles both challenges. Specifically, we first build a meta-graph generator that dynamically generates graph structures by integrating various dynamic features, including input sensor signals and their historical trends, periodic information (timestamp embeddings), and meta-node embeddings; a memory network guides the learning of the meta-node embeddings. The meta-graph generation process enables the model to simulate the dynamic spatial dependencies of urban networks and capture data heterogeneity. We then design a Dynamic Meta-Graph Convolutional Recurrent Unit (DMetaGCRU) to simultaneously model spatial and temporal dependencies. Finally, we formulate DMetaGCRN as an encoder–decoder architecture built upon the DMetaGCRU and meta-graph generator components. Extensive experiments on four real-world urban spatiotemporal datasets validate that DMetaGCRN outperforms state-of-the-art approaches.
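To make the described architecture more concrete, below is a minimal, hypothetical PyTorch-style sketch of the two ingredients the abstract names: a meta-graph generator that fuses learnable node embeddings with per-step input and timestamp features into a dynamic adjacency matrix, and a graph-convolutional GRU cell that propagates hidden states over that adjacency. All class names, tensor shapes, and the fusion scheme are assumptions for illustration only, not the authors' implementation; in particular, the memory-guided meta-node embeddings are simplified here to a plain learnable embedding table.

```python
# Hypothetical sketch of a dynamic meta-graph generator feeding a
# graph-convolutional GRU cell (names and shapes are illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F


class MetaGraphGenerator(nn.Module):
    """Builds a time-varying adjacency matrix from learnable node
    embeddings fused with per-step input and timestamp features."""

    def __init__(self, num_nodes, embed_dim, in_dim, time_dim):
        super().__init__()
        self.node_embed = nn.Parameter(torch.randn(num_nodes, embed_dim))
        self.fuse = nn.Linear(embed_dim + in_dim + time_dim, embed_dim)

    def forward(self, x_t, t_embed):
        # x_t: (batch, num_nodes, in_dim); t_embed: (batch, time_dim)
        b, n, _ = x_t.shape
        static = self.node_embed.unsqueeze(0).expand(b, -1, -1)
        time_feat = t_embed.unsqueeze(1).expand(-1, n, -1)
        dyn = torch.tanh(self.fuse(torch.cat([static, x_t, time_feat], dim=-1)))
        # Pairwise similarity of the fused embeddings, row-normalized
        # into a dynamic adjacency (meta-graph) for this time step.
        adj = F.softmax(F.relu(dyn @ dyn.transpose(1, 2)), dim=-1)
        return adj  # (batch, num_nodes, num_nodes)


class GraphConvGRUCell(nn.Module):
    """GRU cell whose gates use a one-hop graph convolution A @ [X, H] @ W."""

    def __init__(self, in_dim, hidden_dim):
        super().__init__()
        self.gate = nn.Linear(in_dim + hidden_dim, 2 * hidden_dim)
        self.cand = nn.Linear(in_dim + hidden_dim, hidden_dim)

    def forward(self, x_t, h, adj):
        xh = adj @ torch.cat([x_t, h], dim=-1)     # propagate over meta-graph
        z, r = torch.chunk(torch.sigmoid(self.gate(xh)), 2, dim=-1)
        cand_in = adj @ torch.cat([x_t, r * h], dim=-1)
        h_tilde = torch.tanh(self.cand(cand_in))
        return (1 - z) * h + z * h_tilde           # (batch, num_nodes, hidden)
```

In a full model of this kind, an encoder–decoder would stack such recurrent cells and regenerate the meta-graph at every step from the current inputs and timestamp embedding, which is what lets the learned spatial dependencies vary over time.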
Source journal: Neural Networks (Engineering/Technology – Computer Science: Artificial Intelligence)
CiteScore: 13.90
Self-citation rate: 7.70%
Articles published per year: 425
Average review time: 67 days
Journal description: Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. The journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, it aims to encourage the development of biologically inspired artificial intelligence.