Transformer-Based Spatiotemporal Graph Diffusion Convolution Network for Traffic Flow Forecasting

Siwei Wei, Yang Yang, Donghua Liu, Ke Deng, Chunzhi Wang
{"title":"基于变换器的时空图扩散卷积网络用于交通流量预测","authors":"Siwei Wei, Yang Yang, Donghua Liu, Ke Deng, Chunzhi Wang","doi":"10.3390/electronics13163151","DOIUrl":null,"url":null,"abstract":"Accurate traffic flow forecasting is a crucial component of intelligent transportation systems, playing a pivotal role in enhancing transportation intelligence. The integration of Graph Neural Networks (GNNs) and Transformers in traffic flow forecasting has gained significant adoption for enhancing prediction accuracy. Yet, the complex spatial and temporal dependencies present in traffic data continue to pose substantial challenges: (1) Most GNN-based methods assume that the graph structure reflects the actual dependencies between nodes, overlooking the complex dependencies present in the real-world context. (2) Standard time-series models are unable to effectively model complex temporal dependencies, hindering prediction accuracy. To tackle these challenges, the authors propose a novel Transformer-based Spatiotemporal Graph Diffusion Convolution Network (TSGDC) for Traffic Flow Forecasting, which leverages graph diffusion and transformer to capture the complexity and dynamics of spatial and temporal patterns, thereby enhancing prediction performance. The authors designed an Efficient Channel Attention (ECA) that learns separately from the feature dimensions collected by traffic sensors and the temporal dimensions of traffic data, aiding in spatiotemporal modeling. Chebyshev Graph Diffusion Convolution (GDC) is used to capture the complex dependencies within the spatial distribution. Sequence decomposition blocks, as internal operations of transformers, are employed to gradually extract long-term stable trends from hidden complex variables. Additionally, by integrating multi-scale dependencies, including recent, daily, and weekly patterns, accurate traffic flow predictions are achieved. Experimental results on various public datasets show that TSGDC outperforms conventional traffic forecasting models, particularly in accuracy and robustness.","PeriodicalId":504598,"journal":{"name":"Electronics","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-08-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Transformer-Based Spatiotemporal Graph Diffusion Convolution Network for Traffic Flow Forecasting\",\"authors\":\"Siwei Wei, Yang Yang, Donghua Liu, Ke Deng, Chunzhi Wang\",\"doi\":\"10.3390/electronics13163151\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Accurate traffic flow forecasting is a crucial component of intelligent transportation systems, playing a pivotal role in enhancing transportation intelligence. The integration of Graph Neural Networks (GNNs) and Transformers in traffic flow forecasting has gained significant adoption for enhancing prediction accuracy. Yet, the complex spatial and temporal dependencies present in traffic data continue to pose substantial challenges: (1) Most GNN-based methods assume that the graph structure reflects the actual dependencies between nodes, overlooking the complex dependencies present in the real-world context. (2) Standard time-series models are unable to effectively model complex temporal dependencies, hindering prediction accuracy. 
To tackle these challenges, the authors propose a novel Transformer-based Spatiotemporal Graph Diffusion Convolution Network (TSGDC) for Traffic Flow Forecasting, which leverages graph diffusion and transformer to capture the complexity and dynamics of spatial and temporal patterns, thereby enhancing prediction performance. The authors designed an Efficient Channel Attention (ECA) that learns separately from the feature dimensions collected by traffic sensors and the temporal dimensions of traffic data, aiding in spatiotemporal modeling. Chebyshev Graph Diffusion Convolution (GDC) is used to capture the complex dependencies within the spatial distribution. Sequence decomposition blocks, as internal operations of transformers, are employed to gradually extract long-term stable trends from hidden complex variables. Additionally, by integrating multi-scale dependencies, including recent, daily, and weekly patterns, accurate traffic flow predictions are achieved. Experimental results on various public datasets show that TSGDC outperforms conventional traffic forecasting models, particularly in accuracy and robustness.\",\"PeriodicalId\":504598,\"journal\":{\"name\":\"Electronics\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-08-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Electronics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.3390/electronics13163151\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Electronics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3390/electronics13163151","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Accurate traffic flow forecasting is a crucial component of intelligent transportation systems, playing a pivotal role in enhancing transportation intelligence. Combining Graph Neural Networks (GNNs) with Transformers has become a widely adopted approach for improving prediction accuracy in traffic flow forecasting. Yet, the complex spatial and temporal dependencies present in traffic data continue to pose substantial challenges: (1) most GNN-based methods assume that the graph structure reflects the actual dependencies between nodes, overlooking the complex dependencies present in real-world settings; (2) standard time-series models cannot effectively capture complex temporal dependencies, which limits prediction accuracy. To tackle these challenges, the authors propose a novel Transformer-based Spatiotemporal Graph Diffusion Convolution Network (TSGDC) for traffic flow forecasting, which leverages graph diffusion and Transformer modules to capture the complexity and dynamics of spatial and temporal patterns, thereby enhancing prediction performance. The authors design an Efficient Channel Attention (ECA) module that learns separately from the feature dimensions collected by traffic sensors and the temporal dimensions of traffic data, aiding spatiotemporal modeling. Chebyshev Graph Diffusion Convolution (GDC) is used to capture the complex dependencies within the spatial distribution. Sequence decomposition blocks, embedded as internal operations of the Transformer, gradually extract long-term stable trends from hidden complex variables. Additionally, by integrating multi-scale dependencies, including recent, daily, and weekly patterns, accurate traffic flow predictions are achieved. Experimental results on several public datasets show that TSGDC outperforms conventional traffic forecasting models, particularly in accuracy and robustness.
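
The abstract names two concrete building blocks: a Chebyshev-style graph (diffusion) convolution for spatial dependencies and sequence decomposition blocks inside the Transformer for extracting long-term trends. The sketch below is a minimal, generic illustration of both ideas in PyTorch, not the authors' released implementation; the class names, tensor shapes, Chebyshev order K, and moving-average kernel size are assumptions made for this example.

```python
# Minimal sketch of two components described in the abstract (assumed shapes/names).
import torch
import torch.nn as nn


class ChebGraphConv(nn.Module):
    """Chebyshev graph convolution: y = sum_k T_k(L_hat) x W_k."""

    def __init__(self, in_dim: int, out_dim: int, K: int = 3):
        super().__init__()
        self.K = K
        self.weights = nn.Parameter(torch.empty(K, in_dim, out_dim))
        nn.init.xavier_uniform_(self.weights)

    def forward(self, x: torch.Tensor, L_hat: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_nodes, in_dim); L_hat: (num_nodes, num_nodes) scaled Laplacian
        Tx_prev = x                                            # T_0(L_hat) x = x
        Tx_curr = torch.einsum("nm,bmf->bnf", L_hat, x)        # T_1(L_hat) x = L_hat x
        out = torch.einsum("bnf,fo->bno", Tx_prev, self.weights[0])
        if self.K > 1:
            out = out + torch.einsum("bnf,fo->bno", Tx_curr, self.weights[1])
        for k in range(2, self.K):
            # Chebyshev recurrence: T_k = 2 L_hat T_{k-1} - T_{k-2}
            Tx_next = 2 * torch.einsum("nm,bmf->bnf", L_hat, Tx_curr) - Tx_prev
            out = out + torch.einsum("bnf,fo->bno", Tx_next, self.weights[k])
            Tx_prev, Tx_curr = Tx_curr, Tx_next
        return out


class SeriesDecomposition(nn.Module):
    """Split a sequence into a smooth trend (moving average) and a seasonal residual."""

    def __init__(self, kernel_size: int = 25):
        super().__init__()
        self.kernel_size = kernel_size
        self.avg = nn.AvgPool1d(kernel_size=kernel_size, stride=1)

    def forward(self, x: torch.Tensor):
        # x: (batch, seq_len, features); pad both ends so the output length matches the input
        front = x[:, :1, :].repeat(1, (self.kernel_size - 1) // 2, 1)
        end = x[:, -1:, :].repeat(1, self.kernel_size // 2, 1)
        padded = torch.cat([front, x, end], dim=1)
        trend = self.avg(padded.transpose(1, 2)).transpose(1, 2)
        seasonal = x - trend
        return seasonal, trend
```

In a full model along the lines described in the abstract, a graph convolution of this kind would be applied to each time step (or fused with the temporal layers) over the recent, daily, and weekly input segments, while the decomposition block would be interleaved with attention layers so that the trend and seasonal parts are modeled separately.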