{"title":"Routeformer:Transformer utilizing routing mechanism for traffic flow forecasting","authors":"Jun Qi, Hong Fan","doi":"10.1016/j.neucom.2025.129753","DOIUrl":null,"url":null,"abstract":"<div><div>Traffic flow prediction is vital for the development of intelligent transportation systems. The challenge lies in accurately capturing the complex and dynamic spatiotemporal dependencies influenced by real road network fluctuations. These dependencies can be simplified into three categories: (i) spatial dependencies among sensors at the same timestamp, (ii) temporal dependencies of the same sensor at different timestamps, and (iii) cross dimensional dependencies between different sensors at different timestamps. The third type of cross dimensional dependency requires considering the relationships between different sensors across multiple time points, which is not only complex but also difficult to capture accurately. Existing methods often describe it indirectly by merging spatiotemporal dependencies, but this approach is frequently insufficiently accurate. We aim to characterize this relationship more precisely by capturing the sequential dependencies among sensors, referred to as inter-series dependencies. Capturing inter-series dependencies does not require directly modeling the relationships between different sensors across multiple time points; rather, it focuses on the dependencies between the temporal patterns of different sensors. Our designed Temporal Routing Transformer captures temporal dependencies along the temporal axis while implicitly modeling the inter-series dependencies between sensors. At the same time, we capture spatial dependencies through the Spatial Routing Transformer and multi-scale temporal dependencies by using the Context-Aware Transformer. A series of evaluations were conducted on seven real world datasets, and Routeformer achieved state-of-the-art performance.</div></div>","PeriodicalId":19268,"journal":{"name":"Neurocomputing","volume":"633 ","pages":"Article 129753"},"PeriodicalIF":5.5000,"publicationDate":"2025-02-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neurocomputing","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0925231225004254","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Abstract
Traffic flow prediction is vital for the development of intelligent transportation systems. The challenge lies in accurately capturing the complex and dynamic spatiotemporal dependencies induced by fluctuations in real road networks. These dependencies can be grouped into three categories: (i) spatial dependencies among sensors at the same timestamp, (ii) temporal dependencies of the same sensor across different timestamps, and (iii) cross-dimensional dependencies between different sensors at different timestamps. The third category requires modeling the relationships between different sensors across multiple time points, which is both complex and difficult to capture accurately. Existing methods often describe it indirectly by merging spatial and temporal dependencies, but such indirect characterizations are frequently imprecise. We aim to characterize this relationship more accurately by capturing the sequential dependencies among sensors, referred to as inter-series dependencies. Capturing inter-series dependencies does not require directly modeling the relationships between different sensors across multiple time points; instead, it focuses on the dependencies between the temporal patterns of different sensors. Our Temporal Routing Transformer captures temporal dependencies along the temporal axis while implicitly modeling the inter-series dependencies between sensors. In parallel, the Spatial Routing Transformer captures spatial dependencies, and the Context-Aware Transformer captures multi-scale temporal dependencies. Evaluations on seven real-world datasets show that Routeformer achieves state-of-the-art performance.
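The abstract does not give Routeformer's equations or its routing mechanism, so the following is only a minimal illustrative sketch, in PyTorch, of the axis distinction it describes: attention applied along the temporal axis (each sensor attends over its own timestamps, category ii) versus along the spatial axis (sensors attend to one another at the same timestamp, category i). All tensor shapes, the choice of a standard multi-head attention layer, and the shared module are assumptions made for illustration, not the paper's actual components.

```python
# Illustrative sketch only: standard attention along two different axes of a
# traffic tensor, under assumed shapes. Routeformer's routing-based attention
# is not reproduced here.
import torch
import torch.nn as nn

B, T, N, D = 2, 12, 20, 64          # batch, timesteps, sensors, hidden dim (assumed)
x = torch.randn(B, T, N, D)         # embedded traffic readings

attn = nn.MultiheadAttention(embed_dim=D, num_heads=4, batch_first=True)

# Temporal attention: each sensor's sequence attends over its own timestamps.
xt = x.permute(0, 2, 1, 3).reshape(B * N, T, D)        # (B*N, T, D)
temporal_out, _ = attn(xt, xt, xt)                     # dependencies along time
temporal_out = temporal_out.reshape(B, N, T, D).permute(0, 2, 1, 3)

# Spatial attention: sensors at the same timestamp attend to one another.
xs = x.reshape(B * T, N, D)                            # (B*T, N, D)
spatial_out, _ = attn(xs, xs, xs)                      # dependencies across sensors
spatial_out = spatial_out.reshape(B, T, N, D)
```

A single shared attention layer is reused for both axes only to keep the sketch short; a real model would use separate temporal and spatial branches, and inter-series dependencies would arise from how the temporal branch mixes information across sensors' temporal patterns.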
Journal Introduction
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Its essential topics are neurocomputing theory, practice, and applications.