Large language models as spatiotemporal graph learning enhancers for large-scale traffic forecasting

Chang Peng, Chengcheng Xu, Haibo Chen, Qi Ai, Guodong Zhang, Xu Cui

DOI: 10.1080/19427867.2025.2601756
Journal: Transportation Letters: The International Journal of Transportation Research, Vol. 18, No. 4, pp. 721–737
Published: 2026-04-21 (Epub 2025-12-29); JCR Q2 (Transportation), Impact Factor 3.3
URL: https://www.sciencedirect.com/org/science/article/pii/S1942786725000773
Citations: 0
Abstract
Understanding traffic dynamics in the spatial and temporal dimensions is essential to network-wide forecasting. Spatiotemporal graph (STG)-based prediction has emerged as a promising method that integrates graph and temporal neural networks. Inspired by the extensive knowledge of large language models (LLMs), this paper leverages their understanding of traffic phenomena to enhance spatiotemporal forecasting. The LLMs are regarded as general knowledge identifiers that recognize traffic patterns and underlying factors as prior knowledge, which is then vectorized by a language model. An attention-based module is developed to incorporate the vectorized knowledge into STG models. The proposed framework was applied to a real-world traffic dataset, with multiple LLMs, STG models, and prediction horizons, to evaluate the effects of LLM-identified knowledge on prediction accuracy and training efficiency. The incorporated knowledge significantly enhances comparatively weaker STG predictors over relatively long horizons, especially during rush hours. It also notably accelerates STG training.
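The abstract does not give implementation details of the attention-based fusion module, but the general idea — letting each traffic node attend over a small set of LLM-derived knowledge embeddings and adding the result back to its features — can be sketched as follows. All names, shapes, and the residual-fusion choice here are hypothetical illustrations, not the authors' actual design:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def knowledge_attention(node_feats, knowledge_vecs, d_k=16, seed=0):
    """Cross-attention sketch: each graph node (query) attends over
    vectorized LLM knowledge (keys/values); the attended knowledge is
    fused into the node features via a residual connection."""
    rng = np.random.default_rng(seed)
    n, d = node_feats.shape          # n nodes, d-dim STG features
    m, d_know = knowledge_vecs.shape # m knowledge items from the LLM
    # Randomly initialized projections stand in for learned weights.
    Wq = rng.standard_normal((d, d_k)) / np.sqrt(d)
    Wk = rng.standard_normal((d_know, d_k)) / np.sqrt(d_know)
    Wv = rng.standard_normal((d_know, d)) / np.sqrt(d_know)
    Q = node_feats @ Wq       # (n, d_k)
    K = knowledge_vecs @ Wk   # (m, d_k)
    V = knowledge_vecs @ Wv   # (m, d)
    attn = softmax(Q @ K.T / np.sqrt(d_k), axis=-1)  # (n, m)
    return node_feats + attn @ V  # residual fusion, shape (n, d)

# Toy data: 5 nodes with 8-dim features, 3 knowledge vectors of dim 12.
nodes = np.random.default_rng(1).standard_normal((5, 8))
knowledge = np.random.default_rng(2).standard_normal((3, 12))
fused = knowledge_attention(nodes, knowledge)
print(fused.shape)  # (5, 8)
```

In a real STG model the projection matrices would be trained jointly with the predictor, and the fused features would replace the raw node features at the model's input or at an intermediate layer.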
About the journal:
Transportation Letters: The International Journal of Transportation Research is a quarterly journal that publishes high-quality peer-reviewed and mini-review papers as well as technical notes and book reviews on the state-of-the-art in transportation research.
The focus of Transportation Letters is on analytical and empirical findings, methodological papers, and theoretical and conceptual insights across all areas of transportation research. Review and resource papers that merge descriptions of the state-of-the-art with innovative new methodological, theoretical, and conceptual insights are invited and of particular interest.