{"title":"A Survey on Spatio-Temporal Prediction: From Transformers to Foundation Models","authors":"Yingchi Mao, Hongliang Zhou, Ling Chen, Rongzhi Qi, Zhende Sun, Yi Rong, Xiaoming He, Mingkai Chen, Shahid Mumtaz, Valerio Frascolla, Mohsen Guizani, Joel Rodrigues","doi":"10.1145/3766546","DOIUrl":null,"url":null,"abstract":"Spatio-Temporal (ST) data is pervasive on the various aspects in our daily lives. By mining the ST information from the data, we are able to predict trends in numerous domains. The Transformer, and one of its more recent enhancements, foundation models, have achieved a remarkable success in such ST prediction. In this paper, we first survey the state of the art of Transformers-related work, then introduce the network architecture of the Transformer and summarize the improvements to adapt to the ST prediction Transformer and foundation models, including module enhancement and adjustment. Subsequently, we categorize the ST Transformer and foundation models in selected applications in some relevant domains, mainly urban transportation, climate monitoring, and motion prediction. Next, we propose an evaluation method in the ST prediction with Transformers and foundation models, list the most relevant open-source datasets, evaluation metrics and performance analysis. Finally, we discuss some future directions on the task of ST prediction with Transformer and foundation models. <jats:styled-content style=\"black\">Relevant papers and open-source resources have been collated and are continuously updated at: https://github.com/cyhforlight/Spatio-Temporal-Prediction-Transformer-Review.</jats:styled-content>","PeriodicalId":50926,"journal":{"name":"ACM Computing Surveys","volume":"27 1","pages":""},"PeriodicalIF":28.0000,"publicationDate":"2025-09-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACM Computing Surveys","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1145/3766546","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, THEORY & METHODS","Score":null,"Total":0}
Abstract
Spatio-Temporal (ST) data is pervasive across many aspects of our daily lives. By mining the ST information in such data, we can predict trends in numerous domains. The Transformer, and one of its more recent extensions, foundation models, have achieved remarkable success in ST prediction. In this paper, we first survey the state of the art in Transformer-related work, then introduce the network architecture of the Transformer and summarize the improvements, including module enhancement and adjustment, that adapt Transformers and foundation models to ST prediction. Subsequently, we categorize ST Transformer and foundation models by their applications in relevant domains, mainly urban transportation, climate monitoring, and motion prediction. Next, we propose an evaluation methodology for ST prediction with Transformers and foundation models, and list the most relevant open-source datasets and evaluation metrics, together with a performance analysis. Finally, we discuss future directions for ST prediction with Transformer and foundation models. Relevant papers and open-source resources have been collated and are continuously updated at: https://github.com/cyhforlight/Spatio-Temporal-Prediction-Transformer-Review.
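As context for the evaluation metrics the survey discusses, the sketch below (not taken from the paper) computes three point-forecast metrics commonly reported on ST prediction benchmarks: MAE, RMSE, and MAPE. The tensor shape (prediction horizons by spatial nodes) and the zero-masking threshold are illustrative assumptions.

```python
# Illustrative sketch, not from the surveyed paper: common ST prediction metrics
# computed over arrays of shape (time_steps, num_nodes).
import numpy as np

def st_prediction_metrics(y_true: np.ndarray, y_pred: np.ndarray, eps: float = 1e-8):
    """Return MAE, RMSE, and MAPE (%) averaged over all time steps and nodes."""
    err = y_pred - y_true
    mae = np.mean(np.abs(err))
    rmse = np.sqrt(np.mean(err ** 2))
    # Mask near-zero ground-truth entries so MAPE stays numerically stable,
    # as many ST benchmarks do.
    mask = np.abs(y_true) > eps
    mape = np.mean(np.abs(err[mask] / y_true[mask])) * 100.0
    return mae, rmse, mape

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical example: 12 prediction horizons over 207 traffic sensors.
    y_true = rng.uniform(10, 60, size=(12, 207))
    y_pred = y_true + rng.normal(0, 2, size=y_true.shape)
    print(st_prediction_metrics(y_true, y_pred))
```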
Journal introduction:
ACM Computing Surveys is an academic journal that focuses on publishing surveys and tutorials on various areas of computing research and practice. The journal aims to provide comprehensive and easily understandable articles that guide readers through the literature and help them understand topics outside their specialties. In terms of impact, CSUR has a high reputation with a 2022 Impact Factor of 16.6. It is ranked 3rd out of 111 journals in the field of Computer Science Theory & Methods.
ACM Computing Surveys is indexed and abstracted in various services, including AI2 Semantic Scholar, Baidu, Clarivate/ISI: JCR, CNKI, DeepDyve, DTU, EBSCO: EDS/HOST, and IET Inspec, among others.