TPLLM: A traffic prediction framework based on pretrained Large Language Models

Impact Factor: 6.6 | CAS Tier 1, Computer Science | JCR Q1, COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Tian Ma, Yixuan Zhao, Minda Li, Yue Chen, Fangshu Lei, Yanan Zhao, Maazen Alsabaan
DOI: 10.1016/j.asoc.2025.113840
Journal: Applied Soft Computing, Volume 184, Article 113840
Publication date: 2025-09-10 (Journal Article)
URL: https://www.sciencedirect.com/science/article/pii/S1568494625011536
Citations: 0

Abstract

Traffic prediction constitutes a critical component in sustainable urban data analysis, playing a pivotal role in optimizing transportation systems for reduced carbon emissions and improved energy efficiency. The precision of prevailing deep learning-driven traffic prediction models typically improves as the volume of training data increases. However, the procurement of comprehensive spatiotemporal datasets for traffic is often fraught with challenges, primarily stemming from the substantial costs associated with data collection and retention. This limitation severely hinders the deployment of models in regions with insufficient historical data. Consequently, developing a model that can achieve accurate predictions and good generalization ability in areas with limited historical traffic data is a challenging problem. It is noteworthy that the rapidly advancing pretrained Large Language Models (LLMs) of recent years demonstrate exceptional proficiency in cross-modality knowledge transfer and few-shot learning. Recognizing the sequential nature of traffic data, similar to language, we introduce TPLLM, a novel traffic prediction framework leveraging LLMs. In this framework, we construct a sequence embedding layer based on Convolutional Neural Networks (CNNs) and a graph embedding layer based on Graph Convolutional Networks (GCNs) to extract sequence features and spatial features, respectively. These are subsequently integrated to form inputs that are suitable for LLMs. A Low-Rank Adaptation (LoRA) fine-tuning approach is applied to TPLLM, thereby facilitating efficient learning and minimizing computational demands. Experiments on two real-world datasets demonstrate that TPLLM exhibits commendable performance in both full-sample and few-shot prediction scenarios.
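The abstract describes fusing a CNN-based sequence embedding with a GCN-based graph embedding to form LLM-ready inputs. The sketch below is a minimal illustration of that fusion idea in NumPy; all shapes, layer sizes, and the additive fusion are illustrative assumptions, not the paper's exact design.

```python
import numpy as np

# Hypothetical shapes: N sensors, T past time steps, d embedding dimension.
rng = np.random.default_rng(0)
N, T, d = 4, 12, 8

x = rng.normal(size=(N, T))          # historical traffic readings per sensor
A = np.ones((N, N)) / N              # toy normalized adjacency (fully connected)

# Sequence embedding: a 1-D convolution over time, then a linear map to d dims.
kernel = rng.normal(size=3)
conv = np.stack([np.convolve(x[i], kernel, mode="valid") for i in range(N)])
W_seq = rng.normal(size=(conv.shape[1], d))
seq_emb = np.tanh(conv @ W_seq)      # (N, d) sequence features

# Graph embedding: one GCN-style layer, H = tanh(A X W).
W_gcn = rng.normal(size=(T, d))
graph_emb = np.tanh(A @ x @ W_gcn)   # (N, d) spatial features

# Fuse the two views into one token embedding per sensor for the LLM backbone.
tokens = seq_emb + graph_emb         # (N, d)
print(tokens.shape)                  # (4, 8)
```

In practice the convolution and graph layers would be trainable (e.g. in PyTorch) and the fusion could be concatenation or a learned projection; additive fusion is used here only to keep the shapes obvious.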
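The abstract also applies Low-Rank Adaptation (LoRA) to fine-tune the LLM efficiently. The core idea is to freeze the pretrained weight and learn only a low-rank update, W_eff = W0 + (alpha/r)·BA. The dimensions and scaling below follow the standard LoRA formulation, not values reported in this paper.

```python
import numpy as np

# Illustrative LoRA update: instead of training a full d_out x d_in weight,
# train two low-rank factors B (d_out x r) and A (r x d_in) with r << d.
rng = np.random.default_rng(1)
d_in, d_out, r, alpha = 768, 768, 8, 16

W0 = rng.normal(size=(d_out, d_in))      # frozen pretrained weight
B = np.zeros((d_out, r))                 # B starts at zero, so the update is 0
A_lora = rng.normal(size=(r, d_in))

W_eff = W0 + (alpha / r) * (B @ A_lora)  # effective weight during fine-tuning

full_params = d_out * d_in               # 589,824 if the full matrix were tuned
lora_params = d_out * r + r * d_in       # 12,288 trainable parameters instead
print(full_params, lora_params)
```

This roughly 48x reduction in trainable parameters is what makes LoRA attractive when adapting a large pretrained backbone to a new domain such as traffic data.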
Source Journal

Applied Soft Computing (Engineering & Technology — Computer Science: Interdisciplinary Applications)
CiteScore: 15.80
Self-citation rate: 6.90%
Articles per year: 874
Review time: 10.9 months
Journal description: Applied Soft Computing is an international journal promoting an integrated view of soft computing to solve real-life problems. The focus is on publishing the highest-quality research in the application and convergence of Fuzzy Logic, Neural Networks, Evolutionary Computing, Rough Sets, and other similar techniques to address real-world complexities. Applied Soft Computing is a rolling publication: articles are published as soon as the editor-in-chief has accepted them, so the website is continuously updated with new articles and publication times are short.