Traveller: Travel-pattern aware trajectory generation via autoregressive diffusion models

IF 15.5 · CAS Zone 1 (Computer Science) · Q1 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Yuxiao Luo, Songming Zhang, Kang Liu, Yang Xu, Ling Yin
{"title":"Traveller: Travel-pattern aware trajectory generation via autoregressive diffusion models","authors":"Yuxiao Luo ,&nbsp;Songming Zhang ,&nbsp;Kang Liu ,&nbsp;Yang Xu ,&nbsp;Ling Yin","doi":"10.1016/j.inffus.2025.103766","DOIUrl":null,"url":null,"abstract":"<div><div>Trajectory Generation (TG) enables realistic simulation of individual movements for applications such as urban management, transportation planning, epidemic control, and privacy-preserving mobility analysis. However, existing TG methods, particularly unconditional diffusion models, struggle with spatiotemporal fidelity as they often overlook some travel patterns that are critical in an individual’s mobility behavior, such as recurrent location visits, movement scope, and temporal regularities. In this work, we propose the Autoregressive Diffusion Model for Travel-Pattern Aware Trajectory Generation (<strong>Traveller</strong>), a novel approach that integrates autoregressive travel-pattern modeling (AR-TempPlan) with diffusion-based trajectory generation (TravCond-Diff) to produce realistic and context-aware movement patterns. By leveraging the spatial anchor and temporal modes of visiting different locations, we derive an individual’s particular travel pattern as spatiotemporal constraints for guided trajectory generation. Building on this, AR-TempPlan generates a mask location sequence as the temporal modes, planning location transitions over time, while TravCond-Diff leverages this planning signal and home location, the spatial anchor, to guide spatial generation through a discrete diffusion process. Experiments on real-world datasets demonstrate that Traveller with the dual guidance mechanism enables the production of high-fidelity and individual trajectories that effectively capture complex human mobility behaviors while preserving privacy. The code and data are available at <span><span>https://github.com/YuxiaoLuo0013/Traveller</span><svg><path></path></svg></span>.</div></div>","PeriodicalId":50367,"journal":{"name":"Information Fusion","volume":"127 ","pages":"Article 103766"},"PeriodicalIF":15.5000,"publicationDate":"2025-09-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Information Fusion","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1566253525008280","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Trajectory Generation (TG) enables realistic simulation of individual movements for applications such as urban management, transportation planning, epidemic control, and privacy-preserving mobility analysis. However, existing TG methods, particularly unconditional diffusion models, struggle with spatiotemporal fidelity because they often overlook travel patterns that are critical to an individual's mobility behavior, such as recurrent location visits, movement scope, and temporal regularities. In this work, we propose the Autoregressive Diffusion Model for Travel-Pattern Aware Trajectory Generation (Traveller), a novel approach that integrates autoregressive travel-pattern modeling (AR-TempPlan) with diffusion-based trajectory generation (TravCond-Diff) to produce realistic and context-aware movement patterns. By leveraging the spatial anchor and the temporal modes of visiting different locations, we derive an individual's particular travel pattern as spatiotemporal constraints for guided trajectory generation. Building on this, AR-TempPlan autoregressively generates a masked location sequence as the temporal modes, planning location transitions over time, while TravCond-Diff leverages this planning signal together with the home location (the spatial anchor) to guide spatial generation through a discrete diffusion process. Experiments on real-world datasets demonstrate that Traveller's dual guidance mechanism produces high-fidelity, individualized trajectories that effectively capture complex human mobility behaviors while preserving privacy. The code and data are available at https://github.com/YuxiaoLuo0013/Traveller.
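To make the dual-guidance pipeline concrete, the sketch below shows one way the two stages described in the abstract could fit together in PyTorch: an autoregressive planner rolls out a location plan slot by slot (in the spirit of AR-TempPlan), and an absorbing-state discrete-diffusion denoiser, conditioned on that plan and on a home-location embedding (the spatial anchor), iteratively unmasks the trajectory (in the spirit of TravCond-Diff). All module names, vocabulary sizes, sequence lengths, and the unmasking schedule are illustrative assumptions, not the authors' implementation.

```python
# A minimal, hypothetical sketch of a dual-guidance trajectory generator.
# Module names (TemporalPlanner, SpatialDenoiser) and all hyperparameters
# are illustrative assumptions; this is NOT the paper's released code.
import torch
import torch.nn as nn

VOCAB = 500        # number of discrete locations (assumed)
SEQ_LEN = 48       # half-hour slots over one day (assumed)
MASK_ID = VOCAB    # extra token id used for planner masks and diffusion noise

class TemporalPlanner(nn.Module):
    """Autoregressive module: predicts the next slot of the (masked) location
    plan, i.e. the temporal modes of when locations are visited."""
    def __init__(self, d=128):
        super().__init__()
        self.emb = nn.Embedding(VOCAB + 1, d)
        self.rnn = nn.GRU(d, d, batch_first=True)
        self.head = nn.Linear(d, VOCAB + 1)

    def forward(self, prefix):                 # prefix: (B, t) token ids
        h, _ = self.rnn(self.emb(prefix))
        return self.head(h[:, -1])             # logits for the next slot

class SpatialDenoiser(nn.Module):
    """Denoiser for an absorbing-state discrete diffusion step, conditioned
    on the temporal plan and the home location (spatial anchor)."""
    def __init__(self, d=128):
        super().__init__()
        self.tok = nn.Embedding(VOCAB + 1, d)
        self.plan = nn.Embedding(VOCAB + 1, d)
        self.home = nn.Embedding(VOCAB, d)
        layer = nn.TransformerEncoderLayer(d, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.out = nn.Linear(d, VOCAB)

    def forward(self, x_t, plan, home):        # x_t, plan: (B, L); home: (B,)
        h = self.tok(x_t) + self.plan(plan) + self.home(home).unsqueeze(1)
        return self.out(self.encoder(h))       # per-slot logits over locations

@torch.no_grad()
def generate(planner, denoiser, home, steps=10):
    """Stage 1: roll out the temporal plan autoregressively.
    Stage 2: iteratively unmask locations with the conditioned denoiser."""
    B = home.size(0)
    plan = torch.full((B, 1), MASK_ID)
    for _ in range(SEQ_LEN):                   # autoregressive planning
        nxt = planner(plan).argmax(-1, keepdim=True)
        plan = torch.cat([plan, nxt], dim=1)
    plan = plan[:, 1:]

    x = torch.full((B, SEQ_LEN), MASK_ID)      # fully masked trajectory
    for s in range(steps):
        logits = denoiser(x, plan, home)
        pred = logits.argmax(-1)
        k = int(SEQ_LEN * (s + 1) / steps)     # commit the k most confident slots
        conf = logits.max(-1).values
        keep = conf.argsort(dim=-1, descending=True)[:, :k]
        x = x.scatter(1, keep, pred.gather(1, keep))
    return x

if __name__ == "__main__":
    home = torch.randint(0, VOCAB, (2,))
    traj = generate(TemporalPlanner(), SpatialDenoiser(), home)
    print(traj.shape)  # torch.Size([2, 48])
```

The two-stage structure mirrors the separation described in the abstract: temporal regularities are committed first by the planner, and the diffusion stage only has to fill in spatially consistent locations around that scaffold and the home anchor.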
Source Journal

Information Fusion (Engineering & Technology; Computer Science: Theory & Methods)
CiteScore: 33.20
Self-citation rate: 4.30%
Articles published: 161
Review time: 7.9 months
Journal description: Information Fusion serves as a central platform for showcasing advancements in multi-sensor, multi-source, multi-process information fusion, fostering collaboration among the diverse disciplines driving its progress. It is the leading outlet for sharing research and development in this field, focusing on architectures, algorithms, and applications. Papers presenting fundamental theoretical analyses as well as those demonstrating their application to real-world problems are welcome.