Jointly spatial-temporal representation learning for individual trajectories

IF 7.1 · CAS Zone 1 (Earth Sciences) · JCR Q1 (ENVIRONMENTAL STUDIES)
Fei Huang, Jianrong Lv, Yang Yue
DOI: 10.1016/j.compenvurbsys.2024.102144
Journal: Computers, Environment and Urban Systems, Volume 112, Article 102144
Published: 2024-07-03 (Journal Article)
Citations: 0

Abstract


Individual trajectories, capturing significant human-environment interactions across space and time, serve as vital inputs for geospatial foundation models (GeoFMs). However, existing attempts at learning trajectory representations often encode spatial-temporal relationships implicitly, which makes it challenging to learn and represent spatiotemporal patterns accurately. This paper therefore proposes a joint spatial-temporal graph representation learning method (ST-GraphRL) to formalize structurally explicit yet learnable spatial-temporal dependencies into trajectory representations. ST-GraphRL consists of three components: (i) a weighted directed spatial-temporal graph that explicitly constructs mobility interactions over the space and time dimensions; (ii) a two-stage joint encoder (decoupling, then fusion) that learns entangled spatial-temporal dependencies by independently decomposing and then jointly aggregating features in space and time; and (iii) a decoder that guides ST-GraphRL to learn mobility regularity and randomness by simulating the joint spatial-temporal distributions of trajectories. Tested on three real-world human mobility datasets, ST-GraphRL outperformed all baseline models in predicting movements' spatial-temporal distributions and in preserving trajectory similarity, with high spatial-temporal correlations. Furthermore, analysis of the spatial-temporal features in latent space confirms that ST-GraphRL effectively captures underlying mobility patterns. The results may also offer insights into representation learning for other geospatial data toward general-purpose data representations, promoting the progress of GeoFMs.
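The paper itself includes no code, but component (i) — a weighted directed spatial-temporal graph — can be illustrated with a minimal sketch: nodes are (location, time-slot) states, and each directed edge is weighted by how often that transition is observed across trajectories. The function name and the (location, hour) encoding below are hypothetical illustrations, not the authors' implementation.

```python
from collections import defaultdict

def build_st_graph(trajectories, n_time_slots=24):
    """Build a weighted directed spatial-temporal graph from trajectories.

    Each trajectory is a list of (location_id, hour) visits. Nodes are
    (location, time-slot) pairs; a directed edge links consecutive visits,
    weighted by how often that transition occurs across all trajectories.
    """
    edges = defaultdict(int)
    for traj in trajectories:
        # Pair each visit with its successor to enumerate transitions.
        for (loc_a, t_a), (loc_b, t_b) in zip(traj, traj[1:]):
            src = (loc_a, t_a % n_time_slots)
            dst = (loc_b, t_b % n_time_slots)
            edges[(src, dst)] += 1
    return dict(edges)

trajs = [
    [("home", 8), ("office", 9), ("cafe", 12), ("office", 13)],
    [("home", 8), ("office", 9), ("gym", 18)],
]
graph = build_st_graph(trajs)
print(graph[(("home", 8), ("office", 9))])  # transition seen in both trajectories → weight 2
```

Such an edge-weight dictionary could then feed a graph encoder; the paper's actual graph construction, time discretization, and weighting scheme may differ.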

Source journal: Computers, Environment and Urban Systems
CiteScore: 13.30
Self-citation rate: 7.40%
Annual articles: 111
Review time: 32 days
Journal overview: Computers, Environment and Urban Systems is an interdisciplinary journal publishing cutting-edge, innovative computer-based research on environmental and urban systems that privileges the geospatial perspective. The journal welcomes original, high-quality scholarship of a theoretical, applied or technological nature, and provides a stimulating presentation of perspectives, research developments, overviews of important new technologies, and uses of major computational, information-based, and visualization innovations. Applied and theoretical contributions demonstrate the scope of computer-based analysis fostering a better understanding of environmental and urban systems, their spatial scope and their dynamics.