Pedestrian trajectory prediction via physical-guided position association learning

Yueyun Xu, Hongmao Qin, Yougang Bian, Rongjun Ding
Engineering Science and Technology, an International Journal (JESTECH), Volume 64, Article 102008
DOI: 10.1016/j.jestch.2025.102008 · Published 2025-03-08 · https://www.sciencedirect.com/science/article/pii/S2215098625000631
IF 5.1 · JCR Q1 (Engineering, Multidisciplinary) · CAS Zone 2 (Engineering Technology)
Citations: 0

Abstract

Pedestrian trajectory prediction has great application value in autonomous driving, robotics, and video surveillance. Owing to the complexity of the environment and the uncertainty of pedestrian behavior, predicting pedestrian trajectories is a challenging task. Previous studies simply employ an LSTM or transformer structure to construct the deep model, which can hardly mine the dependency relationships among different pedestrian positions from different views. In addition, directly using the deep model to output the prediction results makes them easily disturbed by external factors. To this end, we propose the Physical-guided Position Association Learning (PPAL) method to fully explore the inter-position dependency relationships under the guidance of physical motion rules. Specifically, to build both long- and short-distance relationships, we develop the position association learning (PAL) module, which deeply correlates different position coordinates by exploiting the complementary advantages of the LSTM and transformer structures and thus encourages the deep model to better perceive pedestrian intention. In addition, the future trajectory is strongly correlated with the previous positions and speed, and these physical motion rules provide prior knowledge that increases the plausibility of the predicted trajectories. Hence, we design physical position modeling (PPM) to exploit the motion rules for trajectory prediction. Finally, we integrate PAL and PPM into a single framework to deeply learn the inter-position dependency relationships. Extensive experiments on three mainstream benchmarks demonstrate that the proposed PPAL significantly improves prediction performance and surpasses other advanced methods. A large number of quantitative analyses show that the predicted trajectories are very close to the real ones, indicating that the proposed method has better forecasting ability.
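The abstract does not give implementation details, but the two components it names, a position association module that couples an LSTM with a transformer encoder, and a physics-based position model driven by previous position and speed, can be illustrated with a rough sketch. The code below is a minimal PyTorch-style illustration of that idea, not the authors' implementation: the class names, layer sizes, prediction horizon, and the constant-velocity rule standing in for the "physical motion rule" are all assumptions made for the example.

```python
# Minimal sketch of the two ideas described in the abstract, NOT the authors' code.
# Assumptions: observed trajectories are (batch, T_obs, 2) xy coordinates with T_obs >= 2,
# the prediction horizon is T_pred steps, and a constant-velocity model plays the role
# of the "physical motion rule". Layer sizes are arbitrary.
import torch
import torch.nn as nn


class PositionAssociationLearning(nn.Module):
    """LSTM + transformer hybrid that correlates observed positions (PAL-like)."""

    def __init__(self, d_model=64, t_pred=12):
        super().__init__()
        self.embed = nn.Linear(2, d_model)
        self.lstm = nn.LSTM(d_model, d_model, batch_first=True)  # short-range, ordered dependencies
        encoder_layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.attn = nn.TransformerEncoder(encoder_layer, num_layers=2)  # long-range position association
        self.head = nn.Linear(d_model, 2 * t_pred)  # offsets for T_pred future steps
        self.t_pred = t_pred

    def forward(self, obs):                       # obs: (B, T_obs, 2)
        h, _ = self.lstm(self.embed(obs))         # sequential encoding of the observed positions
        h = self.attn(h)                          # pairwise association across all observed steps
        offsets = self.head(h[:, -1])             # predict from the last encoded step
        return offsets.view(-1, self.t_pred, 2)   # (B, T_pred, 2)


def physical_position_prior(obs, t_pred=12):
    """Constant-velocity extrapolation standing in for the physical motion rule (PPM-like)."""
    vel = obs[:, -1] - obs[:, -2]                            # last observed velocity, (B, 2)
    steps = torch.arange(1, t_pred + 1, device=obs.device)   # 1 .. T_pred
    return obs[:, -1:] + steps.view(1, -1, 1) * vel.unsqueeze(1)  # (B, T_pred, 2)


class PPALSketch(nn.Module):
    """Physics prior plus a learned residual, mirroring the PAL + PPM combination at a high level."""

    def __init__(self, t_pred=12):
        super().__init__()
        self.pal = PositionAssociationLearning(t_pred=t_pred)
        self.t_pred = t_pred

    def forward(self, obs):
        prior = physical_position_prior(obs, self.t_pred)  # where the motion rule says the pedestrian goes
        return prior + self.pal(obs)                        # learned correction on top of the physics prior
```

In this arrangement the physics prior supplies a plausible baseline trajectory and the learned module only regresses a correction on top of it (trained, for instance, with an L2 loss against the ground-truth future positions); this is one common way to combine a motion rule with a deep model, and the actual PPAL design may differ.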
Source Journal
Engineering Science and Technology, an International Journal (JESTECH)
Category: Materials Science - Electronic, Optical and Magnetic Materials
CiteScore: 11.20 · Self-citation rate: 3.50% · Articles published: 153 · Time to review: 22 days
Journal description: Engineering Science and Technology, an International Journal (JESTECH) (formerly Technology), a peer-reviewed quarterly engineering journal, publishes both theoretical and experimental high-quality papers of permanent interest, not previously published in journals, in the field of engineering and applied science, and aims to promote the theory and practice of technology and engineering. In addition to peer-reviewed original research papers, the Editorial Board welcomes original research reports, state-of-the-art reviews, and communications in the broadly defined field of engineering science and technology. The scope of JESTECH includes a wide spectrum of subjects, including:
-Electrical/Electronics and Computer Engineering (Biomedical Engineering and Instrumentation; Coding, Cryptography, and Information Protection; Communications, Networks, Mobile Computing and Distributed Systems; Compilers and Operating Systems; Computer Architecture, Parallel Processing, and Dependability; Computer Vision and Robotics; Control Theory; Electromagnetic Waves, Microwave Techniques and Antennas; Embedded Systems; Integrated Circuits, VLSI Design, Testing, and CAD; Microelectromechanical Systems; Microelectronics, and Electronic Devices and Circuits; Power, Energy and Energy Conversion Systems; Signal, Image, and Speech Processing)
-Mechanical and Civil Engineering (Automotive Technologies; Biomechanics; Construction Materials; Design and Manufacturing; Dynamics and Control; Energy Generation, Utilization, Conversion, and Storage; Fluid Mechanics and Hydraulics; Heat and Mass Transfer; Micro-Nano Sciences; Renewable and Sustainable Energy Technologies; Robotics and Mechatronics; Solid Mechanics and Structure; Thermal Sciences)
-Metallurgical and Materials Engineering (Advanced Materials Science; Biomaterials; Ceramic and Inorganic Materials; Electronic-Magnetic Materials; Energy and Environment; Materials Characterization; Metallurgy; Polymers and Nanocomposites)