biLSCCS: modular dynamical on-road objects trajectory prediction approach

Ivan Saetchnikov, Victor Skakun, E. Tcherniavskaia
2023 IEEE 10th International Workshop on Metrology for AeroSpace (MetroAeroSpace)
Published: 2023-06-19 · DOI: 10.1109/MetroAeroSpace57412.2023.10190032

Abstract

Dynamical object trajectory prediction is one of the most challenging tasks in computer vision and a pressing problem in robotics and autonomous driving. This paper presents biLSCCS, a novel dynamical object trajectory prediction framework built as a six-step pipeline: a YOLOv5-based object detector, a bidirectional LSTM encoder, mean-shift multimodal clustering, a 4-layer MLP-based classifier, synthesis, and a bidirectional LSTM decoder. The proposed approach has been evaluated on two benchmark datasets for object tracking and trajectory prediction, ETH and Stanford Drone, both of which contain on-road dynamic objects observed from an aerial view. The experimental findings show that biLSCCS achieves accuracy and robustness competitive with the SGAN and STAR methods: it reaches ADE, FDE, and IoU scores of 0.32, 0.72, and 0.55, respectively, on the ETH dataset, and 0.22, 0.51, and 0.49, respectively, on the Stanford Drone dataset.
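The abstract does not spell out how the reported metrics are computed, but ADE, FDE, and IoU have standard definitions in trajectory-prediction and detection benchmarks: ADE is the mean Euclidean distance between predicted and ground-truth points over the whole horizon, FDE is the distance at the final time step, and IoU is the overlap ratio of two bounding boxes. A minimal sketch of these conventional definitions (not code from the paper) is:

```python
import math

def ade_fde(pred, gt):
    """Average Displacement Error (mean point-wise Euclidean distance)
    and Final Displacement Error (distance at the last time step) for
    two equal-length trajectories given as lists of (x, y) points."""
    dists = [math.dist(p, g) for p, g in zip(pred, gt)]
    return sum(dists) / len(dists), dists[-1]

def iou(a, b):
    """Intersection over Union of two axis-aligned boxes (x1, y1, x2, y2)."""
    iw = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))  # overlap width
    ih = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))  # overlap height
    inter = iw * ih
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union if union else 0.0
```

Under these definitions, lower ADE/FDE and higher IoU are better, which is consistent with the score pairs reported for the ETH and Stanford Drone datasets.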