{"title":"DFT-VSLAM:动态光流跟踪 VSLAM 方法","authors":"Dupeng Cai, Shijiang Li, Wenlu Qi, Kunkun Ding, Junlin Lu, Guangfeng Liu, Zhuhua Hu","doi":"10.1007/s10846-024-02171-7","DOIUrl":null,"url":null,"abstract":"<p>Visual Simultaneous Localization and Mapping (VSLAM) technology can provide reliable visual localization and mapping capabilities for critical tasks. Existing VSLAM can extract accurate feature points in static environments for matching and pose estimation, and then build environmental map. However, in dynamic environments, the feature points extracted by the VSLAM system will become inaccurate points as the object moves, which not only leads to tracking failure but also seriously affects the accuracy of the environmental map. To alleviate these challenges, we propose a dynamic target-aware optical flow tracking method based on YOLOv8. Firstly, we use YOLOv8 to identify moving targets in the environment, and propose a method to eliminate dynamic points in the dynamic contour region. Secondly, we use the optical flow mask method to identify dynamic feature points outside the target detection object frame. Thirdly, we comprehensively eliminate the dynamic feature points. Finally, we combine the geometric and semantic information of static map points to construct the semantic map of the environment. We used ATE (Absolute Trajectory Error) and RPE (Relative Pose Error) as evaluation metrics and compared the original method with our method on the TUM dataset. The accuracy of our method is significantly improved, especially 96.92% on walking_xyz dataset. The experimental results show that our proposed method can significantly improve the overall performance of VSLAM systems under high dynamic environments.</p>","PeriodicalId":54794,"journal":{"name":"Journal of Intelligent & Robotic Systems","volume":"206 1","pages":""},"PeriodicalIF":3.1000,"publicationDate":"2024-09-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"DFT-VSLAM: A Dynamic Optical Flow Tracking VSLAM Method\",\"authors\":\"Dupeng Cai, Shijiang Li, Wenlu Qi, Kunkun Ding, Junlin Lu, Guangfeng Liu, Zhuhua Hu\",\"doi\":\"10.1007/s10846-024-02171-7\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Visual Simultaneous Localization and Mapping (VSLAM) technology can provide reliable visual localization and mapping capabilities for critical tasks. Existing VSLAM can extract accurate feature points in static environments for matching and pose estimation, and then build environmental map. However, in dynamic environments, the feature points extracted by the VSLAM system will become inaccurate points as the object moves, which not only leads to tracking failure but also seriously affects the accuracy of the environmental map. To alleviate these challenges, we propose a dynamic target-aware optical flow tracking method based on YOLOv8. Firstly, we use YOLOv8 to identify moving targets in the environment, and propose a method to eliminate dynamic points in the dynamic contour region. Secondly, we use the optical flow mask method to identify dynamic feature points outside the target detection object frame. Thirdly, we comprehensively eliminate the dynamic feature points. Finally, we combine the geometric and semantic information of static map points to construct the semantic map of the environment. 
We used ATE (Absolute Trajectory Error) and RPE (Relative Pose Error) as evaluation metrics and compared the original method with our method on the TUM dataset. The accuracy of our method is significantly improved, especially 96.92% on walking_xyz dataset. The experimental results show that our proposed method can significantly improve the overall performance of VSLAM systems under high dynamic environments.</p>\",\"PeriodicalId\":54794,\"journal\":{\"name\":\"Journal of Intelligent & Robotic Systems\",\"volume\":\"206 1\",\"pages\":\"\"},\"PeriodicalIF\":3.1000,\"publicationDate\":\"2024-09-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Intelligent & Robotic Systems\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1007/s10846-024-02171-7\",\"RegionNum\":4,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Intelligent & Robotic Systems","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1007/s10846-024-02171-7","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
DFT-VSLAM: A Dynamic Optical Flow Tracking VSLAM Method
Visual Simultaneous Localization and Mapping (VSLAM) technology can provide reliable visual localization and mapping capabilities for critical tasks. Existing VSLAM systems can extract accurate feature points in static environments for matching and pose estimation, and then build an environmental map. In dynamic environments, however, the feature points extracted by a VSLAM system become inaccurate as objects move, which not only causes tracking failures but also seriously degrades the accuracy of the environmental map. To alleviate these challenges, we propose a dynamic target-aware optical flow tracking method based on YOLOv8. First, we use YOLOv8 to identify moving targets in the environment and propose a method to eliminate dynamic points within the dynamic contour region. Second, we use an optical flow mask to identify dynamic feature points outside the detected object bounding boxes. Third, we comprehensively eliminate the dynamic feature points. Finally, we combine the geometric and semantic information of static map points to construct a semantic map of the environment. We use ATE (Absolute Trajectory Error) and RPE (Relative Pose Error) as evaluation metrics and compare the original method with ours on the TUM dataset. The accuracy of our method is significantly improved, by 96.92% on the walking_xyz sequence. The experimental results show that the proposed method can significantly improve the overall performance of VSLAM systems in highly dynamic environments.
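To make the two-stage idea (semantic detection plus flow-based verification) concrete, the sketch below shows one simple way to flag dynamic feature points by combining YOLOv8 detections with an optical-flow consistency test. This is a minimal illustration, not the authors' implementation: it assumes the ultralytics YOLOv8 Python API and OpenCV's Lucas-Kanade tracker, and the helper names (detect_dynamic_boxes, filter_dynamic_points), the class set, the deviation threshold, and the frame file names are all placeholders chosen for the example. The paper's actual method additionally removes points inside dynamic contour regions and builds a semantic map from the remaining static points.

# Illustrative sketch (not the authors' code): flag dynamic feature points by
# combining YOLOv8 detections with a simple optical-flow consistency test.
import cv2
import numpy as np
from ultralytics import YOLO  # pip install ultralytics

DYNAMIC_CLASSES = {0}          # COCO class 0 = "person"; assumed set of movable objects
FLOW_DEVIATION_THRESH = 2.0    # pixels; assumed threshold for the flow-consistency test

model = YOLO("yolov8n.pt")     # any YOLOv8 detection checkpoint

def detect_dynamic_boxes(frame):
    """Return [x1, y1, x2, y2] boxes of potentially moving objects in the frame."""
    result = model(frame, verbose=False)[0]
    boxes = []
    for box, cls in zip(result.boxes.xyxy.cpu().numpy(),
                        result.boxes.cls.cpu().numpy()):
        if int(cls) in DYNAMIC_CLASSES:
            boxes.append(box)
    return boxes

def filter_dynamic_points(prev_gray, gray, prev_pts, boxes):
    """Keep only feature points that look static.

    A point is discarded if it falls inside a detected dynamic box, or if its
    optical-flow displacement deviates strongly from the median displacement
    of all tracked points (a crude stand-in for the paper's optical flow mask).
    """
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, prev_pts, None)
    good = status.ravel() == 1
    prev_ok = prev_pts[good].reshape(-1, 2)
    next_ok = next_pts[good].reshape(-1, 2)

    flow = next_ok - prev_ok
    median_flow = np.median(flow, axis=0)
    deviation = np.linalg.norm(flow - median_flow, axis=1)
    keep = deviation < FLOW_DEVIATION_THRESH

    for x1, y1, x2, y2 in boxes:
        inside = ((next_ok[:, 0] >= x1) & (next_ok[:, 0] <= x2) &
                  (next_ok[:, 1] >= y1) & (next_ok[:, 1] <= y2))
        keep &= ~inside
    return next_ok[keep]

# Usage: read two consecutive frames (placeholder file names) and report how
# many feature points survive the dynamic-point filtering.
prev = cv2.imread("frame_000.png")
curr = cv2.imread("frame_001.png")
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
curr_gray = cv2.cvtColor(curr, cv2.COLOR_BGR2GRAY)
pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500, qualityLevel=0.01, minDistance=7)
static_pts = filter_dynamic_points(prev_gray, curr_gray, pts, detect_dynamic_boxes(curr))
print(f"{len(static_pts)} static feature points kept for pose estimation")

In a full system, the median-deviation test above would be replaced by the paper's flow mask (or an epipolar-constraint check), and the surviving static points would feed matching, pose estimation, and semantic mapping; the sketch only conveys how detection and optical flow are combined to reject dynamic points.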
Journal Introduction:
The Journal of Intelligent and Robotic Systems bridges the gap between theory and practice in all areas of intelligent systems and robotics. It publishes original, peer-reviewed contributions spanning initial concept and theory, prototyping, and final product development and commercialization.
On the theoretical side, the journal features papers focusing on intelligent systems engineering, distributed intelligence systems, multi-level systems, intelligent control, multi-robot systems, cooperation and coordination of unmanned vehicle systems, etc.
On the application side, the journal emphasizes autonomous systems, industrial robotic systems, multi-robot systems, aerial vehicles, mobile robot platforms, underwater robots, sensors, sensor-fusion, and sensor-based control. Readers will also find papers on real applications of intelligent and robotic systems (e.g., mechatronics, manufacturing, biomedical, underwater, humanoid, mobile/legged robot and space applications, etc.).