Gang Li, Jian Yu, Huilan Huang, Yongheng Zhu, Jinxiang Cai, Hao Luo, Xiaoman Xu, Chen Huang
{"title":"室外环境下精确视觉SLAM的动态目标移除和密集映射","authors":"Gang Li , Jian Yu , Huilan Huang , Yongheng Zhu , Jinxiang Cai , Hao Luo , Xiaoman Xu , Chen Huang","doi":"10.1016/j.measurement.2025.118172","DOIUrl":null,"url":null,"abstract":"<div><div>Visual SLAM systems face significant challenges in dynamic outdoor environments due to varying lighting conditions, the prevalence of moving objects, and distant small dynamic targets. To address these issues, we propose a stereo vision-based SLAM framework that integrates dynamic object removal and dense mapping. Potential dynamic features are identified using the moving consistency check module, and actual moving objects are eliminated via the dynamic region judgment module. The stereo camera configuration enables robust depth computation via an embedded stereo matching network, ensuring reliable metric scale estimation for dense mapping in autonomous navigation scenarios. Experimental validation on stereo-compatible datasets (KITTI, EuRoC, VIODE) demonstrates that our stereo vision-based method significantly improves trajectory accuracy in highly dynamic scenes, outperforming state-of-the-art approaches. On the 11 sequences of the KITTI dataset, our approach achieved an 11.16 % improvement in the Absolute Trajectory Error (ATE) metric compared to ORB-SLAM3. In highly dynamic scenes, the improvement in ATE reached as high as 36.40 %. Our method improves localization accuracy by 14.9 %–47.4 % compared to other state-of-the-art methods in ATE under highly dynamic conditions. Additionally, high-quality dense point cloud maps are generated, laying a solid foundation for advanced robotic applications.</div></div>","PeriodicalId":18349,"journal":{"name":"Measurement","volume":"256 ","pages":"Article 118172"},"PeriodicalIF":5.2000,"publicationDate":"2025-06-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Dynamic object removal and dense mapping for accurate visual SLAM in outdoor environments\",\"authors\":\"Gang Li , Jian Yu , Huilan Huang , Yongheng Zhu , Jinxiang Cai , Hao Luo , Xiaoman Xu , Chen Huang\",\"doi\":\"10.1016/j.measurement.2025.118172\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Visual SLAM systems face significant challenges in dynamic outdoor environments due to varying lighting conditions, the prevalence of moving objects, and distant small dynamic targets. To address these issues, we propose a stereo vision-based SLAM framework that integrates dynamic object removal and dense mapping. Potential dynamic features are identified using the moving consistency check module, and actual moving objects are eliminated via the dynamic region judgment module. The stereo camera configuration enables robust depth computation via an embedded stereo matching network, ensuring reliable metric scale estimation for dense mapping in autonomous navigation scenarios. Experimental validation on stereo-compatible datasets (KITTI, EuRoC, VIODE) demonstrates that our stereo vision-based method significantly improves trajectory accuracy in highly dynamic scenes, outperforming state-of-the-art approaches. On the 11 sequences of the KITTI dataset, our approach achieved an 11.16 % improvement in the Absolute Trajectory Error (ATE) metric compared to ORB-SLAM3. In highly dynamic scenes, the improvement in ATE reached as high as 36.40 %. 
Our method improves localization accuracy by 14.9 %–47.4 % compared to other state-of-the-art methods in ATE under highly dynamic conditions. Additionally, high-quality dense point cloud maps are generated, laying a solid foundation for advanced robotic applications.</div></div>\",\"PeriodicalId\":18349,\"journal\":{\"name\":\"Measurement\",\"volume\":\"256 \",\"pages\":\"Article 118172\"},\"PeriodicalIF\":5.2000,\"publicationDate\":\"2025-06-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Measurement\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0263224125015313\",\"RegionNum\":2,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Measurement","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0263224125015313","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, MULTIDISCIPLINARY","Score":null,"Total":0}
Dynamic object removal and dense mapping for accurate visual SLAM in outdoor environments
Visual SLAM systems face significant challenges in dynamic outdoor environments due to varying lighting conditions, the prevalence of moving objects, and distant small dynamic targets. To address these issues, we propose a stereo vision-based SLAM framework that integrates dynamic object removal and dense mapping. Potential dynamic features are identified by a moving consistency check module, and actual moving objects are eliminated by a dynamic region judgment module. The stereo camera configuration enables robust depth computation via an embedded stereo matching network, ensuring reliable metric-scale estimation for dense mapping in autonomous navigation scenarios. Experimental validation on stereo-compatible datasets (KITTI, EuRoC, VIODE) demonstrates that our stereo vision-based method significantly improves trajectory accuracy in highly dynamic scenes, outperforming state-of-the-art approaches. Across the 11 sequences of the KITTI dataset, our approach reduces the Absolute Trajectory Error (ATE) by 11.16% compared to ORB-SLAM3, and in highly dynamic scenes the improvement reaches 36.40%. Under highly dynamic conditions, our method improves localization accuracy in terms of ATE by 14.9%–47.4% relative to other state-of-the-art methods. Additionally, high-quality dense point cloud maps are generated, laying a solid foundation for advanced robotic applications.
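For context on the headline numbers above, ATE is conventionally reported as the root-mean-square translational error after a least-squares rigid alignment of the estimated trajectory to ground truth. The sketch below is a minimal, self-contained illustration of that standard computation (as used in common KITTI/TUM evaluation tooling), not the authors' evaluation code; the function names and the synthetic toy trajectory are assumptions for demonstration only.

```python
import numpy as np

def align_rigid(est, gt):
    """Least-squares SE(3) alignment of estimated positions (N,3) to ground truth (N,3)."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    E, G = est - mu_e, gt - mu_g
    cov = G.T @ E / est.shape[0]          # cross-covariance between ground truth and estimate
    U, _, Vt = np.linalg.svd(cov)
    D = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:         # guard against reflections
        D[2, 2] = -1.0
    R = U @ D @ Vt
    t = mu_g - R @ mu_e
    return R, t

def ate_rmse(est, gt):
    """Root-mean-square ATE (in meters) after rigid alignment of est onto gt."""
    R, t = align_rigid(est, gt)
    aligned = est @ R.T + t
    err = np.linalg.norm(aligned - gt, axis=1)
    return float(np.sqrt(np.mean(err ** 2)))

if __name__ == "__main__":
    # Toy check: a rotated and translated copy of a trajectory should yield ATE ~ 0.
    rng = np.random.default_rng(0)
    gt = np.cumsum(rng.normal(size=(500, 3)), axis=0)   # synthetic ground-truth path
    theta = 0.3
    Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
    est = gt @ Rz.T + np.array([5.0, -2.0, 0.1])         # same path expressed in another frame
    print(f"ATE RMSE: {ate_rmse(est, gt):.6f} m")
```

Because the stereo configuration provides metric scale, a rigid (non-similarity) alignment as above is the appropriate choice; monocular pipelines would additionally estimate a scale factor during alignment.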
Journal introduction:
Contributions are invited on novel achievements in all fields of measurement and instrumentation science and technology. Authors are encouraged to submit novel material whose ultimate goal is an advancement in the state of the art of: measurement and metrology fundamentals; sensors; measurement instruments; measurement and estimation techniques; measurement data processing and fusion algorithms; evaluation procedures and methodologies for plants and industrial processes; performance analysis of systems, processes and algorithms; mathematical models for measurement-oriented purposes; and distributed measurement systems in a connected world.