{"title":"Multi-Uncertainty Captured Multi-Robot Lidar Odometry and Mapping Framework for Large-Scale Environments","authors":"Guang-ming Xiong, Junyi Ma, Huilong Yu, Jingyi Xu, Jiahui Xu","doi":"10.1142/s2301385023410030","DOIUrl":null,"url":null,"abstract":"Multi-robot simultaneous localization and mapping (MR-SLAM) is of great importance for enhancing the efficiency of large-scale environment exploration. Despite remarkable advances in schemes for cooperation, there is a critical lack of approaches to handle multiple uncertainties inherent to MR-SLAM in large-scale environments. This paper proposes a multi-uncertainty captured multi-robot lidar odometry and mapping (MUC-LOAM) framework, to quantify and utilize the uncertainties of feature points and robot mutual poses in large-scale environments. A proposed hybrid weighting strategy for pose update is integrated into MUC-LOAM to handle feature uncertainty from distance changing and dynamic objects. A devised Bayesian Neural Network (BNN) is proposed to capture mutual pose uncertainty. Then the covariance propagation of quaternions to Euler angles conversion is leveraged to filter out unreliable mutual poses. Another covariance propagation through coordinate transformations in nonlinear optimization improves the accuracy of map merging. The feasibility and enhanced robustness of the proposed framework for large-scale exploration are validated on both public datasets and real-world experiments.","PeriodicalId":164619,"journal":{"name":"Unmanned Syst.","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2022-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Unmanned Syst.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1142/s2301385023410030","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
Multi-robot simultaneous localization and mapping (MR-SLAM) is of great importance for enhancing the efficiency of large-scale environment exploration. Despite remarkable advances in cooperation schemes, there is a critical lack of approaches for handling the multiple uncertainties inherent to MR-SLAM in large-scale environments. This paper proposes a multi-uncertainty captured multi-robot lidar odometry and mapping (MUC-LOAM) framework to quantify and exploit the uncertainties of feature points and of robot mutual poses in large-scale environments. A hybrid weighting strategy for pose updates is integrated into MUC-LOAM to handle feature uncertainty arising from distance changes and dynamic objects. A Bayesian Neural Network (BNN) is designed to capture mutual pose uncertainty. Covariance propagation through the quaternion-to-Euler-angle conversion is then leveraged to filter out unreliable mutual poses. A further covariance propagation through the coordinate transformations in nonlinear optimization improves the accuracy of map merging. The feasibility and enhanced robustness of the proposed framework for large-scale exploration are validated on public datasets and in real-world experiments.
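The abstract does not spell out the propagation formulas, but the quaternion-to-Euler filtering step can be illustrated with a standard first-order scheme: the Euler-angle covariance is approximated as Σ_euler ≈ J Σ_quat Jᵀ, where J is the Jacobian of the conversion. The sketch below is a minimal illustration of that idea, not the authors' implementation; the function names, the ZYX Euler convention, the [w, x, y, z] quaternion ordering, and the numerical Jacobian are all illustrative assumptions.

```python
import numpy as np

def quat_to_euler(q):
    """Convert quaternion [w, x, y, z] to Euler angles (roll, pitch, yaw), ZYX convention."""
    w, x, y, z = q
    roll = np.arctan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    pitch = np.arcsin(np.clip(2 * (w * y - z * x), -1.0, 1.0))
    yaw = np.arctan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return np.array([roll, pitch, yaw])

def propagate_quat_cov_to_euler(q, cov_q, eps=1e-6):
    """First-order covariance propagation: cov_euler ~= J cov_q J^T.

    J is estimated by central finite differences on the raw 4-vector;
    the unit-norm constraint on q is ignored here for simplicity.
    """
    J = np.zeros((3, 4))
    for i in range(4):
        dq = np.zeros(4)
        dq[i] = eps
        J[:, i] = (quat_to_euler(q + dq) - quat_to_euler(q - dq)) / (2 * eps)
    return J @ cov_q @ J.T

# Hypothetical usage: flag a mutual pose as unreliable when any propagated
# Euler-angle variance exceeds a threshold (threshold value is illustrative).
q = np.array([0.99, 0.01, 0.02, 0.10])
q /= np.linalg.norm(q)
cov_q = 1e-4 * np.eye(4)            # assumed quaternion covariance, e.g. from a BNN
cov_euler = propagate_quat_cov_to_euler(q, cov_q)
unreliable = np.any(np.diag(cov_euler) > 1e-3)
```

Under this kind of scheme, mutual pose estimates whose propagated angular covariance is too large would simply be excluded before map merging, which matches the filtering role the abstract describes.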