{"title":"复杂环境下室内移动机器人的视觉定位策略","authors":"Xiaohan Lei, Fei Zhang, Junyi Zhou, Weiwei Shang","doi":"10.1109/ICMA54519.2022.9856360","DOIUrl":null,"url":null,"abstract":"Vision-based mobile positioning technology has a broad application prospect. Still, it is easy to be disturbed by external environmental factors, and the positioning accuracy and robustness in a complex environment are poor. Therefore, this paper designs a high-precision visual positioning strategy for a complex environment via fusing stereo visual odometry and Inertial Measurement Unit (IMU) data. A multi-sensor calibration method is utilized to compensate for the measurement error of IMU and the parameter error of the stereo camera. A multi-sensor data synchronization alignment method based on timestamp is also designed to realize the synchronous acquisition and processing of multi-sensor data. Based on the Unscented Kalman Filter (UKF) algorithm, we implement a nonlinear data coupling method to fuse the stereo visual odometry and IMU information to obtain high-precision positioning. In the complex and open laboratory environment, the experimental results of fused localization show that the accuracy and robustness of the mobile robot localization are significantly improved. The global maximum error is reduced by 15%, and the variance is reduced by 5%.","PeriodicalId":120073,"journal":{"name":"2022 IEEE International Conference on Mechatronics and Automation (ICMA)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2022-08-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Visual Localization Strategy for Indoor Mobile Robots in the Complex Environment\",\"authors\":\"Xiaohan Lei, Fei Zhang, Junyi Zhou, Weiwei Shang\",\"doi\":\"10.1109/ICMA54519.2022.9856360\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Vision-based mobile positioning technology has a broad application prospect. Still, it is easy to be disturbed by external environmental factors, and the positioning accuracy and robustness in a complex environment are poor. Therefore, this paper designs a high-precision visual positioning strategy for a complex environment via fusing stereo visual odometry and Inertial Measurement Unit (IMU) data. A multi-sensor calibration method is utilized to compensate for the measurement error of IMU and the parameter error of the stereo camera. A multi-sensor data synchronization alignment method based on timestamp is also designed to realize the synchronous acquisition and processing of multi-sensor data. Based on the Unscented Kalman Filter (UKF) algorithm, we implement a nonlinear data coupling method to fuse the stereo visual odometry and IMU information to obtain high-precision positioning. In the complex and open laboratory environment, the experimental results of fused localization show that the accuracy and robustness of the mobile robot localization are significantly improved. 
The global maximum error is reduced by 15%, and the variance is reduced by 5%.\",\"PeriodicalId\":120073,\"journal\":{\"name\":\"2022 IEEE International Conference on Mechatronics and Automation (ICMA)\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-08-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 IEEE International Conference on Mechatronics and Automation (ICMA)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICMA54519.2022.9856360\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE International Conference on Mechatronics and Automation (ICMA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICMA54519.2022.9856360","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Visual Localization Strategy for Indoor Mobile Robots in the Complex Environment
Vision-based mobile positioning technology has broad application prospects. However, it is easily disturbed by external environmental factors, and its positioning accuracy and robustness in complex environments are poor. Therefore, this paper designs a high-precision visual positioning strategy for complex environments by fusing stereo visual odometry and Inertial Measurement Unit (IMU) data. A multi-sensor calibration method is used to compensate for the measurement error of the IMU and the parameter error of the stereo camera. A timestamp-based multi-sensor data synchronization and alignment method is also designed to realize synchronous acquisition and processing of the multi-sensor data. Based on the Unscented Kalman Filter (UKF) algorithm, we implement a nonlinear data coupling method that fuses the stereo visual odometry and IMU information to obtain high-precision positioning. Experiments in a complex, open laboratory environment show that the fused localization significantly improves the accuracy and robustness of mobile robot localization: the global maximum error is reduced by 15%, and the variance is reduced by 5%.
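To make the timestamp-based synchronization step more concrete, the minimal Python sketch below pairs each stereo frame with the nearest IMU sample by timestamp. It is an illustrative reading of the idea, not the authors' implementation; the sensor rates and the `max_offset` tolerance are assumed values.

```python
import numpy as np

def align_by_timestamp(cam_stamps, imu_stamps, max_offset=0.005):
    """Pair each camera frame with the nearest IMU sample by timestamp.

    cam_stamps, imu_stamps : 1-D arrays of timestamps in seconds.
    max_offset             : reject pairs whose time difference exceeds this (s), assumed value.
    Returns a list of (camera_index, imu_index) pairs.
    """
    pairs = []
    for ci, t in enumerate(cam_stamps):
        # Index of the IMU sample closest in time to this camera frame.
        ii = int(np.argmin(np.abs(imu_stamps - t)))
        if abs(imu_stamps[ii] - t) <= max_offset:
            pairs.append((ci, ii))
    return pairs

# Example: 15 Hz stereo frames against 200 Hz IMU samples (rates assumed for illustration).
cam_stamps = np.arange(0.0, 1.0, 1 / 15)
imu_stamps = np.arange(0.0, 1.0, 1 / 200)
print(align_by_timestamp(cam_stamps, imu_stamps)[:5])
```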
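The UKF-based coupling of stereo visual odometry and IMU data could be sketched roughly as follows. This sketch uses the third-party FilterPy library and a deliberately simplified planar state [x, y, yaw, v]; the state layout, process and measurement models, and noise settings are illustrative assumptions and do not reproduce the paper's formulation.

```python
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

# Simplified planar state: [x, y, yaw, v]. IMU yaw rate and forward acceleration
# drive the process model; visual odometry observes (x, y, yaw).
DT = 1 / 15  # assumed stereo frame period in seconds

def fx(state, dt, omega=0.0, accel=0.0):
    """Process model: integrate IMU yaw rate and forward acceleration.
    Angle wrapping of yaw is omitted for brevity."""
    x, y, yaw, v = state
    yaw_new = yaw + omega * dt
    v_new = v + accel * dt
    return np.array([
        x + v_new * np.cos(yaw_new) * dt,
        y + v_new * np.sin(yaw_new) * dt,
        yaw_new,
        v_new,
    ])

def hx(state):
    """Measurement model: visual odometry observes position and heading."""
    return state[:3]

points = MerweScaledSigmaPoints(n=4, alpha=1e-3, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=4, dim_z=3, dt=DT, fx=fx, hx=hx, points=points)
ukf.x = np.zeros(4)
ukf.P *= 0.1
ukf.Q = np.diag([1e-4, 1e-4, 1e-5, 1e-3])   # process noise (assumed values)
ukf.R = np.diag([5e-3, 5e-3, 1e-3])          # visual-odometry noise (assumed values)

def fuse_step(imu_omega, imu_accel, vo_pose):
    """One predict/update cycle: the IMU drives the prediction, VO corrects it."""
    ukf.predict(omega=imu_omega, accel=imu_accel)
    ukf.update(np.asarray(vo_pose))
    return ukf.x.copy()

# Example: one fusion step with a small turn and a noisy visual-odometry fix.
fused = fuse_step(imu_omega=0.1, imu_accel=0.05, vo_pose=[0.01, 0.0, 0.002])
print(fused)
```

In this arrangement the high-rate IMU drives the prediction step while each visual-odometry pose corrects it, which is one common way to realize the kind of nonlinear coupling the abstract describes.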