Multi-Sensor Fusion Localization and Mapping of Indoor Mobile Robot

Zhongwei Hua, Dongdong He
2022 5th International Conference on Advanced Electronic Materials, Computers and Software Engineering (AEMCSE), 2022-04-01
DOI: 10.1109/AEMCSE55572.2022.00008
Due to overlapping objects, varied materials, and uneven lighting in indoor environments, a mobile robot equipped with only a single sensor cannot achieve accurate localization and complete mapping. To address this problem, this paper studies multi-sensor fusion localization and mapping for indoor mobile robots, fusing data from a 2D lidar, a depth camera, an IMU, and wheel encoders. Specifically, on the one hand, an extended Kalman filter fuses the wheel odometry computed from the encoder data with the inertial measurement unit (IMU) data, which reduces drift error and improves the accuracy of the robot's pose estimate. On the other hand, a region-proximity algorithm integrates richer visual information from the depth camera into the 2D laser data, compensating for the limited spatial perception of single-line laser mapping and improving the spatial completeness of the robot's map. Simulation experiments verify that the proposed method effectively improves the localization accuracy and mapping completeness of the indoor robot.
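The abstract does not give the paper's filter equations, but the odometry/IMU fusion it describes follows the standard extended Kalman filter pattern: predict the planar pose [x, y, θ] from encoder-derived wheel odometry, then correct the heading with an IMU yaw reading. The sketch below is an illustrative minimal version under those assumptions (the class name, noise values, and unicycle motion model are this sketch's choices, not the paper's):

```python
import numpy as np

class OdomImuEKF:
    """Minimal planar EKF fusing wheel odometry (prediction step) with
    an IMU yaw reading (correction step). State: [x, y, theta]."""

    def __init__(self):
        self.x = np.zeros(3)                  # pose estimate [x, y, theta]
        self.P = np.eye(3) * 0.1              # state covariance
        self.Q = np.diag([0.02, 0.02, 0.01])  # process noise (odometry drift)
        self.R = np.array([[0.005]])          # IMU yaw measurement noise

    def predict(self, v, w, dt):
        """Propagate the pose with a unicycle motion model driven by
        encoder-derived linear speed v and angular speed w."""
        th = self.x[2]
        self.x += np.array([v * np.cos(th) * dt,
                            v * np.sin(th) * dt,
                            w * dt])
        # Jacobian of the motion model with respect to the state
        F = np.array([[1.0, 0.0, -v * np.sin(th) * dt],
                      [0.0, 1.0,  v * np.cos(th) * dt],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update_yaw(self, yaw_meas):
        """Correct the heading with an absolute yaw measurement."""
        H = np.array([[0.0, 0.0, 1.0]])                 # measures theta only
        y = np.array([yaw_meas - self.x[2]])            # innovation
        y[0] = (y[0] + np.pi) % (2 * np.pi) - np.pi     # wrap to [-pi, pi)
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)             # Kalman gain
        self.x += (K @ y).ravel()
        self.P = (np.eye(3) - K @ H) @ self.P
```

Because wheel odometry integrates noise without bound while the IMU yaw is an independent heading observation, the correction step suppresses exactly the drift in θ that dominates dead-reckoning error, which is the effect the paper attributes to its EKF fusion.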