{"title":"SLAM系统里程估计中图像特征点辅助点云匹配方案","authors":"You-Cheng Zhang, Y. Hwang","doi":"10.1109/RASSE54974.2022.9989637","DOIUrl":null,"url":null,"abstract":"This paper presents an image feature points assisted scheme to accelerate the process of point cloud matching in Odometry Estimation (OE) equipped with a Lidar camera. To calculate the changes of position and orientation of a camera across successive frames accurately, the corresponding point pairs between two laser point clouds must be identified first, which calls for a time-consuming iterative process. Conventional approaches utilize the laser point cloud data only and do not leverage the information of camera image to expedite the matching process. The proposed scheme analyzes the image first to identify the regions rich of feature points. Compared to flat regions, these regions serve better in point cloud matching. The size of the point could can be largely reduced by pruning out the regions less significant in terms of feature points. This speeds up the process without noticeable compromise of the matching accuracy. We implement the scheme in the odometry estimation module of a Simultaneous Localization and Mapping (SLAM) system and evaluate possible performance enhancement from the proposed scheme. Experimental results show that the enhancement in OE is more significant in a more planar environment. the time saving can be up to 18.9% and the deviation in path trajectory estimation is negligible.","PeriodicalId":382440,"journal":{"name":"2022 IEEE International Conference on Recent Advances in Systems Science and Engineering (RASSE)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2022-11-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"An Image Feature Points Assisted Point Cloud Matching Scheme in Odometry Estimation for SLAM Systems\",\"authors\":\"You-Cheng Zhang, Y. Hwang\",\"doi\":\"10.1109/RASSE54974.2022.9989637\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper presents an image feature points assisted scheme to accelerate the process of point cloud matching in Odometry Estimation (OE) equipped with a Lidar camera. To calculate the changes of position and orientation of a camera across successive frames accurately, the corresponding point pairs between two laser point clouds must be identified first, which calls for a time-consuming iterative process. Conventional approaches utilize the laser point cloud data only and do not leverage the information of camera image to expedite the matching process. The proposed scheme analyzes the image first to identify the regions rich of feature points. Compared to flat regions, these regions serve better in point cloud matching. The size of the point could can be largely reduced by pruning out the regions less significant in terms of feature points. This speeds up the process without noticeable compromise of the matching accuracy. We implement the scheme in the odometry estimation module of a Simultaneous Localization and Mapping (SLAM) system and evaluate possible performance enhancement from the proposed scheme. Experimental results show that the enhancement in OE is more significant in a more planar environment. 
the time saving can be up to 18.9% and the deviation in path trajectory estimation is negligible.\",\"PeriodicalId\":382440,\"journal\":{\"name\":\"2022 IEEE International Conference on Recent Advances in Systems Science and Engineering (RASSE)\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-11-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 IEEE International Conference on Recent Advances in Systems Science and Engineering (RASSE)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/RASSE54974.2022.9989637\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE International Conference on Recent Advances in Systems Science and Engineering (RASSE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/RASSE54974.2022.9989637","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
An Image Feature Points Assisted Point Cloud Matching Scheme in Odometry Estimation for SLAM Systems
Abstract: This paper presents an image feature point assisted scheme to accelerate point cloud matching in Odometry Estimation (OE) for a system equipped with a Lidar camera. To accurately calculate the change in a camera's position and orientation across successive frames, the corresponding point pairs between two laser point clouds must be identified first, which calls for a time-consuming iterative process. Conventional approaches use only the laser point cloud data and do not leverage the camera image to expedite the matching. The proposed scheme first analyzes the image to identify regions rich in feature points; compared with flat regions, these regions serve point cloud matching better. The size of the point cloud can be greatly reduced by pruning out regions that contain few feature points, which speeds up the matching without a noticeable loss of accuracy. We implement the scheme in the odometry estimation module of a Simultaneous Localization and Mapping (SLAM) system and evaluate the resulting performance gain. Experimental results show that the improvement in OE is more pronounced in a more planar environment: the time saving can be up to 18.9%, and the deviation in the estimated path trajectory is negligible.
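The following is a minimal sketch, not the authors' implementation, of the idea the abstract describes: detect image feature points, flag feature-rich image regions, keep only the Lidar points whose projections fall in those regions, and then run an ICP-style registration on the pruned cloud. The library choices (OpenCV ORB for feature detection, Open3D point-to-point ICP), the grid partition of the image, and the assumption that each 3D point already has a known pixel projection are all illustrative assumptions.

```python
# Sketch only: image-feature-guided pruning of a point cloud before ICP.
# Assumes the Lidar points are already projected into the camera image (pixels_uv).
import cv2
import numpy as np
import open3d as o3d

def feature_rich_mask(image, grid=(16, 16), min_kpts_per_cell=3):
    """Detect ORB keypoints and flag image cells that contain enough of them."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    keypoints = cv2.ORB_create(nfeatures=2000).detect(gray, None)
    h, w = gray.shape
    counts = np.zeros(grid, dtype=int)
    for kp in keypoints:
        u, v = kp.pt
        counts[min(int(v * grid[0] / h), grid[0] - 1),
               min(int(u * grid[1] / w), grid[1] - 1)] += 1
    return counts >= min_kpts_per_cell  # boolean mask: True = feature-rich cell

def prune_cloud(points_xyz, pixels_uv, image_shape, cell_mask, grid=(16, 16)):
    """Keep only 3D points whose image projections land in feature-rich cells."""
    h, w = image_shape[:2]
    rows = np.clip((pixels_uv[:, 1] * grid[0] / h).astype(int), 0, grid[0] - 1)
    cols = np.clip((pixels_uv[:, 0] * grid[1] / w).astype(int), 0, grid[1] - 1)
    return points_xyz[cell_mask[rows, cols]]

def match_clouds(src_xyz, dst_xyz, max_corr_dist=0.5):
    """Point-to-point ICP between the (pruned) source cloud and the target cloud."""
    src = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(src_xyz))
    dst = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(dst_xyz))
    result = o3d.pipelines.registration.registration_icp(
        src, dst, max_corr_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation  # 4x4 frame-to-frame pose change
```

Under this sketch, the speed-up comes purely from the smaller source cloud fed to ICP; the grid size and per-cell keypoint threshold are tuning knobs that trade matching accuracy against the amount of pruning.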