Loop-Closure Detection with 3D LiDAR Data for Extreme Viewpoint Changes
Dimitrios Alexiou, Kosmas Tsiakas, I. Kostavelis, Dimitrios Giakoumis, A. Gasteratos, D. Tzovaras
2022 26th International Conference on Methods and Models in Automation and Robotics (MMAR), published 2022-08-22
DOI: 10.1109/MMAR55195.2022.9874344
Citation count: 0
Abstract
Loop closure detection remains a challenging task for future autonomous robots operating in outdoor scenarios. Compared with RGB cameras, LiDAR sensors are better suited to applications that involve significant environmental changes and varying illumination conditions, enabling more robust correlation of sensor measurements and estimation of the robot's global localization. This paper presents a 3D point-cloud-based method for loop closure detection that is tolerant to extreme viewpoint changes. Our method uses local 3D geometrical descriptors to handle scenarios where the robot passes through the same place from the completely opposite direction, and it can recognize the similarity of the revisited area even in the complete absence of common visual data in the corresponding RGB images. To achieve this, rotation-invariant Fast Point Feature Histograms (FPFHs), computed over Unsupervised Stable Interest Point Detection (USIP) keypoints, form a descriptor matrix from which a similarity score against previously visited scenes is calculated. Probabilistic voting extracts the top loop closure candidates, and a geometric validation step makes the final matching decision. Our method has been extensively verified on the state-of-the-art MulRan dataset, as well as on a custom-built dataset (publicly available at https://github.com/dalexiou48/cav_pr_dataset) acquired from an autonomous vehicle, which focuses on opposite traversal routes using a low-resolution LiDAR sensor.
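To make the pipeline in the abstract concrete, the following is a minimal NumPy sketch of the descriptor-matrix comparison and candidate-ranking stages. It assumes each scan has already been reduced to an (N, 33) matrix of rotation-invariant FPFH descriptors computed at keypoints (the USIP/FPFH extraction itself is not reproduced here); the `match_thresh` value, the nearest-neighbor scoring rule, and the function names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def similarity_score(desc_q, desc_c, match_thresh=0.3):
    """Fraction of query descriptors with a close match in a candidate scan.

    desc_q, desc_c: (N, D) descriptor matrices, e.g. rows of 33-D FPFH
    descriptors at keypoints. Because FPFH is rotation invariant, the score
    is insensitive to the direction in which the place is traversed.
    match_thresh is an assumed tuning parameter, not a value from the paper.
    """
    # Pairwise Euclidean distances between all query/candidate descriptors.
    dists = np.linalg.norm(desc_q[:, None, :] - desc_c[None, :, :], axis=2)
    nearest = dists.min(axis=1)          # best match per query descriptor
    return float(np.mean(nearest < match_thresh))

def top_candidates(desc_q, database, k=3):
    """Rank previously seen scans by similarity; each close descriptor
    match acts as a vote for that scan. Stand-in for the probabilistic
    voting stage; the survivors would go on to geometric validation."""
    scores = [(i, similarity_score(desc_q, d)) for i, d in enumerate(database)]
    scores.sort(key=lambda s: -s[1])
    return scores[:k]
```

A revisited place, even one entered from the opposite direction, should yield a descriptor matrix whose rows find close matches in the stored scan, pushing it to the top of the ranking before the geometric validation step makes the final decision.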