Loop-Closure Detection with 3D LiDAR Data for Extreme Viewpoint Changes

Dimitrios Alexiou, Kosmas Tsiakas, I. Kostavelis, Dimitrios Giakoumis, A. Gasteratos, D. Tzovaras
{"title":"Loop-Closure Detection with 3D LiDAR Data for Extreme Viewpoint Changes","authors":"Dimitrios Alexiou, Kosmas Tsiakas, I. Kostavelis, Dimitrios Giakoumis, A. Gasteratos, D. Tzovaras","doi":"10.1109/MMAR55195.2022.9874344","DOIUrl":null,"url":null,"abstract":"Loop closure detection remains a challenging task for future autonomous robots operating in outdoor scenarios. When compared with RGB cameras, LiDAR sensors appear to be more appropriate for applications that involve significant environmental changes and various illumination conditions, facilitating more robust correlation of sensor measurements and estimation of the robot's global localization. The paper at hand presents a 3D point cloud-based method for loop closure detection that is tolerant to extreme viewpoint changes. Our method utilizes local 3D geometrical descriptors to tackle scenarios where the robot passes from the same place, yet with completely opposite direction, and is capable of understanding the similarity of the revisited area, in complete absence of common visual data in respective RGB images. To achieve this, rotation invariant Fast Point Feature Histograms (FPFHs) calculated over the Unsupervised Stable Interest Point Detection (USIP) keypoints formulate a descriptor matrix, upon which the similarity score for previously re-visited scenes is calculated. Probabilistic voting is applied to extract the top loop closure candidates and a geometric validation step is used for the final matching decision. 
Our method has been extensively verified on the state-of-art MulRan dataset as well as in a custom-built dataset11Publicly available dataset: https://github.com/dalexiou48/cav_pr_dataset acquired from an autonomous vehicle, that focuses on opposite traversing routes using a low-resolution LiDAR sensor.","PeriodicalId":169528,"journal":{"name":"2022 26th International Conference on Methods and Models in Automation and Robotics (MMAR)","volume":"10 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-08-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 26th International Conference on Methods and Models in Automation and Robotics (MMAR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/MMAR55195.2022.9874344","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Loop closure detection remains a challenging task for future autonomous robots operating in outdoor scenarios. Compared with RGB cameras, LiDAR sensors appear to be more appropriate for applications that involve significant environmental changes and varying illumination conditions, facilitating more robust correlation of sensor measurements and estimation of the robot's global localization. The paper at hand presents a 3D point cloud-based method for loop closure detection that is tolerant to extreme viewpoint changes. Our method utilizes local 3D geometrical descriptors to tackle scenarios where the robot passes through the same place but in the completely opposite direction, and is capable of recognizing the similarity of the revisited area even in the complete absence of shared visual content in the corresponding RGB images. To achieve this, rotation-invariant Fast Point Feature Histograms (FPFHs), calculated over Unsupervised Stable Interest Point Detection (USIP) keypoints, form a descriptor matrix upon which the similarity score for previously revisited scenes is calculated. Probabilistic voting is applied to extract the top loop closure candidates, and a geometric validation step is used for the final matching decision. Our method has been extensively verified on the state-of-the-art MulRan dataset as well as on a custom-built dataset (publicly available at https://github.com/dalexiou48/cav_pr_dataset) acquired from an autonomous vehicle, which focuses on opposite traversing routes using a low-resolution LiDAR sensor.
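The matching stage described above (a descriptor matrix per scan, a similarity score between scans, and a ranking of loop-closure candidates) can be sketched in simplified form. The sketch below assumes FPFH-style descriptor matrices are already available for each scan (e.g. computed with Open3D's `compute_fpfh_feature` over detected keypoints); the nearest-neighbour scoring and the ranking function are illustrative stand-ins for the paper's exact similarity measure and probabilistic voting, and the function names are hypothetical.

```python
import numpy as np

def scan_similarity(desc_a: np.ndarray, desc_b: np.ndarray) -> float:
    """Score two scans given their keypoint descriptor matrices (N x D, M x D).

    For each descriptor in A, take the Euclidean distance to its nearest
    neighbour in B; a low mean distance means the two scans share local
    geometry regardless of keypoint ordering (and hence of traversal
    direction, since the descriptors themselves are rotation invariant).
    """
    # Pairwise distance matrix via broadcasting: shape (N, M).
    dists = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=2)
    mean_nn = dists.min(axis=1).mean()
    return 1.0 / (1.0 + mean_nn)  # map distance in [0, inf) to score in (0, 1]

def top_loop_candidates(query: np.ndarray, database: list, k: int = 3):
    """Rank database scans against the query; return (indices, scores)."""
    scores = np.array([scan_similarity(query, d) for d in database])
    order = np.argsort(scores)[::-1][:k]  # highest score first
    return order, scores[order]
```

Because the score depends only on nearest-neighbour descriptor distances, reordering a scan's keypoints (as happens when the same place is traversed in the opposite direction) leaves the score unchanged, which is the property the method exploits.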