GDO-SLAM: Visual-Based Ground-Aware Decoupling Optimized SLAM for UGV in Outdoor Environments

IF 4.3 · CAS Tier 2 · Comprehensive journal · JCR Q1 · ENGINEERING, ELECTRICAL & ELECTRONIC
Chu Wu;Xu Li;Dong Kong;Yue Hu;Peizhou Ni
DOI: 10.1109/JSEN.2024.3452114
Journal: IEEE Sensors Journal, vol. 24, no. 22, pp. 37218-37228
Publication date: 2024-10-07 (Journal Article)
URL: https://ieeexplore.ieee.org/document/10706752/
Citations: 0

Abstract

Due to the homogeneity of the ground in outdoor scenes, i.e., its self-similar textures, ground features are prone to inaccurate or even incorrect matching. Such mismatches inevitably introduce additional errors into the reprojection error function, which in turn degrades the accuracy of simultaneous localization and mapping (SLAM). In this article, we propose a ground-aware decoupling optimized SLAM, called GDO-SLAM, which is essentially a pruned, semantics-guided SLAM in which a custom ground decoupling optimization module is introduced into the tracking and local mapping threads of ORB-SLAM2. Essentially, this module is a decoupling constraint that increases the weights of the vertical observations of ground features and reduces the weights of their horizontal observations in the reprojection error function. Specifically, we design a novel ground segmentation network that achieves an optimal balance between accuracy and real-time performance, and verify its ground-category IoU of 98.6% on the urban landscape dataset. Extensive experiments on both the public KITTI dataset and our self-collected dataset demonstrate that GDO-SLAM outperforms the representative baseline ORB-SLAM2 in translation and rotation accuracy by 7.5% and 8.3%, respectively.
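The core idea of the decoupling constraint described in the abstract can be sketched as a per-axis reweighting of the reprojection residual. The sketch below is an illustrative assumption, not the paper's actual implementation: the function name, the pinhole projection, and the weight values (`w_vertical`, `w_horizontal`) are all hypothetical, chosen only to show how ground features could receive a larger vertical and smaller horizontal residual weight.

```python
import numpy as np

def weighted_reprojection_residual(obs_uv, point_w, K, R, t, is_ground,
                                   w_horizontal=0.5, w_vertical=2.0):
    """Hypothetical sketch of a ground-decoupled reprojection residual.

    Ground features get an anisotropic weighting: the vertical (v)
    component of the residual is up-weighted and the horizontal (u)
    component down-weighted. Non-ground features keep the standard
    isotropic residual.
    """
    # Transform the 3-D world point into the camera frame.
    p_cam = R @ point_w + t
    # Pinhole projection with intrinsics K (perspective division).
    uv_proj = (K @ p_cam)[:2] / p_cam[2]
    residual = obs_uv - uv_proj
    if is_ground:
        # Decoupling constraint: separate weights for the two image axes.
        W = np.diag([w_horizontal, w_vertical])
        residual = W @ residual
    return residual
```

In a full system this residual would feed the bundle-adjustment cost in the tracking and local mapping threads, with `is_ground` supplied by the segmentation network's per-feature label.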
Source journal: IEEE Sensors Journal (Engineering & Technology - Engineering: Electronic & Electrical)
CiteScore: 7.70
Self-citation rate: 14.00%
Articles per year: 2058
Review time: 5.2 months
Journal scope: The fields of interest of the IEEE Sensors Journal are the theory, design, fabrication, manufacturing, and applications of devices for sensing and transducing physical, chemical, and biological phenomena, with emphasis on the electronics and physics aspects of sensors and integrated sensor-actuators. IEEE Sensors Journal deals with the following:
- Sensor Phenomenology, Modelling, and Evaluation
- Sensor Materials, Processing, and Fabrication
- Chemical and Gas Sensors
- Microfluidics and Biosensors
- Optical Sensors
- Physical Sensors: Temperature, Mechanical, Magnetic, and others
- Acoustic and Ultrasonic Sensors
- Sensor Packaging
- Sensor Networks
- Sensor Applications
- Sensor Systems: Signals, Processing, and Interfaces
- Actuators and Sensor Power Systems
- Sensor Signal Processing for high precision and stability (amplification, filtering, linearization, modulation/demodulation) and under harsh conditions (EMC, radiation, humidity, temperature); energy consumption/harvesting
- Sensor Data Processing (soft computing with sensor data, e.g., pattern recognition, machine learning, evolutionary computation; sensor data fusion; processing of wave data, e.g., electromagnetic and acoustic, and non-wave data, e.g., chemical, gravity, particle, thermal, radiative and non-radiative; detection, estimation, and classification based on sensor data)
- Sensors in Industrial Practice