Accurate On-line Extrinsic Calibration for a Multi-camera SLAM System

O. F. Ince, Jun-Sik Kim
{"title":"Accurate On-line Extrinsic Calibration for a Multi-camera SLAM System","authors":"O. F. Ince, Jun-Sik Kim","doi":"10.1109/UR49135.2020.9144877","DOIUrl":null,"url":null,"abstract":"Simultaneous localization and mapping (SLAM) system has an important role in providing an accurate and comprehensive solution for situational awareness in unknown environments. In order to maximize the situational awareness, the wider field of view is required. It is possible to achieve a wide field of view with an omnidirectional lense or multiple perspective cameras. However, calibration of such systems is sensitive and difficult. For this reason, we present a practical solution to a multi-camera SLAM system. The goal of this study is to obtain robust localization and mapping for multi-camera setup without requiring pre-calibration of the camera system calibration. With this goal, we associate measurements from cameras with their relative poses and propose an iterative optimization method to refine the map, keyframe poses and relative poses between cameras simultaneously. We evaluated our method on a dataset which consists of three cameras with small overlapping regions, and on the KITTI odometry dataset which is set in stereo configuration. The experiments demonstrated that the proposed method provides not only a practical but also robust SLAM solution for multi-camera systems.","PeriodicalId":360208,"journal":{"name":"2020 17th International Conference on Ubiquitous Robots (UR)","volume":"73 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 17th International Conference on Ubiquitous Robots (UR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/UR49135.2020.9144877","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 4

Abstract

A simultaneous localization and mapping (SLAM) system plays an important role in providing an accurate and comprehensive solution for situational awareness in unknown environments. To maximize situational awareness, a wide field of view is required. A wide field of view can be achieved with an omnidirectional lens or with multiple perspective cameras; however, calibrating such systems is delicate and difficult. For this reason, we present a practical solution for a multi-camera SLAM system. The goal of this study is to obtain robust localization and mapping for a multi-camera setup without requiring pre-calibration of the camera system. To this end, we associate measurements from the cameras with their relative poses and propose an iterative optimization method that simultaneously refines the map, the keyframe poses, and the relative poses between cameras. We evaluated our method on a dataset consisting of three cameras with small overlapping regions, and on the KITTI odometry dataset, which uses a stereo configuration. The experiments demonstrate that the proposed method provides not only a practical but also a robust SLAM solution for multi-camera systems.
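
The abstract describes one joint optimization that couples three groups of variables, map points, keyframe poses, and inter-camera extrinsics, through reprojection error. The sketch below illustrates how such a joint refinement can be posed as a single nonlinear least-squares problem using SciPy's `least_squares`. The pinhole model, the rotation-vector parameterization, the synthetic two-camera rig, and all names (`project`, `residuals`, `unpack`) are illustrative assumptions, not the paper's implementation; camera 0 is pinned to the rig frame to remove one gauge freedom between the rig poses and the extrinsics.

```python
# A minimal sketch of a joint refinement over keyframe (rig) poses, per-camera
# extrinsics, and map points, coupled through reprojection error. Everything
# here (pinhole model, rotation-vector parameterization, synthetic rig, and
# all function names) is an illustrative assumption, not the paper's code.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

FX = FY = 500.0   # assumed known focal lengths (px)
CX = CY = 320.0   # assumed known principal point (px)

def project(pt_w, rig6, cam6):
    """World point -> pixel: world->rig pose, then rig->camera extrinsic."""
    p = Rotation.from_rotvec(rig6[:3]).apply(pt_w) + rig6[3:]
    p = Rotation.from_rotvec(cam6[:3]).apply(p) + cam6[3:]
    return np.array([FX * p[0] / p[2] + CX, FY * p[1] / p[2] + CY])

def unpack(x, nf, nc, npts):
    """Split the flat parameter vector; camera 0 is pinned to the rig frame."""
    rig = x[:6 * nf].reshape(nf, 6)
    cam = np.vstack([np.zeros(6),
                     x[6 * nf:6 * (nf + nc - 1)].reshape(nc - 1, 6)])
    pts = x[6 * (nf + nc - 1):].reshape(npts, 3)
    return rig, cam, pts

def residuals(x, obs, nf, nc, npts):
    """Stacked reprojection errors over (frame, camera, point, pixel) tuples."""
    rig, cam, pts = unpack(x, nf, nc, npts)
    return np.concatenate([project(pts[p], rig[f], cam[c]) - uv
                           for f, c, p, uv in obs])

# Synthetic rig: 2 keyframes, 2 cameras, 6 map points in front of the rig.
nf, nc, npts = 2, 2, 6
rng = np.random.default_rng(1)
rig_gt = rng.normal(0.0, 0.1, (nf, 6))
cam_gt = np.vstack([np.zeros(6), [0.0, 0.05, 0.0, 0.2, 0.0, 0.0]])
pts_gt = rng.uniform([-1.0, -1.0, 4.0], [1.0, 1.0, 8.0], (npts, 3))
obs = [(f, c, p, project(pts_gt[p], rig_gt[f], cam_gt[c]))
       for f in range(nf) for c in range(nc) for p in range(npts)]

# Start from a perturbed guess and refine all three variable groups at once.
x_gt = np.concatenate([rig_gt.ravel(), cam_gt[1:].ravel(), pts_gt.ravel()])
x0 = x_gt + rng.normal(0.0, 0.01, x_gt.size)
sol = least_squares(residuals, x0, args=(obs, nf, nc, npts))
print("RMS reprojection error: %.3e px" % np.sqrt(np.mean(sol.fun ** 2)))
```

In the paper's setting this kind of refinement would run online inside the SLAM back-end over a growing set of keyframes; the sketch only shows how measurements from multiple cameras can constrain the extrinsics and the map within one least-squares problem.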