DC-VINS: Dynamic Camera Visual Inertial Navigation System with Online Calibration

Jason Rebello, Chunshang Li, Steven L. Waslander
2021 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW), October 2021. DOI: 10.1109/ICCVW54120.2021.00289

Abstract

Visual-inertial (VI) sensor combinations are becoming ubiquitous in a variety of autonomous driving and aerial navigation applications due to their low cost, limited power consumption, and complementary sensing capabilities. However, current VI sensor configurations assume a static rigid transformation between the camera and IMU, precluding manipulation of the camera viewpoint independent of IMU movement, which is important in situations with uneven feature distribution and for high-rate dynamic motions. Gimbal-stabilized cameras, as seen on most commercially available drones, have seen limited use in SLAM due to the inability to resolve the time-varying extrinsic calibration between the IMU and camera needed for tight sensor fusion. In this paper, we present online extrinsic calibration between a dynamic camera mounted on an actuated mechanism and an IMU mounted on the body of the vehicle, integrated into a visual odometry pipeline. In addition, we provide a degeneracy analysis of the calibration parameters, leading to a novel parameterization of the actuated mechanism used in the calibration. We build our calibration into the VINS-Fusion package and show that we are able to accurately recover the calibration parameters online while manipulating the viewpoint of the camera toward feature-rich areas, achieving an average RMSE of 0.26 m over an average trajectory length of 340 m, 31.45% lower than a traditional visual-inertial pipeline with a static camera.
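The core idea the abstract describes, a time-varying camera-to-IMU extrinsic through an actuated mechanism, can be illustrated as a kinematic chain: a static transform from the IMU to the mechanism base, an actuated joint rotation driven by the measured gimbal angle, and a static transform from the joint to the camera. The sketch below is illustrative only and not the paper's actual parameterization (the paper derives a degeneracy-aware one); the single z-axis joint, the function names, and the example offsets are assumptions for the example.

```python
import math

def rot_z(theta):
    """4x4 homogeneous rotation about the (assumed) z joint axis."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0, 0],
            [s,  c, 0, 0],
            [0,  0, 1, 0],
            [0,  0, 0, 1]]

def translate(x, y, z):
    """4x4 homogeneous translation."""
    return [[1, 0, 0, x],
            [0, 1, 0, y],
            [0, 0, 1, z],
            [0, 0, 0, 1]]

def matmul(A, B):
    """Compose two 4x4 homogeneous transforms."""
    return [[sum(A[i][k] * B[k][j] for k in range(4))
             for j in range(4)] for i in range(4)]

def camera_to_imu(theta, T_imu_base, T_joint_cam):
    """Time-varying extrinsic: static IMU-to-base offset, actuated
    joint rotation at angle theta, static joint-to-camera offset.
    Only T_imu_base and T_joint_cam need to be calibrated online;
    theta comes from the gimbal encoder at each timestep."""
    return matmul(matmul(T_imu_base, rot_z(theta)), T_joint_cam)

# Example: 10 cm base offset along x, 5 cm camera offset along the
# joint axis, gimbal rotated 90 degrees.
T = camera_to_imu(math.pi / 2, translate(0.1, 0, 0), translate(0, 0, 0.05))
```

Because the joint rotation is about z, the camera's z-offset is unchanged by the actuation, so the resulting translation is (0.1, 0, 0.05); re-evaluating `camera_to_imu` at each new encoder reading yields the per-frame extrinsic a tightly coupled estimator would consume.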