{"title":"Video-based realtime IMU-camera calibration for robot navigation","authors":"Arne Petersen, R. Koch","doi":"10.1117/12.924066","DOIUrl":null,"url":null,"abstract":"This paper introduces a new method for fast calibration of inertial measurement units (IMU) with cameras being rigidly \ncoupled. That is, the relative rotation and translation between the IMU and the camera is estimated, allowing for the \ntransfer of IMU data to the cameras coordinate frame. Moreover, the IMUs nuisance parameters (biases and scales) and \nthe horizontal alignment of the initial camera frame are determined. Since an iterated Kalman Filter is used for estimation, \ninformation on the estimations precision is also available. Such calibrations are crucial for IMU-aided visual robot \nnavigation, i.e. SLAM, since wrong calibrations cause biases and drifts in the estimated position and orientation. As the \nestimation is performed in realtime, the calibration can be done using a freehand movement and the estimated parameters \ncan be validated just in time. This provides the opportunity of optimizing the used trajectory online, increasing the quality \nand minimizing the time effort for calibration. Except for a marker pattern, used for visual tracking, no additional hardware \nis required. \nAs will be shown, the system is capable of estimating the calibration within a short period of time. Depending on \nthe requested precision trajectories of 30 seconds to a few minutes are sufficient. This allows for calibrating the system \nat startup. By this, deviations in the calibration due to transport and storage can be compensated. The estimation quality \nand consistency are evaluated in dependency of the traveled trajectories and the amount of IMU-camera displacement and \nrotation misalignment. It is analyzed, how different types of visual markers, i.e. 2- and 3-dimensional patterns, effect the \nestimation. Moreover, the method is applied to mono and stereo vision systems, providing information on the applicability \nto robot systems. The algorithm is implemented using a modular software framework, such that it can be adopted to altered \nconditions easily.","PeriodicalId":369288,"journal":{"name":"Real-Time Image and Video Processing","volume":"34 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2012-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Real-Time Image and Video Processing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1117/12.924066","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 8
Abstract
This paper introduces a new method for the fast calibration of inertial measurement units (IMUs) that are rigidly coupled to cameras. That is, the relative rotation and translation between the IMU and the camera are estimated, allowing IMU data to be transferred into the camera's coordinate frame. Moreover, the IMU's nuisance parameters (biases and scales) and the horizontal alignment of the initial camera frame are determined. Since an iterated Kalman filter is used for estimation, information on the estimation's precision is also available. Such calibrations are crucial for IMU-aided visual robot navigation, e.g. SLAM, since a wrong calibration causes biases and drifts in the estimated position and orientation. As the estimation is performed in real time, the calibration can be done with a freehand movement and the estimated parameters can be validated just in time. This provides the opportunity to optimize the trajectory online, increasing the quality and minimizing the time needed for calibration. Except for a marker pattern used for visual tracking, no additional hardware is required.
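To make the sensor model described above concrete, the short Python/NumPy sketch below shows one plausible reading of "transferring IMU data to the camera's coordinate frame": a raw gyroscope sample is corrected with the estimated per-axis scales and biases and then rotated by the calibrated IMU-to-camera rotation. The function name, the exact error model, and the argument layout are illustrative assumptions, not taken from the paper.

    import numpy as np

    def imu_to_camera_rate(omega_raw, scale, bias, R_ci):
        """Correct a raw angular-rate sample and express it in the camera frame.

        omega_raw : (3,) raw gyroscope measurement in the IMU frame
        scale     : (3,) estimated per-axis scale factors (nuisance parameters)
        bias      : (3,) estimated gyroscope biases (nuisance parameters)
        R_ci      : (3,3) estimated rotation from the IMU to the camera frame

        Assumes the common error model omega_raw = scale * omega_true + bias.
        """
        omega_imu = (omega_raw - bias) / scale  # undo estimated bias and scale
        return R_ci @ omega_imu                 # transfer to the camera frame

    # Example call with placeholder calibration values:
    omega_cam = imu_to_camera_rate(np.array([0.01, -0.02, 0.30]),
                                   scale=np.ones(3), bias=np.zeros(3),
                                   R_ci=np.eye(3))

An analogous correction would apply to accelerometer samples, with the extrinsic translation additionally entering through lever-arm effects.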
As will be shown, the system is capable of estimating the calibration within a short period of time. Depending on the requested precision, trajectories of 30 seconds to a few minutes are sufficient. This allows the system to be calibrated at startup, so that deviations in the calibration caused by transport and storage can be compensated. The estimation quality and consistency are evaluated as a function of the traveled trajectory and the amount of IMU-camera displacement and rotational misalignment. It is also analyzed how different types of visual markers, i.e. 2- and 3-dimensional patterns, affect the estimation. Moreover, the method is applied to mono and stereo vision systems, providing information on its applicability to robot systems. The algorithm is implemented within a modular software framework, such that it can easily be adapted to changed conditions.
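Because the iterated Kalman filter maintains a covariance, the "validated just in time" idea can be read as a simple online convergence test: stop the freehand motion once the marginal standard deviations of the extrinsic parameters fall below the requested precision. The sketch below is a hedged illustration of that test; the state layout, index slices, and tolerance values are assumptions, not the paper's actual criteria.

    import numpy as np

    def calibration_converged(P, rot_idx, trans_idx,
                              rot_tol_rad=0.002, trans_tol_m=0.005):
        """Return True once extrinsic uncertainty drops below the tolerances.

        P         : full filter covariance matrix
        rot_idx   : indices of the IMU-camera rotation error in the state
        trans_idx : indices of the IMU-camera translation in the state
        """
        std = np.sqrt(np.diag(P))  # marginal standard deviations per state entry
        return bool(np.all(std[rot_idx] < rot_tol_rad)
                    and np.all(std[trans_idx] < trans_tol_m))

    # Example: a state that stacks rotation error first, then translation.
    P = np.diag([1e-6] * 3 + [1e-5] * 3)
    done = calibration_converged(P, rot_idx=slice(0, 3), trans_idx=slice(3, 6))

Exposing such a check to the operator is what enables the online trajectory optimization mentioned above: the rig is moved until the test passes, rather than for a fixed duration.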