{"title":"Camera-odometer calibration and fusion using graph based optimization","authors":"Yijia He, Yue Guo, Aixue Ye, Kui Yuan","doi":"10.1109/ROBIO.2017.8324650","DOIUrl":null,"url":null,"abstract":"Monocular visual odometry (vo) estimates the camera motion only up to a scale which is prone to localization failure when the light is changing. The wheel encoders can provide metric information and accurate local localization. Fusing camera information with wheel odometer data is a good way to estimate robot motion. In such methods, calibrating camera-odometer extrinsic parameters and fusing sensor information to perform localization are key problems. We solve these problems by transforming the wheel odometry measurement to the camera frame that can construct a factor-graph edge between every two keyframes. By building factor graph, we can use graph-based optimization technology to estimate cameraodometer extrinsic parameters and fuse sensor information to estimate robot motion. We also derive the covariance matrix of the wheel odometry edges which is important when using graph-based optimization. Simulation experiments are used to validate the extrinsic calibration. For real-world experiments, we use our method to fuse the semi-direct visual odometry (SVO) with wheel encoder data, and the results show the fusion approach is effective.","PeriodicalId":197159,"journal":{"name":"2017 IEEE International Conference on Robotics and Biomimetics (ROBIO)","volume":"33 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 IEEE International Conference on Robotics and Biomimetics (ROBIO)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ROBIO.2017.8324650","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 10
Abstract
Monocular visual odometry (VO) estimates camera motion only up to scale and is prone to localization failure under changing illumination. Wheel encoders, by contrast, provide metric information and accurate local localization. Fusing camera information with wheel odometer data is therefore an effective way to estimate robot motion. In such methods, calibrating the camera-odometer extrinsic parameters and fusing the sensor information for localization are the key problems. We solve both by transforming the wheel odometry measurements into the camera frame, which allows us to construct a factor-graph edge between every two keyframes. By building this factor graph, we can use graph-based optimization to estimate the camera-odometer extrinsic parameters and to fuse the sensor information for robot motion estimation. We also derive the covariance matrix of the wheel odometry edges, which is important when using graph-based optimization. Simulation experiments validate the extrinsic calibration. For real-world experiments, we use our method to fuse semi-direct visual odometry (SVO) with wheel encoder data, and the results show that the fusion approach is effective.
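As a rough illustration of the idea described in the abstract, the sketch below treats each wheel-odometry increment between two keyframes as a relative-pose constraint, maps it through an unknown camera-odometer extrinsic, and estimates that extrinsic by nonlinear least squares. This is not the authors' implementation: the planar SE(2) simplification, the frame conventions, and all names (se2, T_oc, etc.) are assumptions, and the camera edges are assumed to be metrically scaled and noise-free for clarity.

```python
# Minimal sketch (assumed, not the paper's code): camera-odometer extrinsic
# calibration from matched relative-pose edges, solved with least squares.
import numpy as np
from scipy.optimize import least_squares

def se2(theta, x, y):
    """Homogeneous 3x3 matrix for a planar pose (rotation theta, translation x, y)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1.0]])

def log_se2(T):
    """Minimal (x, y, theta) parameterization of a planar pose matrix."""
    return np.array([T[0, 2], T[1, 2], np.arctan2(T[1, 0], T[0, 0])])

def residuals(params, odom_edges, cam_edges):
    """Per-edge residuals r_k = log( (T_oc^-1 * O_k * T_oc)^-1 * C_k ), stacked.

    params     : candidate extrinsic (x, y, yaw) between odometer and camera frames
    odom_edges : 3x3 relative poses between keyframes from wheel odometry
    cam_edges  : 3x3 relative poses between keyframes from visual odometry
    """
    x, y, yaw = params
    T_oc = se2(yaw, x, y)
    T_co = np.linalg.inv(T_oc)
    res = []
    for O_k, C_k in zip(odom_edges, cam_edges):
        predicted = T_co @ O_k @ T_oc      # odometry increment expressed in the camera frame
        res.append(log_se2(np.linalg.inv(predicted) @ C_k))
    return np.concatenate(res)

if __name__ == "__main__":
    # Simulated data: a ground-truth extrinsic and keyframe-to-keyframe motions.
    T_true = se2(0.3, 0.10, -0.05)
    rng = np.random.default_rng(0)
    odom_edges, cam_edges = [], []
    for _ in range(50):
        O_k = se2(rng.uniform(-0.4, 0.4), rng.uniform(0.0, 0.2), rng.uniform(-0.05, 0.05))
        C_k = np.linalg.inv(T_true) @ O_k @ T_true   # noise-free camera edge for illustration
        odom_edges.append(O_k)
        cam_edges.append(C_k)

    sol = least_squares(residuals, x0=np.zeros(3), args=(odom_edges, cam_edges))
    print("estimated extrinsic (x, y, yaw):", sol.x)  # expected near (0.10, -0.05, 0.3)
```

In the paper's full formulation the same kind of edge would enter a factor graph together with the visual constraints, weighted by the derived wheel-odometry covariance; the sketch above only shows the geometric relation that such an edge encodes.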