Chained fusion of discrete and continuous epipolar geometry with odometry for long-term localization of mobile robots

David Tick, Jinglin Shen, Yinghua Zhang, N. Gans
DOI: 10.1109/CCA.2011.6044426
Published in: 2011 IEEE International Conference on Control Applications (CCA), 13 October 2011
Citations: 5

Abstract

This paper presents a sensor fusion implementation that improves the accuracy of robot localization by combining multiple visual odometry approaches with wheel and IMU odometry. Discrete and continuous homography matrices are used to recover robot pose and velocity from image sequences of tracked feature points. The camera's limited field of view is addressed by chaining vision-based motion estimates: as feature points leave the field of view, new features are acquired and tracked, and when a new set of points is needed, the motion estimate is reinitialized and chained to the previous state estimate. An extended Kalman filter fuses measurements from the robot's wheel encoders with those from the visual and inertial measurement systems. Time-varying matrices in the extended Kalman filter compensate for known changes in sensor accuracy, including periods when visual features cannot be reliably tracked. Experiments are performed to validate the approach.
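The chaining idea in the abstract — composing each segment's relative motion estimate onto the previous absolute pose whenever the feature set is reinitialized — can be sketched as SE(2) transform composition. This is a minimal illustration, not the paper's implementation; the segment values are hypothetical.

```python
import numpy as np

def pose_to_T(x, y, theta):
    """Homogeneous SE(2) transform for a planar pose (x, y, theta)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])

def chain(T_world_prev, T_prev_curr):
    """Compose the previous absolute pose with a fresh relative estimate."""
    return T_world_prev @ T_prev_curr

# Robot starts at the origin; each visual-odometry segment yields a
# relative motion, estimated until its feature points leave the view.
T = pose_to_T(0.0, 0.0, 0.0)
segments = [pose_to_T(1.0, 0.0, np.pi / 2),   # drive 1 m, turn 90 degrees
            pose_to_T(2.0, 0.0, 0.0)]         # drive 2 m along the new heading
for T_rel in segments:
    T = chain(T, T_rel)

x, y = T[0, 2], T[1, 2]
theta = np.arctan2(T[1, 0], T[0, 0])
# The composed pose is (1, 2) with heading pi/2: the second segment's
# translation is rotated into the frame left by the first segment.
```

Because composition multiplies transforms, errors in each segment accumulate into the absolute pose — which is why the paper fuses the chained visual estimate with wheel and IMU odometry rather than trusting it alone.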
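The time-varying matrices mentioned in the abstract can be read as scheduling the measurement-noise covariance R in the Kalman update: when visual features cannot be tracked, inflating R drives the gain for the vision channel toward zero so the filter effectively ignores it. A minimal sketch with a hypothetical two-state model (the state layout and noise values are assumptions, not the paper's):

```python
import numpy as np

def ekf_update(x, P, z, H, R):
    """Kalman measurement update for a linear measurement model z = H x + v."""
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Hypothetical state [position, velocity]; vision measures position only.
x = np.array([0.0, 1.0])
P = np.diag([1.0, 1.0])
H = np.array([[1.0, 0.0]])

R_tracking = np.array([[0.01]])   # features tracked reliably: trust vision
R_lost = np.array([[1e6]])        # tracking lost: inflate noise, gain -> 0

z = np.array([0.5])               # vision-based position measurement
x_good, _ = ekf_update(x, P, z, H, R_tracking)
x_lost, _ = ekf_update(x, P, z, H, R_lost)
# With R_tracking the position estimate moves most of the way to z;
# with R_lost it barely moves, so wheel/IMU odometry dominates.
```

The same mechanism covers graceful degradation of any sensor whose accuracy varies in a known way; the filter structure never changes, only the covariance schedule.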