A Fast Feature Tracking Algorithm for Visual Odometry and Mapping Based on RGB-D Sensors
Bruno M. F. Silva, L. Gonçalves
2014 27th SIBGRAPI Conference on Graphics, Patterns and Images
Published: 2014-08-26
DOI: 10.1109/SIBGRAPI.2014.13
Citations: 6
Abstract
The recent introduction of low-cost sensors such as the Kinect allows the design of real-time applications (e.g., in Robotics) that exploit novel capabilities. One such application is Visual Odometry, a fundamental module of any robotic platform that uses the synchronized color/depth streams captured by these devices to build a map representation of the environment at the same time that the robot is localized within the map. Aiming to minimize the error accumulation inherent to the process of robot localization, we design a visual feature tracker that works as the front-end of a Visual Odometry system for RGB-D sensors. Feature points are added to the tracker selectively, based on pre-specified criteria such as the number of currently active points and their spatial distribution throughout the image. Our proposal is a tracking strategy that allows real-time camera pose computation (24.847 ms per frame on average) even though no specialized hardware (such as modern GPUs) is employed. Experiments carried out on publicly available benchmark datasets demonstrate the usefulness of the method, which achieved RMSE results superior to those of the state-of-the-art RGB-D SLAM algorithm.
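The selective feature admission described above can be sketched as follows. This is not the authors' implementation; it is a minimal illustration, under assumed parameters (grid size, point caps), of admitting new points only while the tracker is below capacity and the candidate's image region is not already densely covered:

```python
# Hypothetical parameters, not taken from the paper.
GRID_ROWS, GRID_COLS = 4, 4    # spatial bucketing of the image
MAX_ACTIVE_POINTS = 200        # overall tracker capacity
MAX_POINTS_PER_CELL = 20       # per-cell cap enforcing spatial spread

def cell_of(x, y, width, height):
    """Map a pixel coordinate to its grid-cell index."""
    col = min(int(x * GRID_COLS / width), GRID_COLS - 1)
    row = min(int(y * GRID_ROWS / height), GRID_ROWS - 1)
    return row * GRID_COLS + col

def admit_features(active, candidates, width, height):
    """Return the subset of candidate points accepted by the tracker.

    A candidate is accepted while the total active-point count stays
    below MAX_ACTIVE_POINTS and its grid cell is not yet saturated.
    """
    counts = [0] * (GRID_ROWS * GRID_COLS)
    for (x, y) in active:
        counts[cell_of(x, y, width, height)] += 1
    accepted = []
    for (x, y) in candidates:
        if len(active) + len(accepted) >= MAX_ACTIVE_POINTS:
            break  # tracker full
        c = cell_of(x, y, width, height)
        if counts[c] < MAX_POINTS_PER_CELL:
            counts[c] += 1
            accepted.append((x, y))
    return accepted
```

In a real front-end, `candidates` would come from a keypoint detector run on the current color frame, and rejected detections are simply discarded; the per-cell cap keeps tracked points spread over the image, which tends to stabilize pose estimation.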