[POSTER] Fusion of Vision and Inertial Sensing for Accurate and Efficient Pose Tracking on Smartphones
Xin Yang, Xun Si, Tangli Xue, K. Cheng
2015 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), published 2015-09-29
DOI: 10.1109/ISMAR.2015.23 (https://doi.org/10.1109/ISMAR.2015.23)
Citations: 5
Abstract
This paper aims at accurate and efficient pose tracking of planar targets on modern smartphones. Existing methods rely on either visual features or motion sensing from built-in inertial sensors: the former are too computationally expensive to run in real time on a smartphone, while the latter are too noisy to achieve sufficient tracking accuracy. In this paper we present a hybrid tracking method that achieves real-time performance with high accuracy. Built on the framework of a state-of-the-art visual feature tracking algorithm [5], which ensures accurate and reliable pose tracking, the proposed hybrid method significantly reduces computational cost with the assistance of the phone's built-in inertial sensors. However, noise in the inertial sensors and abrupt errors in feature tracking caused by severe motion blur can destabilize the hybrid tracking system. To address this problem, we employ an adaptive Kalman filter with abrupt-error detection to robustly fuse the inertial and feature tracking results. We evaluated the proposed method on a dataset of 16 video clips with synchronized inertial sensing data. Experimental results demonstrate our method's superior performance and accuracy on smartphones compared to a state-of-the-art vision tracking method [5]. The dataset will be made publicly available with the publication of this paper.
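The fusion scheme the abstract describes can be illustrated with a minimal sketch: a Kalman filter in which the inertial sensor drives the prediction step, the vision tracker supplies the measurement, and an innovation gate detects abrupt vision errors (e.g. from motion blur) and adaptively inflates the measurement noise. This is a hypothetical one-dimensional toy (a single pose angle), not the authors' implementation; the class name, parameters, and gate threshold are all assumptions for illustration.

```python
class AdaptiveKalman1D:
    """Toy 1-D inertial/vision fusion sketch (illustrative only).

    The state is a single pose angle: gyro rate integrates it forward
    (predict), the vision tracker corrects it (update), and a chi-square
    innovation gate flags abrupt vision errors and inflates R adaptively.
    """

    def __init__(self, q=1e-4, r=1e-2, gate=9.0, inflate=100.0):
        self.x = 0.0          # pose estimate (rad)
        self.p = 1.0          # estimate variance
        self.q = q            # process noise (inertial drift per step)
        self.r = r            # nominal vision measurement noise
        self.gate = gate      # ~3-sigma chi-square gate for 1 dof
        self.inflate = inflate  # R inflation factor on abrupt errors

    def predict(self, gyro_rate, dt):
        # Propagate the pose with the integrated inertial rate.
        self.x += gyro_rate * dt
        self.p += self.q

    def update(self, vision_pose):
        innovation = vision_pose - self.x
        s = self.p + self.r
        if innovation ** 2 / s > self.gate:
            # Abrupt vision error detected: rather than rejecting the
            # sample outright, distrust it by inflating R.
            s = self.p + self.r * self.inflate
        k = self.p / s        # Kalman gain
        self.x += k * innovation
        self.p *= (1.0 - k)
        return self.x
```

With good vision samples the estimate converges quickly; a single outlier (say, a blurred frame reporting a wildly wrong pose) trips the gate and barely moves the estimate, instead of yanking it, which is the instability the paper's adaptive filter is designed to suppress.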