{"title":"Demonstration Paper: Adaptive Ego-Motion Tracking Using Visual-Inertial Sensors for Wearable Blind Navigation","authors":"Hongsheng He, Jindong Tan","doi":"10.1145/2668883.2669590","DOIUrl":null,"url":null,"abstract":"This paper presents an ego-motion tracking method using visual-inertial sensors to assist the visually impaired and blind (VIB) people to travel in unknown dynamic environments. We focus on the ego-motion tracking functionality to inform the wearers of their relative position with respect to the environment. A traveled trajectory is recovered by concatenating the transformation estimated from visual correspondences and instantaneous movements captured by inertial sensors. Therefore, we introduce an adaptive mechanism to judge the reliability of visual tracking by comparing the estimated rotation with the gyroscopic measurement. The measuring frequencies of visual and inertial sensors are different because of different physical sampling rates and the introduced adaptive mechanism. We adopt the multi-rate extended Kalman filter (EKF) to fuse the visual estimation and inertial measurement. In the experiment, we wear the navigation system to follow a path in an indoor environment, and the results show the effectiveness and precision of the proposed methods in ego-motion tracking.","PeriodicalId":185800,"journal":{"name":"Proceedings of the Wireless Health 2014 on National Institutes of Health","volume":"63 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-10-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the Wireless Health 2014 on National Institutes of Health","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2668883.2669590","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3
Abstract
This paper presents an ego-motion tracking method that uses visual-inertial sensors to help visually impaired and blind (VIB) people travel in unknown dynamic environments. We focus on the ego-motion tracking functionality, which informs wearers of their position relative to the environment. The traveled trajectory is recovered by concatenating the transformations estimated from visual correspondences with the instantaneous movements captured by the inertial sensors. Because visual tracking can be unreliable, we introduce an adaptive mechanism that judges its reliability by comparing the estimated rotation with the gyroscopic measurement. The visual and inertial measurements arrive at different rates, owing both to the sensors' different physical sampling rates and to the adaptive mechanism, so we adopt a multi-rate extended Kalman filter (EKF) to fuse the visual estimates with the inertial measurements. In the experiment, a wearer equipped with the navigation system follows a path in an indoor environment, and the results demonstrate the effectiveness and precision of the proposed method for ego-motion tracking.
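As a rough illustration of the adaptive reliability check and the trajectory concatenation described in the abstract, the sketch below compares a visual rotation estimate against the rotation integrated from gyroscope readings over the same interval and chains accepted steps into a pose. The function names, the Rodrigues-based integration, and the acceptance threshold are illustrative assumptions, not the authors' implementation or their EKF formulation.

```python
import numpy as np

def gyro_rotation(omegas, dt):
    """Integrate gyroscope angular rates (rad/s, shape Nx3) into a rotation matrix."""
    R = np.eye(3)
    for w in omegas:
        angle = np.linalg.norm(w) * dt
        if angle < 1e-12:
            continue
        axis = w / np.linalg.norm(w)
        K = np.array([[0, -axis[2], axis[1]],
                      [axis[2], 0, -axis[0]],
                      [-axis[1], axis[0], 0]])
        # Rodrigues' formula for the incremental rotation over one sample period
        R = R @ (np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K))
    return R

def rotation_angle_between(R1, R2):
    """Geodesic angle (rad) between two rotation matrices."""
    cos_theta = (np.trace(R1.T @ R2) - 1.0) / 2.0
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

def visual_estimate_is_reliable(R_visual, omegas, dt, threshold_rad=0.05):
    """Accept the visual rotation only if it agrees with the gyro-integrated
    rotation within a threshold (the threshold value is an assumption)."""
    R_gyro = gyro_rotation(omegas, dt)
    return rotation_angle_between(R_visual, R_gyro) < threshold_rad

def concatenate_step(pose_R, pose_t, step_R, step_t):
    """Chain one trusted relative transform (step_R, step_t) onto the current pose."""
    return pose_R @ step_R, pose_R @ step_t + pose_t
```

In this reading, visual steps that fail the gyro-consistency test would be skipped, leaving the inertial measurements (fused at their own rate in the multi-rate EKF) to carry the motion estimate over that interval.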