{"title":"Visual-aided Two-dimensional Pedestrian Indoor Navigation with a Smartphone","authors":"L. Ruotsalainen, H. Kuusniemi, Ruizhi Chen","doi":"10.5081/JGPS.10.1.11","DOIUrl":null,"url":null,"abstract":"Indoor pedestrian positioning sets severe challenges for a navigation system. To be applicable for pedestrian navigation the platform used has to be small in size and reasonably priced. Smartphones fulfill these requirements satisfyingly. GNSS signals are degraded indoors and in order to obtain accurate navigation aiding from other sensors is needed. Self-contained sensors provide valuable information about the motion of the pedestrian and when integrated with GNSS measurements a position solution is typically obtainable indoors. The accuracy is however decreased due to errors in the measurements of the self-contained sensors introduced by various environmental disturbances. When the effect of the disturbance is constrained using visualaiding the accuracy can be increased to an acceptable level. This paper introduces a visual-aided twodimensional indoor pedestrian navigation system integrating measurements from GNSS, Bluetooth, WLAN, self-contained sensors, and heading change information obtained from consecutive images. The integration is performed with an Extended Kalman filter. Reliability information of the heading change measurements calculated from images using vanishing points is provided to the filter and utilized in the integration. The visual-aiding algorithm is computationally lightweight taking into account the restricted resources of the smartphone. In the conducted experiment, the accuracy of the position solution is increased by 1.2 meters due to the visual-aiding.","PeriodicalId":237555,"journal":{"name":"Journal of Global Positioning Systems","volume":"115 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2011-06-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"39","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Global Positioning Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5081/JGPS.10.1.11","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 39
Abstract
Indoor pedestrian positioning poses severe challenges for a navigation system. To be applicable to pedestrian navigation, the platform used has to be small in size and reasonably priced; smartphones fulfill these requirements satisfactorily. GNSS signals are degraded indoors, and aiding from other sensors is needed to obtain accurate navigation. Self-contained sensors provide valuable information about the motion of the pedestrian, and when they are integrated with GNSS measurements a position solution is typically obtainable indoors. The accuracy is, however, decreased by errors in the self-contained sensor measurements introduced by various environmental disturbances. When the effect of these disturbances is constrained using visual aiding, the accuracy can be increased to an acceptable level. This paper introduces a visual-aided two-dimensional indoor pedestrian navigation system integrating measurements from GNSS, Bluetooth, WLAN, and self-contained sensors with heading change information obtained from consecutive images. The integration is performed with an Extended Kalman filter. Reliability information for the heading change measurements, calculated from images using vanishing points, is provided to the filter and utilized in the integration. The visual-aiding algorithm is computationally lightweight, taking into account the restricted resources of the smartphone. In the conducted experiment, the accuracy of the position solution is improved by 1.2 meters due to the visual aiding.
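The abstract describes reliability-weighted fusion of image-derived heading changes in an Extended Kalman filter. As a minimal sketch of that principle only, and not the authors' filter, the example below fuses a gyro-derived heading change with a vanishing-point heading change in one dimension, inflating the visual measurement noise when its reliability score is low; the function name, parameters, and numeric values are assumptions for illustration.

```python
# Illustrative sketch, not the paper's implementation: reliability-weighted
# fusion of a gyro-derived heading change with a visual (vanishing-point)
# heading change. All names and values are assumptions.
import math


def fuse_heading_change(gyro_delta, gyro_var,
                        visual_delta, visual_base_var, reliability):
    """Scalar Kalman-style update of the heading change over one step.

    gyro_delta      -- heading change predicted from self-contained sensors (rad)
    gyro_var        -- variance of that prediction (rad^2)
    visual_delta    -- heading change measured between consecutive images (rad)
    visual_base_var -- nominal variance of the visual measurement (rad^2)
    reliability     -- score in (0, 1]; low values mean poor vanishing-point geometry
    """
    # Inflate the visual measurement variance when its reliability is low,
    # so the filter trusts unreliable visual estimates less.
    r = visual_base_var / max(reliability, 1e-3)

    # Standard Kalman gain and update for a one-dimensional state.
    k = gyro_var / (gyro_var + r)
    fused_delta = gyro_delta + k * (visual_delta - gyro_delta)
    fused_var = (1.0 - k) * gyro_var
    return fused_delta, fused_var


if __name__ == "__main__":
    # Gyro suggests ~4 degrees of turn, the images suggest ~6 degrees
    # with fairly good vanishing-point reliability.
    delta, var = fuse_heading_change(math.radians(4.0), 0.02,
                                     math.radians(6.0), 0.01, reliability=0.8)
    print(math.degrees(delta), var)
```

In the full system described by the paper the state is multidimensional and the update runs inside an Extended Kalman filter, but the same idea applies: the reliability score computed from the vanishing-point geometry scales how strongly the visual heading change corrects the dead-reckoned estimate.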