{"title":"一种面向视障用户的智能楼宇定位改进方法","authors":"P. T. Mahida, S. Shahrestani, Hon Cheung","doi":"10.1109/iCIOTRP48773.2019.00010","DOIUrl":null,"url":null,"abstract":"Pedestrians have a variety of tools that can assist them in travelling, including maps, kiosks, and signage. However, these facilities are inaccessible to visually impaired users. Moreover, voice aided feature with Global Positioning System (GPS) cannot be adopted in indoor applications due to signal strength attenuation and multipath effects. Internet of Things (IoT) has become a backbone for such navigation applications that can assist in locating a user within IoT equipped smart buildings. Despite the growing use of Wi-Fi and beacon technologies, smartphones are uniquely positioned to be a critical part of a localization solution. The popularity of smartphone is increased by the diverse array of microelectromechanical (MEMS) inertial sensors. This paper discusses the adaptive distance estimation algorithm for visually impaired people. It represents the use of a smartphone in a situation where external proximity sensors fail to share location information to supplement an indoor navigation system in dark areas. An improved fusion algorithm is presented that adapts the walking style of a user detecting right turns and headings. The proposed fusion algorithm depends on inertial sensors to detect the relative position of the moving user with the absolute initial position using ibeacon. Tests were carried out to determine the accuracy of steps travelled, orientation and heading information for a user holding a smartphone. Our approach estimates heading and orientation from 3-axis inertial sensors (gyroscope, accelerometer and magnetometer) have shown accuracy more than 95%. The positioning root-mean-square error (RMSE) calculation results have demonstrated that the hybrid fusion algorithm can achieve a real-time positioning and reduce the error of indoor positioning.","PeriodicalId":220655,"journal":{"name":"2019 International Conference on Internet of Things Research and Practice (iCIOTRP)","volume":"23 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"An Improved Positioning Method in a Smart Building for Visually Impaired Users\",\"authors\":\"P. T. Mahida, S. Shahrestani, Hon Cheung\",\"doi\":\"10.1109/iCIOTRP48773.2019.00010\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Pedestrians have a variety of tools that can assist them in travelling, including maps, kiosks, and signage. However, these facilities are inaccessible to visually impaired users. Moreover, voice aided feature with Global Positioning System (GPS) cannot be adopted in indoor applications due to signal strength attenuation and multipath effects. Internet of Things (IoT) has become a backbone for such navigation applications that can assist in locating a user within IoT equipped smart buildings. Despite the growing use of Wi-Fi and beacon technologies, smartphones are uniquely positioned to be a critical part of a localization solution. The popularity of smartphone is increased by the diverse array of microelectromechanical (MEMS) inertial sensors. This paper discusses the adaptive distance estimation algorithm for visually impaired people. 
It represents the use of a smartphone in a situation where external proximity sensors fail to share location information to supplement an indoor navigation system in dark areas. An improved fusion algorithm is presented that adapts the walking style of a user detecting right turns and headings. The proposed fusion algorithm depends on inertial sensors to detect the relative position of the moving user with the absolute initial position using ibeacon. Tests were carried out to determine the accuracy of steps travelled, orientation and heading information for a user holding a smartphone. Our approach estimates heading and orientation from 3-axis inertial sensors (gyroscope, accelerometer and magnetometer) have shown accuracy more than 95%. The positioning root-mean-square error (RMSE) calculation results have demonstrated that the hybrid fusion algorithm can achieve a real-time positioning and reduce the error of indoor positioning.\",\"PeriodicalId\":220655,\"journal\":{\"name\":\"2019 International Conference on Internet of Things Research and Practice (iCIOTRP)\",\"volume\":\"23 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-11-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 International Conference on Internet of Things Research and Practice (iCIOTRP)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/iCIOTRP48773.2019.00010\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 International Conference on Internet of Things Research and Practice (iCIOTRP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/iCIOTRP48773.2019.00010","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
An Improved Positioning Method in a Smart Building for Visually Impaired Users
Pedestrians have a variety of tools that can assist them in travelling, including maps, kiosks, and signage. However, these facilities are inaccessible to visually impaired users. Moreover, voice-aided navigation based on the Global Positioning System (GPS) cannot be used indoors because of signal attenuation and multipath effects. The Internet of Things (IoT) has become a backbone for navigation applications that can locate a user within IoT-equipped smart buildings. Despite the growing use of Wi-Fi and beacon technologies, smartphones are uniquely positioned to be a critical part of a localization solution, and their diverse array of microelectromechanical (MEMS) inertial sensors has further increased their popularity. This paper discusses an adaptive distance estimation algorithm for visually impaired people. It demonstrates the use of a smartphone in situations where external proximity sensors cannot share location information, supplementing an indoor navigation system in dark areas. An improved fusion algorithm is presented that adapts to the user's walking style while detecting right turns and headings. The proposed fusion algorithm relies on inertial sensors to track the relative position of the moving user with respect to an absolute initial position obtained from an iBeacon. Tests were carried out to determine the accuracy of the step count, orientation, and heading information for a user holding a smartphone. Heading and orientation estimated from the 3-axis inertial sensors (gyroscope, accelerometer, and magnetometer) achieved an accuracy of more than 95%. The positioning root-mean-square error (RMSE) results demonstrate that the hybrid fusion algorithm can achieve real-time positioning and reduce indoor positioning error.
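To make the described pipeline concrete, the following is a minimal sketch, not the authors' implementation: it seeds pedestrian dead reckoning with an absolute starting fix (as an iBeacon would provide), derives a heading from accelerometer and magnetometer samples using the standard tilt-compensated compass formulation as a stand-in for the paper's gyroscope/accelerometer/magnetometer fusion, and evaluates the track with a positioning RMSE. The function names, the fixed step length, and the example walk are all assumptions; the paper's adaptive step-length estimation and step detection are deliberately simplified here.

```python
# Hypothetical sketch of iBeacon-seeded pedestrian dead reckoning (PDR)
# with tilt-compensated heading and RMSE evaluation. Not the paper's code.
import numpy as np

def tilt_compensated_heading(accel, mag):
    """Heading (radians) from 3-axis accelerometer and magnetometer samples.

    accel, mag: length-3 arrays in the phone frame. This is the standard
    tilt-compensated compass formulation, used as a simplified stand-in for
    the gyro/accel/magnetometer fusion described in the paper.
    """
    ax, ay, az = accel / np.linalg.norm(accel)
    pitch = np.arcsin(-ax)
    roll = np.arctan2(ay, az)
    mx, my, mz = mag
    # Rotate the magnetic field vector into the horizontal plane.
    xh = (mx * np.cos(pitch)
          + my * np.sin(pitch) * np.sin(roll)
          + mz * np.sin(pitch) * np.cos(roll))
    yh = my * np.cos(roll) - mz * np.sin(roll)
    return np.arctan2(-yh, xh) % (2 * np.pi)

def pdr_track(start_xy, headings, step_length=0.7):
    """Dead-reckon 2-D positions from an absolute start fix and per-step headings.

    start_xy: absolute (x, y) of the first position, e.g. from an iBeacon.
    headings: heading in radians (0 = +x axis) for each detected step.
    step_length: assumed constant step length in metres; the paper adapts
    this to the user's walking style, which is omitted here for brevity.
    """
    pos = [np.asarray(start_xy, dtype=float)]
    for h in headings:
        pos.append(pos[-1] + step_length * np.array([np.cos(h), np.sin(h)]))
    return np.array(pos)

def positioning_rmse(estimated, ground_truth):
    """Root-mean-square error between estimated and true 2-D positions."""
    err = np.linalg.norm(estimated - ground_truth, axis=1)
    return float(np.sqrt(np.mean(err ** 2)))

if __name__ == "__main__":
    # Hypothetical walk: 5 steps along +x, then a right turn and 5 steps along -y.
    headings = [0.0] * 5 + [-np.pi / 2] * 5
    est = pdr_track(start_xy=(2.0, 3.0), headings=headings)
    truth = est + np.random.default_rng(0).normal(0.0, 0.1, est.shape)
    print("Positioning RMSE (m):", positioning_rmse(est, truth))
```

In this kind of hybrid scheme, the beacon supplies only the absolute anchor while the inertial sensors carry the relative track between fixes, so the RMSE mainly reflects accumulated heading and step-length error between beacon updates.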