{"title":"Indoor Navigation System Using Acoustic Cuing for Visually Impaired People","authors":"Guojun Yang, J. Saniie","doi":"10.1109/EIT.2018.8500192","DOIUrl":null,"url":null,"abstract":"Guiding visually impaired people autonomously is a challenging task. One of the major challenges is interfacing with visually impaired people efficiently. Sighted people are able to perceive information through their visions. Therefore, most human-computer interfaces are graphic based. These interfaces provide no help for visually impaired users. Existing UIs (User Interface) designed for visually impaired people are inadequate in terms of efficiency. A typical UI designed for visually impaired people helps its users using verbal cues. To be more specific, through a series of descriptive language, all information is passed to its users sequentially. In some applications, visually impaired users need to be updated constantly with information of their surroundings. Descriptive verbal cues might be insufficient when surroundings of users are constantly changing. In this paper, an UI using acoustic cues will be introduced. Unlike sequential verbal cues, acoustic cues can update its users constantly, therefore more adaptive to changing environments. To demonstrate the advantage of such acoustic cuing UI, an indoor object localization system was built. Regardless the complexity of the scenes, the proposed UI can always update its users in real-time using acoustic cues. The rules for generating such cues will be presented. Finally, the effectiveness of the UI will be discussed.","PeriodicalId":188414,"journal":{"name":"2018 IEEE International Conference on Electro/Information Technology (EIT)","volume":"112 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-05-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 IEEE International Conference on Electro/Information Technology (EIT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/EIT.2018.8500192","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
Guiding visually impaired people autonomously is a challenging task, and one of the major challenges is interfacing with them efficiently. Sighted people perceive most information through vision, so most human-computer interfaces are graphics based; such interfaces offer no help to visually impaired users. Existing user interfaces (UIs) designed for visually impaired people are inadequate in terms of efficiency. A typical UI of this kind assists its users with verbal cues: all information is conveyed sequentially through descriptive language. In some applications, however, visually impaired users must be updated constantly with information about their surroundings, and descriptive verbal cues may be insufficient when those surroundings are constantly changing. This paper introduces a UI based on acoustic cues. Unlike sequential verbal cues, acoustic cues can update users continuously and are therefore better suited to changing environments. To demonstrate the advantage of such an acoustic cuing UI, an indoor object localization system was built. Regardless of the complexity of the scene, the proposed UI updates its users in real time using acoustic cues. The rules for generating these cues are presented, and the effectiveness of the UI is discussed.
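The abstract does not spell out the cue-generation rules themselves. As a minimal sketch only, the snippet below assumes a hypothetical mapping in which an object's azimuth relative to the user controls stereo panning and its distance controls pitch and loudness; the function name `acoustic_cue` and all parameter choices are illustrative, not taken from the paper.

```python
import numpy as np

def acoustic_cue(azimuth_deg, distance_m, sample_rate=44100, duration=0.2):
    """Synthesize a short stereo tone encoding an object's location.

    Hypothetical cue rules (assumptions, not the paper's):
      - azimuth_deg in [-90, +90] maps to full-left .. full-right panning
      - nearer objects sound higher pitched and louder
    Returns an (n_samples, 2) float array in [-1, 1].
    """
    t = np.linspace(0, duration, int(sample_rate * duration), endpoint=False)
    # Map distance to pitch: nearer objects sound higher (hypothetical rule).
    freq = np.clip(1200.0 / max(distance_m, 0.1), 200.0, 2000.0)
    tone = np.sin(2 * np.pi * freq * t)
    # Amplitude falls off with distance so closer objects sound louder.
    tone *= 1.0 / (1.0 + distance_m)
    # Constant-power panning: angle 0 .. pi/2 sweeps from left to right.
    pan = np.deg2rad(azimuth_deg + 90.0) / 2.0
    left = tone * np.cos(pan)
    right = tone * np.sin(pan)
    return np.stack([left, right], axis=1)

# Example: cue for an object 2 m away, 30 degrees to the user's right.
cue = acoustic_cue(azimuth_deg=30.0, distance_m=2.0)
```

Constant-power panning keeps the cue's perceived loudness roughly uniform as it sweeps from left to right, which matters if users rely on level differences to judge bearing; a real-time system would regenerate and play such cues as the tracked object positions change.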