{"title":"A Mobile Phone Wayfinding System for Visually Impaired Users.","authors":"J Coughlan, R Manduchi","doi":"10.3233/978-1-60750-042-1-849","DOIUrl":"https://doi.org/10.3233/978-1-60750-042-1-849","url":null,"abstract":"In past work [1] we described a novel system for aiding blind and visually impaired persons in wayfinding – the process of finding one’s way to a destination. This system consists of a mobile phone held by the user that detects and reads specially designed printed signs (Fig. 1) using its built-in camera. Experiments have demonstrated that blind and visually impaired persons using the system can locate these signs (which are read aloud using synthetic speech) at distances of up to several meters [2]. [Fig. 1. System concept. Bottom: the system detects a wayfinding sign and provides appropriate directional guidance. Top: a building floor plan shows the placement of multiple signs, each identifying a unique location.] We have augmented our system with new functionality that provides directional guidance to a desired destination, rather than simply reading aloud every sign that is visible. Each sign identifies a unique location in the building, so the system can use these signs as a form of “indoor GPS” to determine the user’s current location. Whenever a sign is detected, the system refers to a building map (previously loaded into memory) and provides audio directions (e.g. “turn left”) to guide the user from his/her current location to any desired destination (see Fig. 1). The ID of a sign, which identifies its spatial location, is encoded by the colors it contains (the four colors in the sign can be combined to form 24 different spatial arrangements). The particular arrangement of colors in a sign is deciphered by the mobile phone during detection. Special provisions have been added with respect to the original algorithm of [1] to improve detection robustness via shape analysis. If a larger set of IDs is needed, a bar code can be added to the sign, as in [1]. Finally, we conducted experiments showing that visually impaired subjects are able to use the system to find locations on one floor of an office building.","PeriodicalId":89087,"journal":{"name":"Assistive technology research series","volume":"25 2009","pages":"849"},"PeriodicalIF":0.0,"publicationDate":"2009-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.3233/978-1-60750-042-1-849","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"29435418","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Staying in the Crosswalk: A System for Guiding Visually Impaired Pedestrians at Traffic Intersections.","authors":"V Ivanchenko, J Coughlan, H Shen","doi":"10.3233/978-1-60750-042-1-69","DOIUrl":"https://doi.org/10.3233/978-1-60750-042-1-69","url":null,"abstract":"Traffic intersections are among the most dangerous parts of a blind or visually impaired person's travel. Our \"Crosswatch\" device [4] is a handheld (mobile phone) computer vision system for orienting visually impaired pedestrians to crosswalks, to help users avoid entering the crosswalk in the wrong direction and straying outside of it. This paper describes two new developments in the Crosswatch project: (a) a new computer vision algorithm to locate the more common - but less highly visible - standard \"two-stripe\" crosswalk pattern marked by two narrow stripes along the borders of the crosswalk; and (b) 3D analysis to estimate crosswalk location relative to the user, to help him/her stay inside the crosswalk (not merely pointing in the correct direction). Experiments with blind subjects using the system demonstrate the feasibility of the approach.","PeriodicalId":89087,"journal":{"name":"Assistive technology research series","volume":"25 2009","pages":"69-73"},"PeriodicalIF":0.0,"publicationDate":"2009-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2964893/pdf/nihms159124.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"29435331","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}