Bryan Duarte, T. McDaniel, Abhik Chowdhury, Sana Gill, S. Panchanathan
Title: HaptWrap: Augmenting Non-Visual Travel via Visual-to-Tactile Mapping of Objects in Motion
Venue: Proceedings of the 2nd Workshop on Multimedia for Accessible Human Computer Interfaces
Published: 2019-10-15
DOI: 10.1145/3347319.3356835 (https://doi.org/10.1145/3347319.3356835)
Citations: 4
Abstract
Access to real-time situational information at a distance, including the relative position and motion of surrounding objects, is essential for an individual to travel safely and independently. For blind and low vision travelers, access to critical environmental information is unattainable if it is positioned beyond the reach of their preferred mobility aid or outside their path of travel. Due to its cost and versatility, and the dynamic information that can be gathered through its use, the long white cane remains the most widely used mobility aid for non-visual travelers. Physical characteristics such as texture, slope, and position can be identified with the long white cane, but only when the traveler is in close proximity to an object. In this work, we introduce a wearable technology to augment non-visual travel methods by communicating spatial information at a distance. We propose a vibrotactile device, the HaptWrap, equipped with vibration motors capable of communicating an object's position relative to the user's orientation, as well as its relative variations in position as the object moves about the user. An experiment supports the use of haptics to represent objects in motion around an individual as a substitute modality for vision.
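The core mapping the abstract describes, an object's bearing relative to the user's orientation driving a ring of vibration motors, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the motor count, the evenly spaced layout, the maximum sensing range, and the linear intensity falloff are all assumptions introduced here.

```python
# Hypothetical sketch of a visual-to-tactile mapping for a HaptWrap-style
# device. Motor layout, range limit, and intensity model are assumptions.

NUM_MOTORS = 8      # motors assumed evenly spaced around the wrap; motor 0 faces forward
MAX_RANGE_M = 10.0  # assumed maximum distance at which an object is signaled

def select_motor(object_bearing_deg: float, user_heading_deg: float) -> int:
    """Return the index of the motor closest to the object's bearing
    relative to the user's current orientation."""
    relative = (object_bearing_deg - user_heading_deg) % 360.0
    sector = 360.0 / NUM_MOTORS
    # Round to the nearest motor sector, wrapping around past the last motor.
    return int((relative + sector / 2) // sector) % NUM_MOTORS

def intensity(distance_m: float) -> float:
    """Scale vibration intensity (0.0 to 1.0) linearly with proximity:
    stronger when the object is closer, silent beyond MAX_RANGE_M."""
    if distance_m >= MAX_RANGE_M:
        return 0.0
    return 1.0 - distance_m / MAX_RANGE_M
```

As a tracked object circles the user, the active motor index shifts around the wrap, which is one plausible way to convey "relative variations in position as the object moves about the user."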