Exploring Aural Navigation by Screenless Access
Mikaylah Gross, Joe Dara, Christopher Meyer, D. Bolchini
Proceedings of the Internet of Accessible Things, 2018-04-23
DOI: 10.1145/3192714.3192815

Abstract:
When people who are blind or visually impaired navigate the mobile web, they have to hold a phone in their hands at all times. Such continuous, two-handed interaction on a small screen hampers the user's ability to keep their hands free to control aids (e.g., a cane) or touch nearby objects, especially on the go. In this paper, we introduce screenless access: a browsing approach that enables users to interact touch-free with aural navigation architectures using one-handed, in-air gestures recognized by an off-the-shelf armband. In a study with ten participants who are blind or visually impaired, we observed proficient navigation performance, users' conceptual fit with a screen-free paradigm, and low levels of cognitive load. Our findings model the errors users made due to limits of the proposed design and system, uncover the navigation styles participants adopted, and illustrate unprompted gesture adaptations that participants enacted effectively to appropriate the technology. User feedback revealed insights into the potential and limitations of screenless navigation to support convenience while traveling, in work contexts, and in privacy-preserving scenarios, as well as concerns about gestures that may be socially conspicuous.
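To make the interaction model concrete, the following is a minimal sketch of how in-air gestures from an armband might drive an aural (audio-only) navigation loop. The gesture names, menu structure, and speak callback are illustrative assumptions; the paper does not publish its implementation or API.

```python
# Minimal sketch of a screenless, gesture-driven aural navigation loop.
# Gesture names and the menu below are illustrative assumptions, not the
# authors' implementation.

from dataclasses import dataclass
from typing import Callable, List


@dataclass
class AuralMenu:
    """A flat list of items the user traverses by ear, one at a time."""
    items: List[str]
    index: int = 0

    def next(self) -> str:
        self.index = (self.index + 1) % len(self.items)
        return self.items[self.index]

    def prev(self) -> str:
        self.index = (self.index - 1) % len(self.items)
        return self.items[self.index]

    def current(self) -> str:
        return self.items[self.index]


def handle_gesture(gesture: str, menu: AuralMenu,
                   speak: Callable[[str], None]) -> None:
    """Translate a one-handed, in-air gesture into an aural navigation action."""
    if gesture == "flick_right":    # advance to the next item
        speak(menu.next())
    elif gesture == "flick_left":   # go back to the previous item
        speak(menu.prev())
    elif gesture == "fist":         # select / activate the current item
        speak(f"Selected: {menu.current()}")
    else:                           # unrecognized gesture: re-announce context
        speak(menu.current())


if __name__ == "__main__":
    menu = AuralMenu(["Home", "News", "Contacts", "Settings"])
    # Stand-ins for the armband's event stream and a text-to-speech engine.
    for g in ["flick_right", "flick_right", "flick_left", "fist"]:
        handle_gesture(g, menu, speak=print)
```

The key property the sketch illustrates is that every user action gets an immediate spoken response, so the dialogue is carried entirely by audio and one hand, with no screen or touch target involved.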