{"title":"Touchplates: low-cost tactile overlays for visually impaired touch screen users","authors":"Shaun K. Kane, M. Morris, J. Wobbrock","doi":"10.1145/2513383.2513442","DOIUrl":"https://doi.org/10.1145/2513383.2513442","url":null,"abstract":"Adding tactile feedback to touch screens can improve their accessibility to blind users, but prior approaches to integrating tactile feedback with touch screens have either offered limited functionality or required extensive (and typically expensive) customization of the hardware. We introduce touchplates: carefully designed physical guides that are overlaid on the screen, recognized by the underlying application, and provide tactile feedback for touch screens. Unlike prior approaches, touchplates are implemented with simple plastics and standard touch screen software, making them versatile and inexpensive. Touchplates may be customized to suit individual users and applications, and may be produced on a laser cutter, 3D printer, or made by hand. We describe the design and implementation of touchplates, a \"starter kit\" of touchplates, and feedback from a formative evaluation with 9 people with visual impairments. Touchplates provide a low-cost, adaptable, and accessible method of adding tactile feedback to touch screen interfaces.","PeriodicalId":378932,"journal":{"name":"Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130831304","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Toward accessible technology for music composers and producers with motor disabilities","authors":"Adam J. Sporka, Ben L. Carson, Paul Nauert, S. Kurniawan","doi":"10.1145/2513383.2513412","DOIUrl":"https://doi.org/10.1145/2513383.2513412","url":null,"abstract":"In an initial user study, three motor-impaired musicians---a composer with a degenerative motor neuron disease, a guitarist who suffered a stroke, and a first-year college student with impaired finger movement---identified prospective areas of research in assistive technology. Participants in the study made use of a range of technologies to adapt conventional software to their needs, and identified practical limitations and challenges in those adaptations, including suggestions for novel and intuitive interfaces, optimized control-surface layouts, and repurposing opportunities in text-input techniques.","PeriodicalId":378932,"journal":{"name":"Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility","volume":"75 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131432241","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Social platform for sharing accessibility information among people with disabilities: evaluation of a field assessment","authors":"Takahiro Miura, Ken-ichiro Yabu, M. Sakajiri, Mari Ueda, Junya Suzuki, Atsushi Hiyama, M. Hirose, T. Ifukube","doi":"10.1145/2513383.2513391","DOIUrl":"https://doi.org/10.1145/2513383.2513391","url":null,"abstract":"Accessibility information can allow disabled people to identify suitable pathways to reach their destinations, but it is difficult to obtain new accessible pathway information rapidly because of limited local information disclosure. Thus, it is necessary to develop a comprehensive system that acquires barrier-free information from various sources and makes that information available in an intuitive form. In this study, we aimed to develop a social platform to obtain and present appropriate information depending on the user's situation, such as the user's disabilities and location, and to share the barrier-free information provided by other users.","PeriodicalId":378932,"journal":{"name":"Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132079645","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Mixed local and remote participation in teleconferences from a deaf and hard of hearing perspective","authors":"Christian Vogler, P. Tucker, Norman Williams","doi":"10.1145/2513383.2517035","DOIUrl":"https://doi.org/10.1145/2513383.2517035","url":null,"abstract":"In this experience report we describe the accessibility challenges that deaf and hard of hearing committee members faced while collaborating with a larger group of hearing committee members over a period of 2½ years. We explain the recurring problems, how audio-only conferences fall short even when relay services and interpreters are available, and how we devised a videoconferencing setup using FuzeMeeting to minimize the accessibility barriers. We also describe best practices, lessons learned, and pitfalls to avoid in deploying this type of setup.","PeriodicalId":378932,"journal":{"name":"Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility","volume":"44 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133806965","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"SlideType: universal design text-entry interface for touchscreen devices","authors":"Xiao Xiong, J. Sanford","doi":"10.1145/2513383.2513429","DOIUrl":"https://doi.org/10.1145/2513383.2513429","url":null,"abstract":"In this work, we present SlideType, an intuitive text-entry system for touchscreen devices that combines touch and gestural input with visual and auditory output, designed to be usable by as many people as possible, including those with vision, dexterity, and cognition impairments. Nine participants used SlideType, without training, to enter a name such as \"John Smith\". Overall, participants were able to complete the typing and editing tasks. We report results from this usability study.","PeriodicalId":378932,"journal":{"name":"Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123974345","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Blind guidance system using situation information and activity-based instruction","authors":"Eunjeong Ko","doi":"10.1145/2513383.2517037","DOIUrl":"https://doi.org/10.1145/2513383.2517037","url":null,"abstract":"This study presents a situation-based wayfinding system for the visually impaired. The goal of our system is to guide visually impaired people to and from destinations of their choice. The proposed system was implemented on an iPhone 4, which has an embedded camera and inertial sensors. To assess its effectiveness, the system was tested with 4 participants, and the results confirmed its feasibility as a wayfinding aid for the visually impaired.","PeriodicalId":378932,"journal":{"name":"Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility","volume":"101 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129044609","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A system helping blind people to get character information in their surrounding environment","authors":"N. Ohnishi, T. Matsumoto, H. Kudo, Y. Takeuchi","doi":"10.1145/2513383.2513389","DOIUrl":"https://doi.org/10.1145/2513383.2513389","url":null,"abstract":"We propose a system that helps blind people obtain character information in their surrounding environment, such as merchandise information (name, price, and best-before/use-by date) and restaurant menus (item names and prices). The system consists of a computer, a wireless camera/scanner, and an earphone. It processes images captured or scanned by the user and extracts character regions in each image using a Support Vector Machine (SVM). Applying Optical Character Recognition (OCR) to the extracted regions, the system outputs the character information as synthesized speech.","PeriodicalId":378932,"journal":{"name":"Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility","volume":"2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123754396","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Collaborative music application for visually impaired people with tangible objects on table","authors":"S. Omori, I. Yairi","doi":"10.1145/2513383.2513403","DOIUrl":"https://doi.org/10.1145/2513383.2513403","url":null,"abstract":"Collaborative work between visually impaired and sighted people on equal ground plays a significant role in the social advancement of visually impaired people. We developed a collaborative music composition application to achieve this goal. The application has an attractive tangible interface designed to engage both visually impaired and sighted people, and multiple functions intended to induce collaborative communication among users. We conducted an experiment with six visually impaired people and six sighted people. In the experiment, the visually impaired participants were able to lead the collaborative work without hesitation, even in front of sighted people whom they did not know well. We then focused our attention on the moments in which the visually impaired participants were having fun, and discussed the factors behind their excitement.","PeriodicalId":378932,"journal":{"name":"Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility","volume":"206 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116508053","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Towards the development of haptic-based interface for teaching visually impaired arabic handwriting","authors":"Abeer S. Bayousuf, Hend Suliman Al-Khalifa, A. Al-Salman","doi":"10.1145/2513383.2513400","DOIUrl":"https://doi.org/10.1145/2513383.2513400","url":null,"abstract":"This paper introduces an initial haptic-based system for teaching the handwriting of Arabic letters to students with visual impairments. The proposed system provides full and partial guidance through haptic playback, and automatically evaluates student progress.","PeriodicalId":378932,"journal":{"name":"Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility","volume":"87 1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129635544","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Eyes-free yoga: an exergame using depth cameras for blind & low vision exercise","authors":"Kyle Rector, Cynthia L. Bennett, J. Kientz","doi":"10.1145/2513383.2513392","DOIUrl":"https://doi.org/10.1145/2513383.2513392","url":null,"abstract":"People who are blind or low vision may have a harder time participating in exercise classes due to inaccessibility, travel difficulties, or lack of experience. Exergames can encourage exercise at home and help lower the barrier to trying new activities, but there are often accessibility issues since they rely on visual feedback to help align body positions. To address this, we developed Eyes-Free Yoga, an exergame using the Microsoft Kinect that acts as a yoga instructor, teaches six yoga poses, and has customized auditory-only feedback based on skeletal tracking. We ran a controlled study with 16 people who are blind or low vision to evaluate the feasibility and feedback of Eyes-Free Yoga. We found participants enjoyed the game, and the extra auditory feedback helped their understanding of each pose. The findings of this work have implications for improving auditory-only feedback and on the design of exergames using depth cameras.","PeriodicalId":378932,"journal":{"name":"Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134312702","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}