{"title":"Session details: Accessibility Education","authors":"Aqueasha Martin-Hammond","doi":"10.1145/3254074","DOIUrl":"https://doi.org/10.1145/3254074","url":null,"abstract":"","PeriodicalId":306165,"journal":{"name":"Proceedings of the 18th International ACM SIGACCESS Conference on Computers and Accessibility","volume":"78 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117206113","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Session details: Rehabilitation and Clinical Technologies","authors":"Shaun K. Kane","doi":"10.1145/3254068","DOIUrl":"https://doi.org/10.1145/3254068","url":null,"abstract":"","PeriodicalId":306165,"journal":{"name":"Proceedings of the 18th International ACM SIGACCESS Conference on Computers and Accessibility","volume":"143 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125338123","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Customizable 3D Printed Tactile Maps as Interactive Overlays","authors":"Brandon T. Taylor, A. Dey, D. Siewiorek, A. Smailagic","doi":"10.1145/2982142.2982167","DOIUrl":"https://doi.org/10.1145/2982142.2982167","url":null,"abstract":"Though tactile maps have been shown to be useful tools for visually impaired individuals, their availability has been limited by manufacturing and design costs. In this paper, we present a system that uses 3D printing to (1) make tactile maps more affordable to produce, (2) allow visually impaired individuals to independently design and customize maps, and (3) provide interactivity using widely available mobile devices. Our system consists of three parts: a web interface, a modeling algorithm, and an interactive touchscreen application. Our web interface, hosted at www.tactilemaps.net, allows visually impaired individuals to create maps of any location on the globe while specifying (1) what features to map, (2) how the features should be represented by textures, and (3) where to place markers and labels. Our modeling algorithm accommodates user specifications to create map models with (1) multiple layers of continuously varying textures and (2) markers of various geometric shapes or braille characters. Our interactive application uses a novel approach to 3D printing tactile maps using conductive filament to provide touchscreen overlays that allow users to dynamically interact with the maps on a wide range of mobile devices. This paper details the implementation of our system. We also present findings from a user study validating the usability of our mapping interface and the utility of the maps produced. Finally, we discuss the limitations of our current implementation and the plans we have to improve our system based on feedback from our user study and additional interviews.","PeriodicalId":306165,"journal":{"name":"Proceedings of the 18th International ACM SIGACCESS Conference on Computers and Accessibility","volume":"118 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122626299","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Gesture-based Interaction for Individuals with Developmental Disabilities in India","authors":"Sumita Sharma, Saurabh Srivastava, K. Achary, Blessin Varkey, T. Heimonen, Jaakko Hakulinen, M. Turunen, Nitendra Rajput","doi":"10.1145/2982142.2982166","DOIUrl":"https://doi.org/10.1145/2982142.2982166","url":null,"abstract":"Gesture-based interaction provides a multitude of benefits to individuals with disabilities, for example, enhancing social, motor and cognitive skills. However, applications that encourage self-efficacy by promoting a life-skill through simulations of real world scenarios are largely missing. We explore the benefits of using a gesture-based application for individuals with developmental disabilities. The context is a special school in New Delhi, Nai Disha, where we designed and developed an application, Kirana, that integrates arithmetic and social interaction to teach purchasing of items from a local grocery store. In our study, 18 participants with developmental disabilities, previously unable to visit a grocery store, used Kirana for three weeks. Our results indicate that gesture-based applications can teach a life skill and enable self-efficacy for individuals with developmental disabilities by breaking down complex tasks that require social, mathematical and decision-making skills.","PeriodicalId":306165,"journal":{"name":"Proceedings of the 18th International ACM SIGACCESS Conference on Computers and Accessibility","volume":"41 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116678795","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Accessibility is Becoming Mainstream","authors":"R. Ladner","doi":"10.1145/2982142.2982180","DOIUrl":"https://doi.org/10.1145/2982142.2982180","url":null,"abstract":"Since 1976, when California State University Northridge (CSUN) began its Annual International Technology and Persons with Disabilities Conference, there have been specialized conferences with an accessibility theme. The first ACM ASSETS Conference was held in 1994 when 22 papers were presented. The Rehabilitation Engineering and Assistive Technology Society of North America (RESNA) began its conference in 1979. The first biennial International Conference on Computers Helping People with Special Needs (under a different name) was held in 1988. Accessibility focused journals have existed since at least 1986. This history demonstrates that accessibility has grown into a separate field in research and practice. While this is true, more and more, accessibility has become mainstream. The mainstreaming of accessibility can be seen in its integration into academic computing departments, HCI conferences, and conferences in supporting fields such as computer vision and natural language processing. Most importantly, accessibility can be seen in products and services provided by mainstream industry. One early example of this is the standardization of closed captioning for television. We now have built-in screen readers for iOS and Android devices and Augmentative and Alternative Communication (AAC) devices are being supplemented by lower cost AAC apps on mainstream touchscreen tablets. Technologies like video chat, personal texting, speech recognition, optical character recognition, and speech synthesis have their roots in solving accessibility problems. Slowly, accessibility is moving into the academic curriculum in computing departments [1]. Web design and development courses are starting to cover accessibility in the WCAG 2.0 and ARIA standards. There are capstone courses that focus on accessibility at several universities. There will always be a need for specialized accessibility related devices and services, but moving forward accessibility will be provided by mainstream companies and accessibility solutions will become valuable to everyone, disabled or not. Mainstream technology companies are asking for more people with disabilities to join their diverse workforces.","PeriodicalId":306165,"journal":{"name":"Proceedings of the 18th International ACM SIGACCESS Conference on Computers and Accessibility","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116746457","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An Affordable Virtual Reality Learning Framework for Children with Neuro-Developmental Disorder","authors":"M. Gelsomini","doi":"10.1145/2982142.2982143","DOIUrl":"https://doi.org/10.1145/2982142.2982143","url":null,"abstract":"Our research explores wearable Immersive Virtual Reality (IVR) to support new forms of intervention for children with Neuro-Developmental Disorder (NDD). In cooperation with therapists at a local rehabilitation center, we developed and evaluated Wildcard, a system that exploits a low cost VR visor and enables the child to \"feel immersed\" in 3D worlds, using eye focus and head/body movements to interact with the virtual items. Wildcard includes a set of functionalities for the therapists to monitor children's interaction, to customize the virtual space for the specific needs of each subject, and to automatically collect data.","PeriodicalId":306165,"journal":{"name":"Proceedings of the 18th International ACM SIGACCESS Conference on Computers and Accessibility","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124537066","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Cost of Turning Heads: A Comparison of a Head-Worn Display to a Smartphone for Supporting Persons with Aphasia in Conversation","authors":"Kristin Williams, Karyn Moffatt, Jonggi Hong, Yasmeen Faroqi-Shah, Leah Findlater","doi":"10.1145/2982142.2982165","DOIUrl":"https://doi.org/10.1145/2982142.2982165","url":null,"abstract":"Current symbol-based dictionaries providing vocabulary support for persons with aphasia, a language disorder, are housed on smartphones or other portable devices. Employing the support on these external devices requires users to divert their attention away from their conversation partner, to the neglect of conversation dynamics like eye contact or verbal inflection. A prior study investigated head-worn displays (HWDs) as an alternative form factor for supporting glanceable, unobtrusive, and always-available conversation support, but it did not directly compare the HWD to a control condition. To address this limitation, we compared vocabulary support on a HWD to equivalent support on a smartphone in terms of overall experience, perceived focus, and conversational success. Lastly, we elicited critical discussion of how each device might be better designed for conversation support. Our work contributes (1) evidence that a HWD can support more efficient communication, (2) preliminary results that a HWD can provide a better overall experience using assistive vocabulary, and (3) a characterization of the design features persons with aphasia value in portable conversation support technologies. Our findings should motivate further work on head-worn conversation support for persons with aphasia.","PeriodicalId":306165,"journal":{"name":"Proceedings of the 18th International ACM SIGACCESS Conference on Computers and Accessibility","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128033445","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Demonstration: Screen Reader Support for a Complex Interactive Science Simulation","authors":"Taliesin L. Smith, C. Lewis, Emily B. Moore","doi":"10.1145/2982142.2982154","DOIUrl":"https://doi.org/10.1145/2982142.2982154","url":null,"abstract":"Interactive simulations are increasingly important in science education, yet most are inaccessible to blind learners. We demonstrate an accessible version of a simulation, Balloons and Static Electricity, that illustrates responses to key challenges in providing screen reader support: the need to describe unpredictable sequences of events, the manipulation of objects that act as both controls and displays, and the management of descriptions of changes in the state of the simulation as well as of the state of the interactive object, itself. Meeting these challenges requires extending current practices for verbal description of visual interactive content.","PeriodicalId":306165,"journal":{"name":"Proceedings of the 18th International ACM SIGACCESS Conference on Computers and Accessibility","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125632819","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An Accessible Blocks Language: Work in Progress","authors":"Varsha Koushik, C. Lewis","doi":"10.1145/2982142.2982150","DOIUrl":"https://doi.org/10.1145/2982142.2982150","url":null,"abstract":"Block languages are extensively used to introduce programming to children. They replace the complex and error prone syntax of textual languages with simple shape cues that show how program elements can be combined. In their present form, blind learners cannot use them, because they rely on graphical presentation of code, and mouse interactions. We are working on a nonvisual blocks language called Pseudospatial Blocks (PB), that supports program creation using keyboard commands with synthetic speech output. It replaces visual shape cues for language syntax, the key feature of block languages, with filtering of program elements by syntactic category.","PeriodicalId":306165,"journal":{"name":"Proceedings of the 18th International ACM SIGACCESS Conference on Computers and Accessibility","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125979717","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Real-Time Mobile Personalized Simulations of Impaired Colour Vision","authors":"Rhouri MacAlpine, David R. Flatla","doi":"10.1145/2982142.2982170","DOIUrl":"https://doi.org/10.1145/2982142.2982170","url":null,"abstract":"Colour forms an essential element of day-to-day life for most people, but at least 5% of the world's population have Impaired Colour Vision (ICV), seeing fewer colours than everyone else. Those with typical colour vision find it difficult to understand how people with ICV perceive colour, leading to misunderstanding and challenges for people with ICV. To help improve understanding, personalized simulations of ICV have been developed, but they are computationally demanding (and so limited to static images), which limits their value. To address this, we extended personalized ICV simulations to work in real time on a mobile device, giving people with typical colour vision greater freedom to explore ICV. To validate our approach, we compared our real-time simulation technique to an existing adjustable simulation technique and found general agreement between the two. We then deployed three real-time personalized ICV simulations to nine people with typical colour vision, encouraging them to take photos of interesting colour situations. In just over one week, participants recorded over 450 real-world images of situations where their simulation presented a distinct challenge for their respective ICV participant. Through a questionnaire and discussion of photos with participants, we found that our solution provides a valuable mechanism for building understanding of ICV for people with typical colour vision.","PeriodicalId":306165,"journal":{"name":"Proceedings of the 18th International ACM SIGACCESS Conference on Computers and Accessibility","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130191879","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}