{"title":"Session details: Non-Visual Access to Graphics","authors":"","doi":"10.1145/3252337","DOIUrl":"https://doi.org/10.1145/3252337","url":null,"abstract":"","PeriodicalId":237212,"journal":{"name":"Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility","volume":"110 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115752278","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"People with Parkinson's Disease Using Computers","authors":"Mia Hartikainen, S. Ovaska","doi":"10.1145/2700648.2811393","DOIUrl":"https://doi.org/10.1145/2700648.2811393","url":null,"abstract":"Parkinson--s disease is a neurological, progressive disease that affects the ability of using computer input devices. We present findings from a small-scale qualitative study on how the Finnish participants with Parkinson's disease experience their everyday computer use, and describe the solutions they have applied in their difficulties.","PeriodicalId":237212,"journal":{"name":"Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility","volume":"2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125224361","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Pilot Study about the Smartwatch as Assistive Device for Deaf People","authors":"M. Mielke, R. Brück","doi":"10.1145/2700648.2811347","DOIUrl":"https://doi.org/10.1145/2700648.2811347","url":null,"abstract":"In the last years the smartphone became an important tool for deaf and hard of hearing people. It's no wonder that many different smartphone based assistive tools were introduced recently, among them tools for environmental sound awareness. Even though smartphones seem to be a good way to implement such tools, with the smartwatch a new class of mobile computing devices became available. In this paper results from interviews with six deaf people about the use of a smartwatch as environmental sound alert are presented. The interviews showed that a smartwatch based environmental sound alert is promising as the participants were highly interested in using such a device.","PeriodicalId":237212,"journal":{"name":"Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility","volume":"54 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116963750","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Haptic Gloves Prototype for Audio-Tactile Web Browsing","authors":"Andrii Soviak","doi":"10.1145/2700648.2811329","DOIUrl":"https://doi.org/10.1145/2700648.2811329","url":null,"abstract":"Blind people rely on screen readers to interact with the Web. Since screen readers narrate the digital content serially, blind users can only form a one-dimensional mental model of the web page and, hence, cannot enjoy the benefits inherently offered by the 2-D layout; e.g., understanding the spatial relations between objects in a webpage or their locations on the screen helps navigate webpages. Haptic interfaces could provide blind people with a tactile \"feel\" for the 2-D layout and help them navigate web pages more efficiently. Haptic Displays, capable of high resolution tactile feedback, could render any webpage in a tactile form enabling blind people to exploit the aforementioned spatial relations and focus screen reading on specific parts of the webpage. In this paper, I report on the preliminary work toward the development of FeelX -- a haptic gloves system that will enable tactile web browsing. FeelX will be used alongside regular screen readers, and will provide blind screen-reader users with the ability explore web pages by touch and audio.","PeriodicalId":237212,"journal":{"name":"Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility","volume":"133 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121105605","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"SECPT Framework: A Travelling Experience Based Framework for Visually Impaired People","authors":"Jing Guan, C. Choy","doi":"10.1145/2700648.2811395","DOIUrl":"https://doi.org/10.1145/2700648.2811395","url":null,"abstract":"The right to fully participate into the community and enjoy the life is the same for people with or without a visual impairment. However, many existing research for assistive products or technology for visually impaired people are piecemeal, because there is no holistic and systemic research focus on the visually impaired people--s travelling experience. We propose the SECPT Framework which based on the visually impaired people--s travelling experience, can be a starting point of the design process for designers to encounter visually impaired people, and enable them to quick understanding the target group and determine their research area.","PeriodicalId":237212,"journal":{"name":"Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility","volume":"98 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127101808","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Video Analysis for Includification Requirements","authors":"Alexander Hofmann, H. Hlavacs","doi":"10.1145/2700648.2811398","DOIUrl":"https://doi.org/10.1145/2700648.2811398","url":null,"abstract":"In extensive trials we asked participants with disabilities to play a set of computer games, demanding a variety of cognitive and physical skills. In particular we presented a Tetris variant we developed ourselves, which follows various includification guidelines. Participants were recorded on video in order to identify strengths and weaknesses in coping with the presented game challenges. In the videos we recorded the game screen, a frontal video of the participants showing emotional responses, and eye gaze. Post processing tasks are used to analyze and retrieve data from the videos. This helps to build up a video database for later analysis and answer questions raised after the recordings were taken.","PeriodicalId":237212,"journal":{"name":"Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility","volume":"32 4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126071309","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Disability and Technology: A Critical Realist Perspective","authors":"C. Frauenberger","doi":"10.1145/2700648.2809851","DOIUrl":"https://doi.org/10.1145/2700648.2809851","url":null,"abstract":"Assistive technology (AT) as a field explores the design, use and evaluation of computing technology that aims to benefit people with disabilities. The majority of the work consequently takes the functional needs of people with disabilities as starting point and matches those with technological opportunity spaces. With this paper, we argue that the underlying philosophical position implied in this approach can be seen as reductionist as the disabled experience is arguably richer and often more complex as can be projected from the functional limitations of people. Thinkers and activists in Disability Studies have conceptualised disability in various ways and more recently, critical realism was proposed as a philosophical position through which the many different facets of the disabled experience could be incorporated. In this paper, we explore the possibility of using a critical realist perspective to guide designers in developing technology for people with disabilities and thereby aim to contribute to the philosophical underpinnings of AT. After a brief review of historical conceptualisations of disability, we introduce the critical realist argument and discuss its appeal for understanding disability and the possible roles technology can have in this context. Subsequently, we aim to translate this philosophical and moral debate into a research agenda for AT and exemplify how it can be operationalised by presenting the OutsideTheBox project as a case study.","PeriodicalId":237212,"journal":{"name":"Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122025231","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Social Media Platforms for Low-Income Blind People in India","authors":"Aditya Vashistha, Edward Cutrell, Nicola Dell, Richard J. Anderson","doi":"10.1145/2700648.2809858","DOIUrl":"https://doi.org/10.1145/2700648.2809858","url":null,"abstract":"We present the first analysis of the use and non-use of social media platforms by low-income blind users in rural and peri-urban India. Using a mixed-methods approach of semi-structured interviews and observations, we examine the benefits received by low-income blind people from Facebook, Twitter and WhatsApp and investigate constraints that impede their social media participation. We also present a detailed analysis of how low-income blind people used a voice-based social media platform deployed in India that received significant traction from low-income people in rural and peri-urban areas. In eleven-weeks of deployment, fifty-three blind participants in our sample collectively placed 4784 voice calls, contributed 1312 voice messages, cast 33,909 votes and listened to the messages 46,090 times. Using a mixed-methods analysis of call logs, qualitative interviews, and phone surveys, we evaluate the strengths and weaknesses of the platform and benefits it offered to low-income blind people.","PeriodicalId":237212,"journal":{"name":"Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128770901","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An Enhanced Electrolarynx with Automatic Fundamental Frequency Control based on Statistical Prediction","authors":"Kou Tanaka, T. Toda, Graham Neubig, S. Sakti, Satoshi Nakamura","doi":"10.1145/2700648.2811340","DOIUrl":"https://doi.org/10.1145/2700648.2811340","url":null,"abstract":"An electrolarynx is a type of speaking aid device which is able to mechanically generate excitation sounds to help laryngectomees produce electrolaryngeal (EL) speech. Although EL speech is quite intelligible, its naturalness suffers from monotonous fundamental frequency patterns of the mechanical excitation sounds. To make it possible to generate more natural excitation sounds, we have proposed a method to automatically control the fundamental frequency of the sounds generated by the electrolarynx based on a statistical prediction model, which predicts the fundamental frequency patterns from the produced EL speech in real-time. In this paper, we develop a prototype system by implementing the proposed control method in an actual, physical electrolarynx and evaluate its performance.","PeriodicalId":237212,"journal":{"name":"Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility","volume":"42 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128252796","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"ChatWoz: Chatting through a Wizard of Oz","authors":"Pedro Fialho, Luísa Coheur","doi":"10.1145/2700648.2811334","DOIUrl":"https://doi.org/10.1145/2700648.2811334","url":null,"abstract":"Several cases of autistic children successfully interacting with virtual assistants such as Siri or Cortana have been recently reported. In this demo we describe ChatWoz, an application that can be used as a Wizard of Oz, to collect real data for dialogue systems, but also to allow children to interact with their caregivers through it, as it is based on a virtual agent. ChatWoz is composed of an interface controlled by the caregiver, which establishes what the agent will utter, in a synthesised voice. Several elements of the interface can be controlled, such as the agent's face emotions. In this paper we focus on the scenario of child-caregiver interaction and detail the features implemented in order to couple with it.","PeriodicalId":237212,"journal":{"name":"Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility","volume":"4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126560766","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}