A Preliminary Study on Network-based ADL Training for the Visually Impaired by Use of a Parametric-speaker Robot
Koji Kainou, H. Takizawa, Mayumi Aoyagi, N. Ezaki, S. Mizuno
Proceedings of the 8th International Symposium on Visual Information Communication and Interaction, 2015. DOI: 10.1145/2801040.2801043
Abstract
This report proposes a network-based robot for interactive training in activities of daily living (ADL) for visually impaired people. The robot has a pan-tilt head equipped with a parametric speaker and a USB camera. A sighted trainer observes the visually impaired trainee through the camera and controls the pan-tilt head over the network to communicate with the trainee. The trainer's voice is emitted from the parametric speaker and reflected off the surface of a target object. Because the trainee hears the reflected voice as if it originated from the object itself, the robot can convey the object's position. The system is applied to ADL training in which visually impaired individuals learn to use objects arranged in their living environments. The report presents the results of a user study.
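To make the remote-control path described above concrete, the sketch below shows a minimal trainer-to-robot link: the trainer's console sends pan and tilt angles over a TCP connection, and a robot-side stub applies them to the head. This is an illustrative assumption only; the host, port, message format, and function names are hypothetical and are not the authors' actual protocol or implementation.

```python
# Hypothetical sketch of the network control path: a sighted trainer's console
# sends pan-tilt commands to the robot over TCP. HOST, PORT, and the JSON
# message fields are assumptions, not the paper's actual protocol.
import json
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 9000  # assumed address of the robot's control server


def robot_server() -> None:
    """Robot side: accept one trainer connection and apply pan/tilt commands."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn, conn.makefile("r") as stream:
            for line in stream:  # one JSON command per line
                cmd = json.loads(line)
                # Placeholder for driving the pan-tilt servos; a real system
                # would also steer the parametric speaker mounted on the head.
                print(f"set pan={cmd['pan']:.1f} deg, tilt={cmd['tilt']:.1f} deg")


def trainer_client(commands) -> None:
    """Trainer side: send a sequence of (pan, tilt) angles in degrees."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.connect((HOST, PORT))
        for pan, tilt in commands:
            sock.sendall((json.dumps({"pan": pan, "tilt": tilt}) + "\n").encode())


if __name__ == "__main__":
    threading.Thread(target=robot_server, daemon=True).start()
    time.sleep(0.2)  # give the server a moment to start listening
    trainer_client([(0.0, 0.0), (15.0, -5.0), (30.0, 10.0)])
    time.sleep(0.2)
```

In the actual system the same link would also carry the trainer's voice to the parametric speaker and the camera stream back to the trainer; the sketch covers only the head-pointing commands.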