{"title":"Multichannel audio aided dynamical perception for prosthetic hand biofeedback","authors":"José González, Wenwei Yu","doi":"10.1109/ICORR.2009.5209521","DOIUrl":null,"url":null,"abstract":"Visual input is one of the prerequisites for most biofeedback studies for prosthetic hand control, since amputees lost part of their proprioception. This study explores whether it is possible to use audio feedback alone to convey more than one independent variable, without relying on visual input, to improve the learning of new perceptions; in this case artificial proprioception of a prosthetic hand. This way different features of the audio feedback can be observed to design applications capable of coupling different sensory inputs that will improve the acceptance and control of prosthetic devices. Experiments were conducted to determine whether the audio signals can be used as a multi-variable dynamical sensory substitution in reaching movements. The results showed that it is possible to use auditive feedback to create a body image without using the visual contact as a guide, thus to assist prosthetic hand users by internalizing new perceptions.","PeriodicalId":189213,"journal":{"name":"2009 IEEE International Conference on Rehabilitation Robotics","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2009-06-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2009 IEEE International Conference on Rehabilitation Robotics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICORR.2009.5209521","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 10
Abstract
Visual input is a prerequisite for most biofeedback studies of prosthetic hand control, since amputees have lost part of their proprioception. This study explores whether audio feedback alone can convey more than one independent variable, without relying on visual input, to improve the learning of new perceptions; in this case, the artificial proprioception of a prosthetic hand. In this way, different features of the audio feedback can be examined in order to design applications that couple different sensory inputs and thereby improve the acceptance and control of prosthetic devices. Experiments were conducted to determine whether audio signals can serve as a multi-variable dynamical sensory substitute during reaching movements. The results showed that auditory feedback can be used to create a body image without visual contact as a guide, and can thus assist prosthetic hand users in internalizing new perceptions.
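The core idea of conveying two independent variables through a single multichannel audio stream can be sketched as a simple sonification. The mapping below (pitch for one variable, stereo pan for the other) is an illustrative assumption, not the encoding used in the paper:

```python
import numpy as np

def sonify(var1: float, var2: float, duration: float = 0.1, sr: int = 44100) -> np.ndarray:
    """Encode two independent variables in one stereo audio buffer.

    Hypothetical mapping for illustration:
      var1 in [0, 1] -> tone pitch, 200-1000 Hz
      var2 in [0, 1] -> stereo position (equal-power pan, 0 = left, 1 = right)
    Returns an array of shape (samples, 2).
    """
    t = np.arange(int(duration * sr)) / sr
    freq = 200.0 + 800.0 * var1            # pitch carries variable 1
    tone = np.sin(2 * np.pi * freq * t)    # mono carrier tone
    left = np.sqrt(1.0 - var2) * tone      # equal-power panning carries variable 2
    right = np.sqrt(var2) * tone
    return np.stack([left, right], axis=1)

# A hand that is half open (0.5) and positioned left of center (0.25):
buf = sonify(0.5, 0.25)
```

Because the two audio features (pitch and interaural level difference) are perceptually separable, a listener can in principle track both variables at once, which is the property the study tests for reaching movements.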