Neuromorphic vision and tactile fusion for upper limb prosthesis control
Mark Hays, Luke Osborn, Rohan Ghosh, Mark Iskarous, Christopher Hunt, Nitish V Thakor
International IEEE/EMBS Conference on Neural Engineering (NER), 2019, pp. 981-984
DOI: 10.1109/ner.2019.8716890 (published 2019-03-01, online 2019-05-20)
PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8053366/pdf/nihms-1690724.pdf
Abstract
A major issue with upper limb prostheses is the disconnect between the sensory information perceived by the user and the information perceived by the prosthesis. Advances in prosthetic technology have introduced tactile feedback for monitoring grasping activity, but visual information, a vital component of the human sensory system, is still not fully utilized as a form of feedback to the prosthesis. For able-bodied individuals, many of the decisions involved in grasping or manipulating an object, such as hand orientation and aperture, are made from visual information before contact with the object. We show that including neuromorphic visual information, combined with tactile feedback, improves the ability and efficiency of both able-bodied and amputee subjects in picking up and manipulating everyday objects. We found that combining visual and tactile information in a real-time closed-loop feedback strategy generally decreased the completion time of a pick-up-and-manipulation task compared to using a single feedback modality. While the full benefit of the combined feedback was partially obscured by classification inaccuracies in the visual system, we demonstrate that this fusion of neuromorphic signals from visual and tactile sensors can provide valuable feedback to a prosthetic arm, enhancing real-time function and usability.
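To make the two-phase strategy in the abstract concrete, the following is a minimal, hypothetical sketch of how vision-driven pre-shaping (before contact) and tactile closed-loop grasping (after contact) could be combined. The paper does not publish its controller, so the VisualEstimate/MockHand/MockTactile interfaces, the proportional gain, and the force target below are illustrative assumptions, not the authors' implementation.

```python
from dataclasses import dataclass

# Illustrative sketch only: every class, threshold, and interface here is an
# assumption made for exposition, not the authors' published implementation.

@dataclass
class VisualEstimate:
    """Hypothetical output of a neuromorphic (event-based) visual classifier."""
    object_class: str
    aperture_cm: float   # suggested grip aperture before contact
    wrist_deg: float     # suggested wrist orientation before contact

class MockHand:
    """Stand-in for a prosthetic hand control interface."""
    def __init__(self):
        self.aperture_cm = 10.0
        self.wrist_deg = 0.0
    def set_aperture(self, cm: float) -> None:
        self.aperture_cm = max(cm, 0.0)
    def set_wrist(self, deg: float) -> None:
        self.wrist_deg = deg
    def adjust_aperture(self, delta_cm: float) -> None:
        self.set_aperture(self.aperture_cm + delta_cm)

class MockTactile:
    """Stand-in for fingertip tactile sensing: force rises as the hand
    closes past the (assumed) contact aperture."""
    def __init__(self, hand: MockHand, contact_cm: float = 6.0):
        self.hand, self.contact_cm = hand, contact_cm
    def read_force(self) -> float:
        return max(self.contact_cm - self.hand.aperture_cm, 0.0)

def preshape_from_vision(est: VisualEstimate, hand: MockHand) -> None:
    # Phase 1: before contact, vision sets orientation and aperture.
    hand.set_wrist(est.wrist_deg)
    hand.set_aperture(est.aperture_cm)

def grasp_with_tactile_feedback(hand: MockHand, tactile: MockTactile,
                                force_target: float = 2.0,
                                tol: float = 0.05, gain: float = 0.1) -> None:
    # Phase 2: after contact, close the loop on tactile force readings,
    # tightening (or easing off) proportionally to the force error.
    for _ in range(500):  # safety bound on control iterations
        error = force_target - tactile.read_force()
        if abs(error) <= tol:
            break
        hand.adjust_aperture(-gain * error)

if __name__ == "__main__":
    hand = MockHand()
    preshape_from_vision(VisualEstimate("mug", aperture_cm=8.0, wrist_deg=90.0), hand)
    grasp_with_tactile_feedback(hand, MockTactile(hand))
    print(f"final aperture: {hand.aperture_cm:.2f} cm")
```

Running the script pre-shapes the mock hand from a visual estimate and then converges the grip onto the tactile force target, mirroring the before-contact/after-contact split the abstract describes; a real system would replace the mocks with event-based camera and tactile sensor streams.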