Neuromorphic vision and tactile fusion for upper limb prosthesis control.

Mark Hays, Luke Osborn, Rohan Ghosh, Mark Iskarous, Christopher Hunt, Nitish V Thakor
{"title":"用于上肢假肢控制的神经形态视觉与触觉融合。","authors":"Mark Hays, Luke Osborn, Rohan Ghosh, Mark Iskarous, Christopher Hunt, Nitish V Thakor","doi":"10.1109/ner.2019.8716890","DOIUrl":null,"url":null,"abstract":"<p><p>A major issue with upper limb prostheses is the disconnect between sensory information perceived by the user and the information perceived by the prosthesis. Advances in prosthetic technology introduced tactile information for monitoring grasping activity, but visual information, a vital component in the human sensory system, is still not fully utilized as a form of feedback to the prosthesis. For able-bodied individuals, many of the decisions for grasping or manipulating an object, such as hand orientation and aperture, are made based on visual information before contact with the object. We show that inclusion of neuromorphic visual information, combined with tactile feedback, improves the ability and efficiency of both able-bodied and amputee subjects to pick up and manipulate everyday objects. We discovered that combining both visual and tactile information in a real-time closed loop feedback strategy generally decreased the completion time of a task involving picking up and manipulating objects compared to using a single modality for feedback. While the full benefit of the combined feedback was partially obscured by experimental inaccuracies of the visual classification system, we demonstrate that this fusion of neuromorphic signals from visual and tactile sensors can provide valuable feedback to a prosthetic arm for enhancing real-time function and usability.</p>","PeriodicalId":73414,"journal":{"name":"International IEEE/EMBS Conference on Neural Engineering : [proceedings]. 
International IEEE EMBS Conference on Neural Engineering","volume":"2019 ","pages":"981-984"},"PeriodicalIF":0.0000,"publicationDate":"2019-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8053366/pdf/nihms-1690724.pdf","citationCount":"0","resultStr":"{\"title\":\"Neuromorphic vision and tactile fusion for upper limb prosthesis control.\",\"authors\":\"Mark Hays, Luke Osborn, Rohan Ghosh, Mark Iskarous, Christopher Hunt, Nitish V Thakor\",\"doi\":\"10.1109/ner.2019.8716890\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>A major issue with upper limb prostheses is the disconnect between sensory information perceived by the user and the information perceived by the prosthesis. Advances in prosthetic technology introduced tactile information for monitoring grasping activity, but visual information, a vital component in the human sensory system, is still not fully utilized as a form of feedback to the prosthesis. For able-bodied individuals, many of the decisions for grasping or manipulating an object, such as hand orientation and aperture, are made based on visual information before contact with the object. We show that inclusion of neuromorphic visual information, combined with tactile feedback, improves the ability and efficiency of both able-bodied and amputee subjects to pick up and manipulate everyday objects. We discovered that combining both visual and tactile information in a real-time closed loop feedback strategy generally decreased the completion time of a task involving picking up and manipulating objects compared to using a single modality for feedback. 
While the full benefit of the combined feedback was partially obscured by experimental inaccuracies of the visual classification system, we demonstrate that this fusion of neuromorphic signals from visual and tactile sensors can provide valuable feedback to a prosthetic arm for enhancing real-time function and usability.</p>\",\"PeriodicalId\":73414,\"journal\":{\"name\":\"International IEEE/EMBS Conference on Neural Engineering : [proceedings]. International IEEE EMBS Conference on Neural Engineering\",\"volume\":\"2019 \",\"pages\":\"981-984\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-03-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8053366/pdf/nihms-1690724.pdf\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International IEEE/EMBS Conference on Neural Engineering : [proceedings]. International IEEE EMBS Conference on Neural Engineering\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ner.2019.8716890\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2019/5/20 0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International IEEE/EMBS Conference on Neural Engineering : [proceedings]. International IEEE EMBS Conference on Neural Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ner.2019.8716890","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2019/5/20 0:00:00","PubModel":"Epub","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract


A major issue with upper limb prostheses is the disconnect between sensory information perceived by the user and the information perceived by the prosthesis. Advances in prosthetic technology introduced tactile information for monitoring grasping activity, but visual information, a vital component in the human sensory system, is still not fully utilized as a form of feedback to the prosthesis. For able-bodied individuals, many of the decisions for grasping or manipulating an object, such as hand orientation and aperture, are made based on visual information before contact with the object. We show that inclusion of neuromorphic visual information, combined with tactile feedback, improves the ability and efficiency of both able-bodied and amputee subjects to pick up and manipulate everyday objects. We discovered that combining both visual and tactile information in a real-time closed-loop feedback strategy generally decreased the completion time of a task involving picking up and manipulating objects compared to using a single modality for feedback. While the full benefit of the combined feedback was partially obscured by experimental inaccuracies of the visual classification system, we demonstrate that this fusion of neuromorphic signals from visual and tactile sensors can provide valuable feedback to a prosthetic arm for enhancing real-time function and usability.
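The control idea in the abstract — vision preshapes the hand before contact, and tactile feedback takes over once contact is detected — can be sketched as a simple two-stage loop. This is a minimal illustrative sketch, not the authors' implementation: the grasp library, the contact-detection threshold, and all names here are assumptions for illustration only.

```python
# Hypothetical sketch of a vision-then-touch closed-loop grasp strategy.
# Before contact, a neuromorphic (event-based) visual classifier selects
# a hand preshape; after contact, tactile spike activity halts closure.
# All classes, thresholds, and field names are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class ProsthesisCommand:
    aperture: float      # hand opening, 0.0 (closed) to 1.0 (fully open)
    orientation: str     # grasp type, e.g. "power" or "pinch"

# Assumed mapping from a visual object class to a grasp preshape.
GRASP_LIBRARY = {
    "bottle": ProsthesisCommand(aperture=0.8, orientation="power"),
    "card": ProsthesisCommand(aperture=0.3, orientation="pinch"),
}

def control_step(visual_class, tactile_spike_rate, cmd, contact_rate=50.0):
    """One closed-loop update; returns (new_command, phase).

    visual_class: label from the event-based classifier, or None.
    tactile_spike_rate: spikes/s from a fingertip tactile sensor.
    cmd: the current ProsthesisCommand.
    contact_rate: assumed spike-rate threshold indicating contact.
    """
    if tactile_spike_rate >= contact_rate:
        # After contact, tactile feedback dominates: hold the current grasp.
        return cmd, "hold"
    if visual_class in GRASP_LIBRARY:
        # Before contact, vision preshapes the hand for the detected object.
        return GRASP_LIBRARY[visual_class], "preshape"
    return cmd, "idle"

# Example: vision preshapes for a bottle, then touch stops the closure.
cmd = ProsthesisCommand(aperture=1.0, orientation="power")
cmd, phase = control_step("bottle", 0.0, cmd)    # pre-contact: preshape
cmd, phase = control_step("bottle", 120.0, cmd)  # contact: hold grasp
```

The point of the sketch is the arbitration rule: the two modalities are not averaged, but gated by task phase, which is one plausible reading of how combining them shortens completion time relative to either modality alone.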
