High Resolution Soft Tactile Interface for Physical Human-Robot Interaction

Isabella Huang, R. Bajcsy
{"title":"用于人机物理交互的高分辨率软触觉界面","authors":"Isabella Huang, R. Bajcsy","doi":"10.1109/ICRA40945.2020.9197365","DOIUrl":null,"url":null,"abstract":"If robots and humans are to coexist and cooperate in society, it would be useful for robots to be able to engage in tactile interactions. Touch is an intuitive communication tool as well as a fundamental method by which we assist each other physically. Tactile abilities are challenging to engineer in robots, since both mechanical safety and sensory intelligence are imperative. Existing work reveals a trade-off between these principles— tactile interfaces that are high in resolution are not easily adapted to human-sized geometries, nor are they generally compliant enough to guarantee safety. On the other hand, soft tactile interfaces deliver intrinsically safe mechanical properties, but their non-linear characteristics render them difficult for use in timely sensing and control. We propose a robotic system that is equipped with a completely soft and therefore safe tactile interface that is large enough to interact with human upper limbs, while producing high resolution tactile sensory readings via depth camera imaging of the soft interface. We present and validate a data-driven model that maps point cloud data to contact forces, and verify its efficacy by demonstrating two real-world applications. In particular, the robot is able to react to a human finger’s pokes and change its pose based on the tactile input. In addition, we also demonstrate that the robot can act as an assistive device that dynamically supports and follows a human forearm from underneath.","PeriodicalId":6859,"journal":{"name":"2020 IEEE International Conference on Robotics and Automation (ICRA)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2020-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"High Resolution Soft Tactile Interface for Physical Human-Robot Interaction\",\"authors\":\"Isabella Huang, R. Bajcsy\",\"doi\":\"10.1109/ICRA40945.2020.9197365\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"If robots and humans are to coexist and cooperate in society, it would be useful for robots to be able to engage in tactile interactions. Touch is an intuitive communication tool as well as a fundamental method by which we assist each other physically. Tactile abilities are challenging to engineer in robots, since both mechanical safety and sensory intelligence are imperative. Existing work reveals a trade-off between these principles— tactile interfaces that are high in resolution are not easily adapted to human-sized geometries, nor are they generally compliant enough to guarantee safety. On the other hand, soft tactile interfaces deliver intrinsically safe mechanical properties, but their non-linear characteristics render them difficult for use in timely sensing and control. We propose a robotic system that is equipped with a completely soft and therefore safe tactile interface that is large enough to interact with human upper limbs, while producing high resolution tactile sensory readings via depth camera imaging of the soft interface. We present and validate a data-driven model that maps point cloud data to contact forces, and verify its efficacy by demonstrating two real-world applications. In particular, the robot is able to react to a human finger’s pokes and change its pose based on the tactile input. 
In addition, we also demonstrate that the robot can act as an assistive device that dynamically supports and follows a human forearm from underneath.\",\"PeriodicalId\":6859,\"journal\":{\"name\":\"2020 IEEE International Conference on Robotics and Automation (ICRA)\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-05-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 IEEE International Conference on Robotics and Automation (ICRA)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICRA40945.2020.9197365\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE International Conference on Robotics and Automation (ICRA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICRA40945.2020.9197365","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 6

Abstract

If robots and humans are to coexist and cooperate in society, it would be useful for robots to be able to engage in tactile interactions. Touch is an intuitive communication tool as well as a fundamental method by which we assist each other physically. Tactile abilities are challenging to engineer in robots, since both mechanical safety and sensory intelligence are imperative. Existing work reveals a trade-off between these principles: tactile interfaces that are high in resolution are not easily adapted to human-sized geometries, nor are they generally compliant enough to guarantee safety. On the other hand, soft tactile interfaces deliver intrinsically safe mechanical properties, but their non-linear characteristics render them difficult to use for timely sensing and control. We propose a robotic system equipped with a completely soft, and therefore safe, tactile interface that is large enough to interact with human upper limbs, while producing high resolution tactile sensory readings via depth camera imaging of the soft interface. We present and validate a data-driven model that maps point cloud data to contact forces, and verify its efficacy by demonstrating two real-world applications. In particular, the robot is able to react to a human finger's pokes and change its pose based on the tactile input. In addition, we also demonstrate that the robot can act as an assistive device that dynamically supports and follows a human forearm from underneath.