{"title":"High Resolution Soft Tactile Interface for Physical Human-Robot Interaction","authors":"Isabella Huang, R. Bajcsy","doi":"10.1109/ICRA40945.2020.9197365","DOIUrl":null,"url":null,"abstract":"If robots and humans are to coexist and cooperate in society, it would be useful for robots to be able to engage in tactile interactions. Touch is an intuitive communication tool as well as a fundamental method by which we assist each other physically. Tactile abilities are challenging to engineer in robots, since both mechanical safety and sensory intelligence are imperative. Existing work reveals a trade-off between these principles— tactile interfaces that are high in resolution are not easily adapted to human-sized geometries, nor are they generally compliant enough to guarantee safety. On the other hand, soft tactile interfaces deliver intrinsically safe mechanical properties, but their non-linear characteristics render them difficult for use in timely sensing and control. We propose a robotic system that is equipped with a completely soft and therefore safe tactile interface that is large enough to interact with human upper limbs, while producing high resolution tactile sensory readings via depth camera imaging of the soft interface. We present and validate a data-driven model that maps point cloud data to contact forces, and verify its efficacy by demonstrating two real-world applications. In particular, the robot is able to react to a human finger’s pokes and change its pose based on the tactile input. In addition, we also demonstrate that the robot can act as an assistive device that dynamically supports and follows a human forearm from underneath.","PeriodicalId":6859,"journal":{"name":"2020 IEEE International Conference on Robotics and Automation (ICRA)","volume":"21 1","pages":"1705-1711"},"PeriodicalIF":0.0000,"publicationDate":"2020-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE International Conference on Robotics and Automation (ICRA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICRA40945.2020.9197365","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 6
Abstract
If robots and humans are to coexist and cooperate in society, it would be useful for robots to be able to engage in tactile interactions. Touch is an intuitive communication tool as well as a fundamental means by which we assist each other physically. Tactile abilities are challenging to engineer in robots, since both mechanical safety and sensory intelligence are imperative. Existing work reveals a trade-off between these principles: tactile interfaces that offer high resolution are not easily adapted to human-sized geometries, nor are they generally compliant enough to guarantee safety. Soft tactile interfaces, on the other hand, deliver intrinsically safe mechanical properties, but their non-linear characteristics make them difficult to use for timely sensing and control. We propose a robotic system equipped with a completely soft, and therefore safe, tactile interface that is large enough to interact with human upper limbs, while producing high-resolution tactile sensory readings via depth-camera imaging of the soft interface. We present and validate a data-driven model that maps point cloud data to contact forces, and verify its efficacy in two real-world applications. In particular, the robot reacts to a human finger's pokes and changes its pose based on the tactile input. We also demonstrate that the robot can act as an assistive device that dynamically supports and follows a human forearm from underneath.
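To make the sensing pipeline concrete, below is a minimal sketch of a data-driven mapping from depth-camera point clouds of a deformed soft interface to a net contact force. The gridded-displacement features, the small MLP regressor, the synthetic calibration data, and all names (`deformation_features`, `GRID`) are illustrative assumptions, not the paper's actual model; the paper only specifies that a learned model maps point cloud data to contact forces.

```python
# Hedged sketch: regress contact force from point-cloud deformation features.
# Assumptions (not from the paper): z-displacement binned into a fixed grid
# as the feature vector, and scikit-learn's MLPRegressor as the learned model.
import numpy as np
from sklearn.neural_network import MLPRegressor

GRID = 16  # bin the interface surface into a GRID x GRID depth map (assumed)

def deformation_features(points, rest_points):
    """Bin z-displacement of the deformed cloud against the rest cloud.

    points, rest_points: (N, 3) arrays from the depth camera, expressed in
    the interface frame with z normal to the undeformed surface (assumed).
    """
    # Correspondence is simplified to index alignment here; a real pipeline
    # would register the clouds first (e.g., with ICP).
    dz = points[:, 2] - rest_points[:, 2]
    # Accumulate mean displacement into a fixed spatial grid over x-y.
    xi = np.clip(((points[:, 0] + 0.5) * GRID).astype(int), 0, GRID - 1)
    yi = np.clip(((points[:, 1] + 0.5) * GRID).astype(int), 0, GRID - 1)
    feat = np.zeros((GRID, GRID))
    cnt = np.zeros((GRID, GRID))
    np.add.at(feat, (xi, yi), dz)
    np.add.at(cnt, (xi, yi), 1.0)
    return (feat / np.maximum(cnt, 1.0)).ravel()

# Train on (point cloud, measured force) pairs, e.g., from a force/torque
# sensor during calibration. Synthetic stand-in data is used here.
rng = np.random.default_rng(0)
rest = rng.uniform(-0.5, 0.5, (2048, 3))
rest[:, 2] = 0.0  # undeformed surface lies in the z = 0 plane
X, y = [], []
for _ in range(200):
    press = rng.uniform(0.0, 0.01)      # indentation depth in meters
    cloud = rest.copy()
    cloud[:, 2] -= press                # uniform indentation (toy model)
    X.append(deformation_features(cloud, rest))
    y.append(press * 500.0)             # fake linear stiffness, N per m

model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0)
model.fit(np.array(X), np.array(y))

probe = rest.copy()
probe[:, 2] -= 0.005
print("predicted contact force (N):",
      model.predict([deformation_features(probe, rest)])[0])
```

A learned mapping of this kind sidesteps an explicit non-linear elasticity model of the soft interface, which is the motivation the abstract gives for a data-driven approach; the feature and model choices above are only one plausible instantiation.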