Predict Robot Grasp Outcomes based on Multi-Modal Information

Chao Yang, Peng Du, F. Sun, Bin Fang, Jie Zhou
DOI: 10.1109/ROBIO.2018.8665307
Published in: 2018 IEEE International Conference on Robotics and Biomimetics (ROBIO), December 2018
Citations: 8

Abstract

In service-robot applications, a stable grasp requires carefully balancing the contact forces against properties of the manipulated object, such as its shape and weight. Deducing whether a particular grasp will be stable from indirect measurements such as vision is therefore quite challenging, and direct sensing of contacts through tactile sensors provides an appealing avenue toward more successful and consistent robotic grasping. Beyond this, an object's shape and weight also help determine whether a grasp will be stable. In this work, we investigate whether tactile information and intrinsic object properties aid in predicting grasp outcomes within a multi-modal sensing framework that combines vision, touch, and intrinsic object properties. To that end, we collected more than 2,550 grasping trials using a three-finger robot hand fitted with multiple tactile sensors. We evaluated multi-modal deep neural network models that directly predict grasp stability from each modality individually or from combinations of modalities. Our experimental results indicate that combining visual and tactile readings with intrinsic object properties significantly improves grasp prediction performance.
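The abstract describes a late-fusion setup: each modality (vision, tactile readings, intrinsic object properties) is encoded separately, the features are combined, and a classifier predicts grasp stability. The sketch below illustrates that fusion pattern only; the feature dimensions, single-hidden-layer encoders, and random placeholder weights are assumptions for illustration and are not the paper's actual architecture or trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def encode(x, in_dim, hid_dim):
    # Per-modality encoder: one linear layer + ReLU. The random
    # weights stand in for parameters a real model would learn.
    W = rng.normal(scale=0.1, size=(in_dim, hid_dim))
    return relu(x @ W)

# Illustrative per-modality feature vectors (dimensions assumed):
visual    = rng.normal(size=(1, 128))  # e.g. image features
tactile   = rng.normal(size=(1, 64))   # multi-sensor tactile readings
intrinsic = rng.normal(size=(1, 8))    # object shape/weight descriptors

# Late fusion: encode each modality, then concatenate the features.
fused = np.concatenate(
    [encode(visual, 128, 32),
     encode(tactile, 64, 32),
     encode(intrinsic, 8, 32)],
    axis=1,
)  # shape (1, 96)

# Linear head + sigmoid -> probability that the grasp is stable.
W_out = rng.normal(scale=0.1, size=(96, 1))
p_stable = 1.0 / (1.0 + np.exp(-(fused @ W_out)))
print(fused.shape, p_stable[0, 0])
```

Dropping one of the three encoder outputs from the concatenation gives the single-modality baselines the paper compares against the fused model.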