NeuralGrasps: Learning Implicit Representations for Grasps of Multiple Robotic Hands

Ninad Khargonkar, Neil Song, Zesheng Xu, B. Prabhakaran, Yu Xiang
{"title":"NeuralGrasps: Learning Implicit Representations for Grasps of Multiple Robotic Hands","authors":"Ninad Khargonkar, Neil Song, Zesheng Xu, B. Prabhakaran, Yu Xiang","doi":"10.48550/arXiv.2207.02959","DOIUrl":null,"url":null,"abstract":"We introduce a neural implicit representation for grasps of objects from multiple robotic hands. Different grasps across multiple robotic hands are encoded into a shared latent space. Each latent vector is learned to decode to the 3D shape of an object and the 3D shape of a robotic hand in a grasping pose in terms of the signed distance functions of the two 3D shapes. In addition, the distance metric in the latent space is learned to preserve the similarity between grasps across different robotic hands, where the similarity of grasps is defined according to contact regions of the robotic hands. This property enables our method to transfer grasps between different grippers including a human hand, and grasp transfer has the potential to share grasping skills between robots and enable robots to learn grasping skills from humans. Furthermore, the encoded signed distance functions of objects and grasps in our implicit representation can be used for 6D object pose estimation with grasping contact optimization from partial point clouds, which enables robotic grasping in the real world.","PeriodicalId":273870,"journal":{"name":"Conference on Robot Learning","volume":"12 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-07-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Conference on Robot Learning","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.48550/arXiv.2207.02959","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 8

Abstract

We introduce a neural implicit representation for grasps of objects from multiple robotic hands. Different grasps across multiple robotic hands are encoded into a shared latent space. Each latent vector is learned to decode to the 3D shape of an object and the 3D shape of a robotic hand in a grasping pose in terms of the signed distance functions of the two 3D shapes. In addition, the distance metric in the latent space is learned to preserve the similarity between grasps across different robotic hands, where the similarity of grasps is defined according to contact regions of the robotic hands. This property enables our method to transfer grasps between different grippers including a human hand, and grasp transfer has the potential to share grasping skills between robots and enable robots to learn grasping skills from humans. Furthermore, the encoded signed distance functions of objects and grasps in our implicit representation can be used for 6D object pose estimation with grasping contact optimization from partial point clouds, which enables robotic grasping in the real world.
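The abstract describes a decoder that maps a shared grasp latent code, plus a 3D query point, to two signed distances: one to the object surface and one to the robotic hand in its grasping pose, with latent distances trained to track contact-region similarity between grasps. The paper itself gives no code here, so the following is only an illustrative sketch of that interface: the network sizes, names, and random (untrained) weights are all assumptions, standing in for the learned model.

```python
import numpy as np

# Illustrative sketch (not the paper's implementation): a grasp latent z and a
# 3D query point x are jointly decoded into (sdf_object, sdf_hand), the signed
# distances from x to the object surface and to the hand-in-grasp-pose surface.
rng = np.random.default_rng(0)

LATENT_DIM = 8   # size of the shared grasp latent space (assumed)
HIDDEN = 32      # hidden width of the stand-in decoder (assumed)

# Randomly initialized two-layer MLP standing in for the trained decoder.
W1 = rng.standard_normal((LATENT_DIM + 3, HIDDEN)) * 0.1
b1 = np.zeros(HIDDEN)
W2 = rng.standard_normal((HIDDEN, 2)) * 0.1  # 2 outputs: (sdf_object, sdf_hand)
b2 = np.zeros(2)

def decode_sdf(z, x):
    """Decode grasp latent z (LATENT_DIM,) and query point x (3,) into the
    pair (signed distance to object, signed distance to hand)."""
    h = np.maximum(np.concatenate([z, x]) @ W1 + b1, 0.0)  # ReLU layer
    return h @ W2 + b2

def grasp_distance(z_a, z_b):
    """Latent-space distance between two grasps; training would shape this
    metric to preserve contact-region similarity across grippers."""
    return float(np.linalg.norm(z_a - z_b))

z1 = rng.standard_normal(LATENT_DIM)  # latent for a grasp from hand A
z2 = rng.standard_normal(LATENT_DIM)  # latent for a grasp from hand B
query = np.array([0.05, 0.0, 0.1])    # a 3D query point near the object

sdf_obj, sdf_hand = decode_sdf(z1, query)
print(sdf_obj, sdf_hand, grasp_distance(z1, z2))
```

Because both grippers' grasps live in the same latent space, transferring a grasp to a new hand amounts to reusing (or nearest-neighbor matching) latent codes and decoding with the target hand's branch, which is the property the abstract highlights for robot-to-robot and human-to-robot transfer.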