SimPLE, a visuotactile method learned in simulation to precisely pick, localize, regrasp, and place objects

Impact factor: 26.1 · CAS Tier 1, Computer Science · Q1 Robotics
Maria Bauza, Antonia Bronars, Yifan Hou, Ian Taylor, Nikhil Chavan-Dafle, Alberto Rodriguez
Science Robotics, Vol. 9, Issue 91 · Published 2024-06-26 · DOI: 10.1126/scirobotics.adi8808
https://www.science.org/doi/10.1126/scirobotics.adi8808
Citation count: 0

Abstract

Existing robotic systems have a tension between generality and precision. Deployed solutions for robotic manipulation tend to fall into the paradigm of one robot solving a single task, lacking “precise generalization,” or the ability to solve many tasks without compromising on precision. This paper explores solutions for precise and general pick and place. In precise pick and place, or kitting, the robot transforms an unstructured arrangement of objects into an organized arrangement, which can facilitate further manipulation. We propose SimPLE (Simulation to Pick Localize and placE) as a solution to precise pick and place. SimPLE learns to pick, regrasp, and place objects given the object’s computer-aided design model and no prior experience. We developed three main components: task-aware grasping, visuotactile perception, and regrasp planning. Task-aware grasping computes affordances of grasps that are stable, observable, and favorable to placing. The visuotactile perception model relies on matching real observations against a set of simulated ones through supervised learning to estimate a distribution of likely object poses. Last, we computed a multistep pick-and-place plan by solving a shortest-path problem on a graph of hand-to-hand regrasps. On a dual-arm robot equipped with visuotactile sensing, SimPLE demonstrated pick and place of 15 diverse objects. The objects spanned a wide range of shapes, and SimPLE achieved successful placements into structured arrangements with 1-mm clearance more than 90% of the time for six objects and more than 80% of the time for 11 objects.
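The abstract frames multistep regrasp planning as a shortest-path problem on a graph of hand-to-hand regrasps. A minimal sketch of that idea is below: nodes stand for grasps, edges for feasible hand-to-hand transfers, and Dijkstra's algorithm finds the cheapest sequence from a pickable grasp to a placeable one. The graph structure, grasp names, and edge costs are illustrative assumptions, not the authors' implementation.

```python
# Sketch of regrasp planning as shortest path on a graph of grasps.
# Nodes are grasps (how the object sits in a hand); an edge means a
# hand-to-hand regrasp between the two grasps is feasible, weighted by
# an assumed cost (e.g., expected execution time or failure risk).
import heapq

def shortest_regrasp_plan(graph, start_grasps, goal_grasps):
    """Dijkstra over a regrasp graph.

    graph: dict mapping grasp -> list of (neighbor_grasp, cost) pairs.
    start_grasps: grasps reachable by the initial pick.
    goal_grasps: grasps from which the final placement succeeds.
    Returns (total_cost, grasp_sequence) or None if no plan exists.
    """
    goal = set(goal_grasps)
    # Seed the frontier with every grasp the initial pick can achieve.
    pq = [(0.0, g, [g]) for g in start_grasps]
    heapq.heapify(pq)
    best = {}
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node in goal:
            return cost, path
        if best.get(node, float("inf")) <= cost:
            continue  # already reached this grasp more cheaply
        best[node] = cost
        for nxt, w in graph.get(node, []):
            heapq.heappush(pq, (cost + w, nxt, path + [nxt]))
    return None  # no chain of regrasps connects pick to place
```

For example, with a toy graph where a top pick must be handed to the other arm before a placement-compatible grasp is reached, `shortest_regrasp_plan({"pick_top": [("left_side", 1.0)], "left_side": [("place_ready", 1.0)]}, ["pick_top"], ["place_ready"])` returns the two-regrasp plan with total cost 2.0.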
Source journal: Science Robotics (Mathematics — Control and Optimization)
CiteScore: 30.60 · Self-citation rate: 2.80% · Articles per year: 83
Journal description: Science Robotics publishes original, peer-reviewed, science- or engineering-based research articles that advance the field of robotics. The journal also features editor-commissioned Reviews. An international team of academic editors holds Science Robotics articles to the same high-quality standard that is the hallmark of the Science family of journals. Sub-topics include: actuators, advanced materials, artificial intelligence, autonomous vehicles, bio-inspired design, exoskeletons, fabrication, field robotics, human-robot interaction, humanoids, industrial robotics, kinematics, machine learning, material science, medical technology, motion planning and control, micro- and nano-robotics, multi-robot control, sensors, service robotics, social and ethical issues, soft robotics, and space, planetary and undersea exploration.