Autonomous learning-free grasping and robot-to-robot handover of unknown objects

Impact Factor 4.3 · CAS Region 3 (Computer Science) · JCR Q2 (Computer Science, Artificial Intelligence)
Yuwei Wu, Wanze Li, Zhiyang Liu, Weixiao Liu, Gregory S. Chirikjian
{"title":"Autonomous learning-free grasping and robot-to-robot handover of unknown objects","authors":"Yuwei Wu,&nbsp;Wanze Li,&nbsp;Zhiyang Liu,&nbsp;Weixiao Liu,&nbsp;Gregory S. Chirikjian","doi":"10.1007/s10514-025-10201-y","DOIUrl":null,"url":null,"abstract":"<div><p>In this paper, we propose a learning-free approach for an autonomous robotic system to grasp, hand over, and regrasp previously unseen objects. The proposed framework includes two main components: a novel grasping detector to predict grasping poses directly from the point cloud and a reachability-aware handover planner to select the exchange pose and grasping poses for two robots. In the grasping detection stage, multiple superquadrics are first recovered at different positions within the object, representing the local geometric feature of the object. Our algorithm then exploits the tri-symmetry feature of superquadrics and synthesizes a list of antipodal grasps from each recovered superquadric. An evaluation model is designed to assess and quantify the quality of each grasp candidate. In the handover planning stage, the planner first selects grasping candidates that have high scores and a larger number of collision-free partners. Then the exchange location is computed by utilizing two signed distance fields (SDF) which model the reachability space for the pair of two robots. To evaluate the performance of the proposed method, we first run experiments on isolated and packed scenes to corroborate the effectiveness of our grasping detection method. Then the handover experiments are conducted on a dual-arm system with two 7 degrees of freedom (DoF) manipulators. 
The results indicate that our method shows better performance compared with the state-of-the-art, without the need for large amounts of training.</p></div>","PeriodicalId":55409,"journal":{"name":"Autonomous Robots","volume":"49 3","pages":""},"PeriodicalIF":4.3000,"publicationDate":"2025-06-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://link.springer.com/content/pdf/10.1007/s10514-025-10201-y.pdf","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Autonomous Robots","FirstCategoryId":"94","ListUrlMain":"https://link.springer.com/article/10.1007/s10514-025-10201-y","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

In this paper, we propose a learning-free approach for an autonomous robotic system to grasp, hand over, and regrasp previously unseen objects. The proposed framework has two main components: a novel grasp detector that predicts grasping poses directly from the point cloud, and a reachability-aware handover planner that selects the exchange pose and grasping poses for the two robots. In the grasp detection stage, multiple superquadrics are first recovered at different positions within the object, representing its local geometric features. Our algorithm then exploits the tri-symmetry of superquadrics to synthesize a list of antipodal grasps from each recovered superquadric. An evaluation model assesses and quantifies the quality of each grasp candidate. In the handover planning stage, the planner first selects grasp candidates that have high scores and many collision-free partner grasps. The exchange location is then computed using two signed distance fields (SDFs) that model the reachability space of the robot pair. To evaluate the proposed method, we first run experiments on isolated and packed scenes to corroborate the effectiveness of our grasp detection method. Handover experiments are then conducted on a dual-arm system with two 7-degree-of-freedom (DoF) manipulators. The results indicate that our method outperforms the state of the art without requiring large amounts of training data.
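The abstract describes recovering superquadrics inside the object and exploiting their tri-symmetry to synthesize antipodal grasps. As a rough illustration of the underlying geometry (not the authors' implementation), the sketch below evaluates the standard superquadric inside-outside function and generates the three trivially antipodal contact pairs that the shape's symmetry guarantees along its principal axes; all parameter values are made up for the example.

```python
import numpy as np

def superquadric_F(p, a, eps):
    """Inside-outside function of a superquadric with semi-axes a = (a1, a2, a3)
    and shape exponents eps = (eps1, eps2). F == 1 on the surface, < 1 inside,
    > 1 outside."""
    x, y, z = p
    a1, a2, a3 = a
    e1, e2 = eps
    xy = (abs(x / a1) ** (2 / e2) + abs(y / a2) ** (2 / e2)) ** (e2 / e1)
    return xy + abs(z / a3) ** (2 / e1)

def axis_aligned_antipodal_pairs(a):
    """Toy candidate generator: for each principal axis of the superquadric,
    the two surface points at +a_i and -a_i face each other with opposing
    normals, so they form an antipodal contact pair by symmetry."""
    pairs = []
    for i in range(3):
        c = np.zeros(3)
        c[i] = a[i]
        pairs.append((c.copy(), -c.copy()))
    return pairs

# A unit sphere is the special case a = (1, 1, 1), eps = (1, 1).
assert abs(superquadric_F((1.0, 0.0, 0.0), (1, 1, 1), (1, 1)) - 1.0) < 1e-9

pairs = axis_aligned_antipodal_pairs((0.03, 0.02, 0.05))
print(len(pairs))  # one antipodal contact pair per principal axis
```

By tri-symmetry, reflecting any surface contact across a principal plane yields a matching opposing contact; the paper builds a much richer candidate list from each recovered superquadric and scores it with an evaluation model, which this toy generator does not attempt.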
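For the exchange-location step, the abstract says the planner combines two signed distance fields that model the reachability space of the robot pair. A minimal sketch of that idea, assuming each arm's workspace is crudely modeled as a sphere around its base (the spherical model, grid bounds, resolution, and radii are illustrative assumptions, not the paper's actual reachability fields):

```python
import numpy as np

def reachability_sdf(p, base, r_max):
    """Toy signed 'reachability' field for an arm modeled as a sphere of
    radius r_max about its base: positive inside the reachable workspace,
    negative outside."""
    return r_max - np.linalg.norm(np.asarray(p, dtype=float) - np.asarray(base, dtype=float))

def exchange_location(base_a, base_b, r_a, r_b, n=41):
    """Grid-search the point that both arms reach with the most margin,
    i.e. the point maximizing the smaller of the two reachability values."""
    lin = np.linspace(-1.0, 1.0, n)
    best, best_p = -np.inf, None
    for x in lin:
        for y in lin:
            for z in lin:
                p = (x, y, z)
                score = min(reachability_sdf(p, base_a, r_a),
                            reachability_sdf(p, base_b, r_b))
                if score > best:
                    best, best_p = score, p
    return np.array(best_p), best

p, score = exchange_location((-0.5, 0, 0), (0.5, 0, 0), 0.8, 0.8)
print(p, score)  # for two equal spheres, the midpoint between the bases
```

Maximizing the smaller of the two field values pushes the exchange point into the region both arms reach with margin to spare; for two equal spheres this is the midpoint between the bases. The paper's planner additionally fixes the grasping poses for both arms, which this sketch omits.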

Source journal: Autonomous Robots (Engineering & Technology: Robotics)
CiteScore: 7.90
Self-citation rate: 5.70%
Articles per year: 46
Review time: 3 months
Journal description: Autonomous Robots reports on the theory and applications of robotic systems capable of some degree of self-sufficiency. It features papers that include performance data on actual robots in the real world. Coverage includes: control of autonomous robots · real-time vision · autonomous wheeled and tracked vehicles · legged vehicles · computational architectures for autonomous systems · distributed architectures for learning, control and adaptation · studies of autonomous robot systems · sensor fusion · theory of autonomous systems · terrain mapping and recognition · self-calibration and self-repair for robots · self-reproducing intelligent structures · genetic algorithms as models for robot development. The focus is on the ability to move and be self-sufficient, not on whether the system is an imitation of biology. Of course, biological models for robotic systems are of major interest to the journal since living systems are prototypes for autonomous behavior.