Fast UOIS: Unseen Object Instance Segmentation with Adaptive Clustering for Industrial Robotic Grasping

IF 2.2 · JCR Q2 (Engineering, Mechanical) · CAS Region 3 (Engineering & Technology)
Actuators · Pub Date: 2024-08-09 · DOI: 10.3390/act13080305
Kui Fu, X. Dang, Qingyu Zhang, Jiansheng Peng
Citations: 0

Abstract

Segmenting unseen object instances in unstructured environments is an important skill for robots performing grasping-related tasks, where the trade-off between efficiency and accuracy remains a pressing challenge. In this work, we propose a fast unseen object instance segmentation (Fast UOIS) method that uses predicted object center offsets to compute the positions of local maxima and minima, which in turn select the initial seed points required by the mean-shift clustering algorithm. By adaptively generating seed points, this clustering algorithm quickly and accurately obtains instance masks of unseen objects. Accordingly, Fast UOIS first generates pixel-wise predictions of object classes and center offsets from synthetic depth images. These predictions are then used by the clustering algorithm to compute initial seed points and to find candidate object instances. Finally, the depth information corresponding to the filtered instance masks is fed into a grasp generation network to produce grasp poses. Benchmark experiments show that our method transfers well to the real world and quickly generates sharp, accurate instance masks. Furthermore, we demonstrate that our method can segment instance masks of unseen objects for robotic grasping.
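The pipeline described in the abstract — accumulate per-pixel center-offset votes, take local maxima of the vote map as adaptive seed points, then run mean shift from only those seeds — can be sketched roughly as follows. This is a hypothetical NumPy re-implementation, not the authors' code: the function names, the voting heatmap, the 3×3 local-maximum test, and the flat-kernel mean shift are all assumptions made for illustration.

```python
import numpy as np

def select_seeds(offsets, fg_mask, min_votes=5):
    """Accumulate center votes (pixel position + predicted offset) into a
    2-D heatmap and return its 3x3 local maxima as initial seed points.
    Hypothetical sketch of the adaptive seed selection, not the paper's code."""
    H, W = fg_mask.shape
    ys, xs = np.nonzero(fg_mask)
    pixels = np.stack([ys, xs], axis=1).astype(float)
    centers = pixels + offsets[ys, xs]          # each pixel votes for a center
    votes = np.zeros((H, W))
    cy = np.clip(np.round(centers[:, 0]).astype(int), 0, H - 1)
    cx = np.clip(np.round(centers[:, 1]).astype(int), 0, W - 1)
    np.add.at(votes, (cy, cx), 1)               # unbuffered accumulation
    seeds = []
    for y, x in zip(*np.nonzero(votes >= min_votes)):
        patch = votes[max(0, y - 1):y + 2, max(0, x - 1):x + 2]
        if votes[y, x] == patch.max():          # 3x3 local maximum -> seed
            seeds.append((y, x))
    return np.array(seeds, dtype=float), centers, pixels.astype(int)

def mean_shift(points, seeds, bandwidth=3.0, iters=20):
    """Flat-kernel mean shift started only from the adaptive seeds, so the
    number of clusters adapts to the number of detected vote peaks."""
    modes = seeds.copy()
    for _ in range(iters):
        for i, m in enumerate(modes):
            d = np.linalg.norm(points - m, axis=1)
            near = points[d < bandwidth]
            if len(near):
                modes[i] = near.mean(axis=0)    # shift mode to local mean
    return modes

def segment(offsets, fg_mask, bandwidth=3.0):
    """Instance labels (0 = background) from offsets + foreground mask."""
    seeds, centers, pixels = select_seeds(offsets, fg_mask)
    modes = mean_shift(centers, seeds, bandwidth)
    # assign each foreground pixel to the nearest converged mode
    labels = np.argmin(
        np.linalg.norm(centers[:, None] - modes[None], axis=2), axis=1)
    masks = np.zeros(fg_mask.shape, dtype=int)
    masks[pixels[:, 0], pixels[:, 1]] = labels + 1
    return masks
```

Because seeds come only from vote peaks, the clustering does not need the number of objects in advance, which is the efficiency argument the abstract makes against dense seeding.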
Source journal: Actuators (Mathematics — Control and Optimization)
CiteScore: 3.90 · Self-citation rate: 15.40% · Articles per year: 315 · Review time: 11 weeks
Journal description: Actuators (ISSN 2076-0825; CODEN: ACTUC3) is an international open access journal on the science and technology of actuators and control systems, published quarterly online by MDPI.