Segmentation of unknown objects in indoor environments

A. Richtsfeld, Thomas Morwald, J. Prankl, M. Zillich, M. Vincze
DOI: 10.1109/IROS.2012.6385661
Published in: 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 4791-4796
Publication date: 2012-12-24
Citations: 172

Abstract

We present a framework for segmenting unknown objects in RGB-D images, suitable for robotics tasks such as object search, grasping, and manipulation. While handling single objects on a table is solved, handling complex scenes poses considerable problems due to clutter and occlusion. After pre-segmentation of the input image based on surface normals, surface patches are estimated using a mixture of planes and NURBS (non-uniform rational B-splines), and model selection is employed to find the best representation of the given data. We then construct a graph from surface patches and the relations between pairs of patches, and perform graph cut to arrive at object hypotheses segmented from the scene. The energy terms for patch relations are learned from user-annotated training data, where support vector machines (SVMs) are trained to classify a relation as being indicative of two patches belonging to the same object. We evaluate the relations and show results on a database of different test sets, demonstrating that the approach can segment objects of various shapes in cluttered tabletop scenes.
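The pipeline described above scores pairwise patch relations with a trained classifier and then partitions the patch graph into object hypotheses. The sketch below illustrates that structure only: a hand-weighted linear scorer stands in for the paper's trained SVM, and union-find grouping of high-scoring relations stands in for its graph-cut optimization. All feature names, weights, and thresholds here are hypothetical, not taken from the paper.

```python
import math

def relation_score(feat, weights, bias):
    # Linear stand-in for the trained SVM relation classifier:
    # higher output means the two patches more likely belong to
    # the same object. Squashed to (0, 1) with a sigmoid.
    s = sum(w * f for w, f in zip(weights, feat)) + bias
    return 1.0 / (1.0 + math.exp(-s))

def find(parent, i):
    # Union-find root lookup with path halving.
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

def segment(num_patches, relations, weights, bias, thr=0.5):
    # Group patches whose pairwise "same object" score exceeds a
    # threshold; each resulting group is one object hypothesis.
    # (A simplification of the paper's graph-cut step.)
    parent = list(range(num_patches))
    for i, j, feat in relations:
        if relation_score(feat, weights, bias) > thr:
            parent[find(parent, j)] = find(parent, i)
    groups = {}
    for p in range(num_patches):
        groups.setdefault(find(parent, p), []).append(p)
    return list(groups.values())

# Hypothetical example: 4 surface patches, relations described by two
# features (e.g. normal similarity, color similarity), weights chosen
# by hand purely for illustration.
relations = [
    (0, 1, [0.9, 0.8]),   # strong relation -> same object
    (1, 2, [0.1, 0.2]),   # weak relation   -> object boundary
    (2, 3, [0.7, 0.9]),   # strong relation -> same object
]
hypotheses = segment(4, relations, weights=[2.0, 2.0], bias=-2.0)
print(hypotheses)  # two object hypotheses: patches {0,1} and {2,3}
```

The actual system replaces the linear scorer with SVMs trained on user-annotated relations and solves a global graph-cut energy, so patch assignments can trade off against each other rather than being merged greedily.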