An Object-Pose Estimation Acceleration Technique for Picking Robot Applications by Using Graph-Reusing k-NN Search

Atsutake Kosuge, T. Oshima
{"title":"基于图重用的k-NN搜索的拾取机器人目标姿态估计加速技术","authors":"Atsutake Kosuge, T. Oshima","doi":"10.1109/GC46384.2019.00018","DOIUrl":null,"url":null,"abstract":"An object-pose estimation acceleration technique for picking robot applications by using hierarchical-graph-reusing k-nearest-neighbor search (k-NN) has been developed. The conventional picking robots suffered from low picking throughput due to a large amount of computation of the object-pose estimation, especially the one for k-NN search, which determines plural neighboring points for every data point. To accelerate the k-NN search, this work introduces a hierarchical graph to the object-pose estimation for the first time instead of a conventional K-D tree since the former enables simultaneous acquisition of plural neighboring points. To save generation time of the hierarchical graph, a reuse of the generated graph is also proposed. Experiments of the proposed accelerating technique using Amazon Picking Contest data sets and Arm Cortex-A53 CPU have confirmed that the object-pose estimation takes 1.1 seconds (improved by a factor of 2.6), and the entire picking process (image recognition, object-pose estimation, and motion planning) takes 2.5 seconds (improved by a factor of 1.7).","PeriodicalId":129268,"journal":{"name":"2019 First International Conference on Graph Computing (GC)","volume":"41 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"An Object-Pose Estimation Acceleration Technique for Picking Robot Applications by Using Graph-Reusing k-NN Search\",\"authors\":\"Atsutake Kosuge, T. Oshima\",\"doi\":\"10.1109/GC46384.2019.00018\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"An object-pose estimation acceleration technique for picking robot applications by using hierarchical-graph-reusing k-nearest-neighbor search (k-NN) has been developed. The conventional picking robots suffered from low picking throughput due to a large amount of computation of the object-pose estimation, especially the one for k-NN search, which determines plural neighboring points for every data point. To accelerate the k-NN search, this work introduces a hierarchical graph to the object-pose estimation for the first time instead of a conventional K-D tree since the former enables simultaneous acquisition of plural neighboring points. To save generation time of the hierarchical graph, a reuse of the generated graph is also proposed. 
Experiments of the proposed accelerating technique using Amazon Picking Contest data sets and Arm Cortex-A53 CPU have confirmed that the object-pose estimation takes 1.1 seconds (improved by a factor of 2.6), and the entire picking process (image recognition, object-pose estimation, and motion planning) takes 2.5 seconds (improved by a factor of 1.7).\",\"PeriodicalId\":129268,\"journal\":{\"name\":\"2019 First International Conference on Graph Computing (GC)\",\"volume\":\"41 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-09-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 First International Conference on Graph Computing (GC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/GC46384.2019.00018\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 First International Conference on Graph Computing (GC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/GC46384.2019.00018","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 6

Abstract

An object-pose estimation acceleration technique for picking-robot applications that uses a hierarchical-graph-reusing k-nearest-neighbor (k-NN) search has been developed. Conventional picking robots suffer from low picking throughput because of the heavy computation of object-pose estimation, especially the k-NN search, which determines multiple neighboring points for every data point. To accelerate the k-NN search, this work introduces a hierarchical graph into object-pose estimation for the first time, in place of the conventional K-D tree, since the graph enables simultaneous acquisition of multiple neighboring points. To save the generation time of the hierarchical graph, reuse of the already-generated graph is also proposed. Experiments with the proposed acceleration technique on Amazon Picking Contest data sets and an Arm Cortex-A53 CPU confirm that the object-pose estimation takes 1.1 seconds (a 2.6x improvement) and that the entire picking process (image recognition, object-pose estimation, and motion planning) takes 2.5 seconds (a 1.7x improvement).
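The abstract describes the idea only at a high level; the sketch below is not the authors' implementation. It is a minimal Python illustration, with hypothetical names such as `build_proximity_graph` and `knn_search`, of two points from the abstract: a greedy search over a proximity graph returns several neighboring points in a single pass (unlike repeated K-D tree descents), and the graph, once built for the object model, can be reused for every query frame instead of being regenerated.

```python
# Illustrative sketch only (assumed structure, not the paper's code):
# a single-layer proximity graph over 3-D model points, searched greedily
# so that k neighbors are obtained together, and reused across frames.
import heapq
import numpy as np

def build_proximity_graph(points, degree=8):
    """Connect each point to its `degree` nearest points (brute force; built once)."""
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    graph = []
    for i in range(len(points)):
        order = np.argsort(dists[i])
        graph.append([int(j) for j in order[1:degree + 1]])  # skip self at position 0
    return graph

def knn_search(points, graph, query, k=5, entry=0):
    """Greedy best-first search on the graph; returns the k approximate neighbors at once."""
    def d(i):
        return float(np.linalg.norm(points[i] - query))
    visited = {entry}
    candidates = [(d(entry), entry)]   # min-heap: frontier nodes to expand
    best = [(-d(entry), entry)]        # max-heap (negated): k best nodes found so far
    while candidates:
        dist, node = heapq.heappop(candidates)
        if len(best) >= k and dist > -best[0][0]:
            break                      # nearest frontier node is worse than current k-th best
        for nb in graph[node]:
            if nb in visited:
                continue
            visited.add(nb)
            dn = d(nb)
            heapq.heappush(candidates, (dn, nb))
            heapq.heappush(best, (-dn, nb))
            if len(best) > k:
                heapq.heappop(best)    # drop the current worst of the k best
    return sorted((-nd, i) for nd, i in best)  # [(distance, index), ...]

# Graph reuse: build the graph for the object model once, then only run
# queries against it for each new frame instead of rebuilding per frame.
rng = np.random.default_rng(0)
model = rng.random((500, 3))
graph = build_proximity_graph(model)
for frame in range(3):
    scene_point = rng.random(3)
    neighbors = knn_search(model, graph, scene_point, k=5)
```

Building the graph once and querying it for every frame is what the abstract calls reuse of the generated graph; the brute-force O(n^2) construction here is for brevity only and stands in for whatever hierarchical construction the authors actually use.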