Anchor-Based 6D Object Pose Estimation

Zehao Liu, Hao Wang, Fuchang Liu
{"title":"Anchor-Based 6D Object Pose Estimation","authors":"Zehao Liu, Hao Wang, Fuchang Liu","doi":"10.1109/ICVR51878.2021.9483838","DOIUrl":null,"url":null,"abstract":"Estimating the 6D pose of known objects is an important task for many robotic and AR applications. The most recent trend in 6D pose estimation has been to take advantage of deep networks to either directly regress the pose from the image or to first predict the 2D locations of 3D keypoints and then recover the pose according to the 2D-3D coordinates relationship of keypoints using a PnP algorithm. However, most of these methods are vulnerable to occlusions. In this paper, we present a simple but efficient method to estimate 6D pose of objects using anchor-based corner detection, based on two-stage detection backbone (i.e. Faster R-CNN Ren et al. (2015)). Instead of directly predicting two-dimensional coordinates of the projected 3D bounding box (i.e. corners), we regress relative coordinates of top corners based on the two-dimensional anchor and diagonals of corresponding faces. Bottom corners are further robustly inferred using geometrical constraints of face diagonals. Experiments show that our method achieves significant improvement on LineMod S. Hinterstoisser et al. (2012) and Occlusion Brachmann et al. (2014a) dataset, outperforming most existing 6D pose estimation methods without using refinement.","PeriodicalId":266506,"journal":{"name":"2021 IEEE 7th International Conference on Virtual Reality (ICVR)","volume":"43 11","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-05-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE 7th International Conference on Virtual Reality (ICVR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICVR51878.2021.9483838","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Estimating the 6D pose of known objects is an important task for many robotic and AR applications. The recent trend in 6D pose estimation is to leverage deep networks either to regress the pose directly from the image or to first predict the 2D locations of 3D keypoints and then recover the pose from the resulting 2D-3D correspondences with a PnP algorithm. However, most of these methods are vulnerable to occlusion. In this paper, we present a simple but efficient method for estimating the 6D pose of objects using anchor-based corner detection, built on a two-stage detection backbone (Faster R-CNN, Ren et al. (2015)). Instead of directly predicting the 2D coordinates of the projected 3D bounding-box corners, we regress the coordinates of the top corners relative to the 2D anchor, together with the diagonals of the corresponding faces. The bottom corners are then robustly inferred from the geometric constraints imposed by the face diagonals. Experiments show that our method achieves significant improvements on the LineMod (Hinterstoisser et al. (2012)) and Occlusion (Brachmann et al. (2014a)) datasets, outperforming most existing 6D pose estimation methods without using refinement.
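The pipeline the abstract describes, decoding the eight projected bounding-box corners from an anchor-relative parameterisation and then recovering the pose with PnP, can be sketched in a few lines. The sketch below is illustrative only: the decoding rule (top corners as scaled offsets from the anchor centre, bottom corners as a top corner plus the regressed face diagonal), the corner ordering, and all names and values are assumptions, not the authors' implementation. Only the final PnP step uses a known API (OpenCV's solvePnP).

```python
# Minimal sketch of an anchor-based corner-decoding + PnP pipeline,
# assuming a hypothetical parameterisation; not the paper's actual code.
import numpy as np
import cv2


def decode_corners(anchor, top_offsets, face_diagonals):
    """Decode the 8 projected box corners (2D) from one anchor.

    anchor:         (cx, cy, w, h) of the 2D anchor box.
    top_offsets:    (4, 2) offsets of the 4 top corners, assumed to be
                    relative to the anchor centre and scaled by its size.
    face_diagonals: (4, 2) regressed 2D diagonal vectors of the side
                    faces; each bottom corner is assumed to be the top
                    corner of the same face plus that face's diagonal.
    """
    cx, cy, w, h = anchor
    top = np.array([cx, cy]) + top_offsets * np.array([w, h])  # (4, 2)
    bottom = top + face_diagonals                               # (4, 2)
    return np.vstack([top, bottom]).astype(np.float32)          # (8, 2)


def pose_from_corners(corners_2d, model_corners_3d, K):
    """Standard PnP step: 2D-3D corner correspondences -> (R, t)."""
    ok, rvec, tvec = cv2.solvePnP(
        model_corners_3d.astype(np.float32),   # 3D box corners, model frame
        corners_2d.reshape(-1, 1, 2),          # their predicted 2D locations
        K.astype(np.float32), None,            # intrinsics, no distortion
        flags=cv2.SOLVEPNP_EPNP)
    R, _ = cv2.Rodrigues(rvec)                 # axis-angle -> 3x3 rotation
    return R, tvec


if __name__ == "__main__":
    # Stand-ins for network outputs and dataset constants (all hypothetical).
    anchor = (320.0, 240.0, 96.0, 96.0)
    top_offsets = np.random.uniform(-0.5, 0.5, (4, 2))
    face_diagonals = np.random.uniform(20.0, 60.0, (4, 2))
    K = np.array([[572.4, 0.0, 325.3],          # LineMod-style intrinsics
                  [0.0, 573.6, 242.0],
                  [0.0, 0.0, 1.0]])
    # 3D bounding-box corners from hypothetical half-extents (metres);
    # ordering (4 top then 4 bottom) must match decode_corners above.
    ext = np.array([0.05, 0.04, 0.03])
    signs = np.array([[x, y, z] for z in (1, -1)
                      for x in (-1, 1) for y in (-1, 1)])
    model_corners_3d = signs * ext

    corners_2d = decode_corners(anchor, top_offsets, face_diagonals)
    R, t = pose_from_corners(corners_2d, model_corners_3d, K)
    print(R, t)
```

In a real setting, model_corners_3d would come from the object's 3D model (e.g. the LineMod mesh's bounding box) and K from the dataset's camera calibration; any PnP variant, such as the iterative solver with RANSAC, could replace the EPnP flag used here.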