High-accuracy 6-D pose measurement method for 3C thin parts in robotic assembly by monocular vision

Optics and Laser Technology · IF 4.6 · CAS Region 2 (Physics and Astronomy) · JCR Q1 (Optics)
Bin Wang, Jiwen Zhang, Song Wang, Dan Wu
Journal: Optics and Laser Technology, Volume 181, Article 111937
DOI: 10.1016/j.optlastec.2024.111937
Published: 2024-10-10
URL: https://www.sciencedirect.com/science/article/pii/S0030399224013951
Citations: 0

Abstract

The 6-D pose measurement of 3C thin parts is essential for the alignment process in robotic assembly. Because the parts are thin, features from the two surfaces are coupled and difficult to distinguish in the image, which degrades the accuracy of pose measurement. In this study, a 6-D pose measurement method based on monocular vision is proposed. An algorithm named dual-surface reprojection contour error optimization (DSRCEO) is proposed to simultaneously optimize the features from both surfaces and thereby improve the accuracy of pose measurement. In DSRCEO, a computational domain and an image domain are constructed, and an error index, the DSRCE, is derived from a comprehensive consideration of the two domains to evaluate the quality of the current pose estimate. By minimizing the DSRCE, the initial estimate of the 6-D pose is iteratively optimized, continuously improving the measurement accuracy. The DSRCEO framework applies to thin parts of various shapes, and specific algorithm implementations are derived for the two most common shapes (trimmed circle and polygon). Finally, the accuracy and applicability of the proposed algorithm are verified through extensive simulations and experiments.
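The abstract does not give the algorithm itself, so the following is only a minimal sketch of the underlying idea: sample contour points on both surfaces of a thin part, reproject them under a candidate pose through a pinhole camera, and score the pose by the reprojection contour error. The intrinsics `K`, the part geometry, the pose parameterization (a tilt about the x-axis plus a translation), and the index-based point correspondence are all illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

# Hypothetical pinhole intrinsics (focal length in px, principal point).
K = np.array([[800.,   0., 320.],
              [  0., 800., 240.],
              [  0.,   0.,   1.]])

def rot_x(phi):
    """Rotation about the x-axis (an illustrative 1-DOF tilt)."""
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[1., 0., 0.],
                     [0., c, -s],
                     [0., s,  c]])

def contour_points(radius=0.02, thickness=0.001, n=64):
    """Sample the circular contour on both surfaces of a thin part (metres)."""
    a = np.linspace(0., 2. * np.pi, n, endpoint=False)
    top = np.stack([radius * np.cos(a), radius * np.sin(a), np.zeros(n)], axis=1)
    bottom = top + np.array([0., 0., -thickness])   # second surface, offset by thickness
    return np.vstack([top, bottom])

def project(pts, phi, t):
    """Reproject model contour points under pose (tilt phi, translation t)."""
    cam = pts @ rot_x(phi).T + t                    # model frame -> camera frame
    uv = cam @ K.T                                  # pinhole projection
    return uv[:, :2] / uv[:, 2:3]

def dsrce(phi, t, observed):
    """Mean squared reprojection contour error over both surfaces.

    The paper matches reprojected model contours against detected image
    contours; here the correspondence is simply by sample index.
    """
    pred = project(contour_points(), phi, t)
    return np.mean(np.sum((pred - observed) ** 2, axis=1))
```

In this toy setup, the error vanishes at the true pose and grows as the candidate pose is perturbed, which is the property a minimizer would exploit to iteratively refine an initial pose estimate.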
Journal metrics
CiteScore: 8.50
Self-citation rate: 10.00%
Articles per year: 1060
Average review time: 3.4 months
Journal description: Optics & Laser Technology aims to provide a vehicle for the publication of a broad range of high-quality research and review papers in those fields of scientific and engineering research appertaining to the development and application of the technology of optics and lasers. Papers describing original work in these areas are submitted to rigorous refereeing prior to acceptance for publication. The scope of Optics & Laser Technology encompasses, but is not restricted to, the following areas:
•development in all types of lasers
•developments in optoelectronic devices and photonics
•developments in new photonics and optical concepts
•developments in conventional optics, optical instruments and components
•techniques of optical metrology, including interferometry and optical fibre sensors
•LIDAR and other non-contact optical measurement techniques, including optical methods in heat and fluid flow
•applications of lasers to materials processing, optical NDT display (including holography) and optical communication
•research and development in the field of laser safety including studies of hazards resulting from the applications of lasers (laser safety, hazards of laser fume)
•developments in optical computing and optical information processing
•developments in new optical materials
•developments in new optical characterization methods and techniques
•developments in quantum optics
•developments in light assisted micro and nanofabrication methods and techniques
•developments in nanophotonics and biophotonics
•developments in imaging processing and systems