Vision-based autonomous robots calibration for large-size workspace using ArUco map and single camera systems

IF 3.5 · JCR Q2 (Engineering, Manufacturing) · CAS Zone 2 (Engineering & Technology)
Yuanhao Yin, Dong Gao, Kenan Deng, Yong Lu
Precision Engineering - Journal of the International Societies for Precision Engineering and Nanotechnology, Volume 90, Pages 191-204. Published 2024-08-22.
DOI: 10.1016/j.precisioneng.2024.08.010
Full text: https://www.sciencedirect.com/science/article/pii/S0141635924001855
Citations: 0

Abstract

The low positioning accuracy of industrial robots limits their application in industry. Vision-based kinematic calibration, known for its rapid processing and economic efficiency, is an effective way to enhance this accuracy. However, most such methods are constrained by the camera's field of view, limiting their effectiveness in large workspaces. This paper proposes a novel calibration framework that combines monocular vision with ArUco-marker-based computer vision techniques. Firstly, a robot positioning error model was established by considering the kinematic error based on the Modified Denavit-Hartenberg model. Subsequently, a calibrated camera was used to create an ArUco map as an alternative to traditional single calibration targets. The map was constructed by stitching images of ArUco markers with unique identifiers, and its accuracy was enhanced through closed-loop detection and a global optimization that minimizes reprojection errors. Then, initial hand-eye parameters were determined, followed by acquiring the robot's end-effector pose through the ArUco map. The Levenberg-Marquardt algorithm was employed for calibration, iteratively refining the hand-eye and kinematic parameters. Finally, experimental validation was conducted on the KUKA KR500 industrial robot, with laser tracker measurements as the reference standard. Compared to the traditional checkerboard method, this new approach not only expands the calibration space but also significantly reduces the robot's absolute positioning error, from 1.359 mm to 0.472 mm.
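The iterative refinement the abstract describes (predict end-effector positions from a nominal kinematic model, compare against externally measured positions, and update the parameters with Levenberg-Marquardt) can be sketched on a toy two-link planar arm. The arm, the link-length parameters, and the joint samples below are illustrative assumptions for exposition only, not the paper's MDH error model or data:

```python
import math

def fk(lengths, q):
    # Forward kinematics of a 2-link planar arm: (x, y) of the tip.
    l1, l2 = lengths
    x = l1 * math.cos(q[0]) + l2 * math.cos(q[0] + q[1])
    y = l1 * math.sin(q[0]) + l2 * math.sin(q[0] + q[1])
    return (x, y)

def residuals(lengths, joints, targets):
    # Stacked (model - measurement) position errors over all configurations.
    r = []
    for q, t in zip(joints, targets):
        x, y = fk(lengths, q)
        r.extend([x - t[0], y - t[1]])
    return r

def levenberg_marquardt(params, joints, targets, iters=50, lam=1e-3):
    # Minimal LM for 2 parameters: finite-difference Jacobian,
    # damped normal equations solved in closed form (2x2 system).
    p = list(params)
    eps = 1e-6
    for _ in range(iters):
        r = residuals(p, joints, targets)
        # Forward-difference Jacobian, J[j][k] = d r_k / d p_j
        J = []
        for j in range(2):
            pp = list(p)
            pp[j] += eps
            rp = residuals(pp, joints, targets)
            J.append([(a - b) / eps for a, b in zip(rp, r)])
        # (J^T J + lam I) dp = -J^T r
        A = [[sum(J[i][k] * J[j][k] for k in range(len(r)))
              for j in range(2)] for i in range(2)]
        g = [sum(J[i][k] * r[k] for k in range(len(r))) for i in range(2)]
        A[0][0] += lam
        A[1][1] += lam
        det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
        dp = [(-g[0] * A[1][1] + g[1] * A[0][1]) / det,
              (g[0] * A[1][0] - g[1] * A[0][0]) / det]
        p = [a + b for a, b in zip(p, dp)]
    return p

# True link lengths (unknown to the calibrator) and a nominal guess.
true_len = (0.52, 0.31)
joints = [(0.1 * i, 0.2 * i) for i in range(1, 8)]
targets = [fk(true_len, q) for q in joints]  # "measured" tip positions
estimate = levenberg_marquardt([0.50, 0.30], joints, targets)
```

In the paper's setting, the "measurements" come from end-effector poses recovered via the ArUco map rather than a simulated ground truth, and the parameter vector covers the MDH kinematic errors and hand-eye transform rather than two link lengths, but the refinement loop has the same shape.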

Source journal metrics:
CiteScore: 7.40
Self-citation rate: 5.60%
Annual publications: 177
Review time: 46 days
Journal description: Precision Engineering - Journal of the International Societies for Precision Engineering and Nanotechnology is devoted to the multidisciplinary study and practice of high accuracy engineering, metrology, and manufacturing. The journal takes an integrated approach to all subjects related to research, design, manufacture, performance validation, and application of high precision machines, instruments, and components, including fundamental and applied research and development in manufacturing processes, fabrication technology, and advanced measurement science. The scope includes precision-engineered systems and supporting metrology over the full range of length scales, from atom-based nanotechnology and advanced lithographic technology to large-scale systems, including optical and radio telescopes and macrometrology.