Comparing Position- and Image-Based Visual Servoing for Robotic Assembly of Large Structures

Yuan-Chih Peng, Devavrat Jivani, R. Radke, J. Wen
DOI: 10.1109/CASE48305.2020.9217028
Published in: 2020 IEEE 16th International Conference on Automation Science and Engineering (CASE), August 2020
Citations: 6

Abstract

This paper considers image-guided assembly for large composite panels. Using fiducial markers on the panels and gripper-mounted cameras, we are able to use an industrial robot to align the panels to sub-millimeter accuracy. We considered two commonly used visual servoing schemes: position-based visual servoing (PBVS) and image-based visual servoing (IBVS). IBVS is often noted for its superior robustness to camera calibration errors. However, we have found that in our case, PBVS is both faster and slightly more accurate than IBVS. This result is due to the fact that the visual servoing target in the image plane is derived from a reference target, which itself depends on the accuracy of the camera model. This additional dependency essentially nullifies the robustness advantage of IBVS. We also implemented a simple scheme to combine inputs from multiple cameras to improve the visual servoing accuracy. Both simulation and experimental results are included to show the effectiveness of visual servoing in an industrial setting.
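To make the comparison concrete, the two schemes contrasted in the abstract can be sketched as proportional control laws: PBVS drives a Cartesian pose error to zero, while IBVS drives an image-feature error to zero through the pseudo-inverse of the interaction matrix. The sketch below is a minimal, textbook-style illustration (following the classic Chaumette–Hutchinson formulation), not the authors' implementation; the function names, gain, and the assumption of known point depths `Z` are illustrative.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """2x6 image Jacobian of a normalized image point (x, y) at depth Z,
    mapping camera spatial velocity (v, w) to the point's image velocity."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def ibvs_velocity(points, targets, depths, gain=0.5):
    """IBVS law v = -gain * L^+ (s - s*): stack one interaction matrix per
    feature point and regulate the image-plane error directly."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(points, depths)])
    error = (np.asarray(points, float) - np.asarray(targets, float)).ravel()
    return -gain * np.linalg.pinv(L) @ error

def pbvs_velocity(t_err, rot_err_axis_angle, gain=0.5):
    """PBVS law: regulate the reconstructed Cartesian pose error
    (translation and axis-angle rotation) with a proportional gain."""
    return -gain * np.hstack([t_err, rot_err_axis_angle])
```

Note where the camera model enters each loop: PBVS uses it to reconstruct the pose error, while IBVS uses it both in the interaction matrix and, as the abstract points out, in deriving the image-plane target `s*` from the reference target, which is why the usual calibration-robustness advantage of IBVS did not materialize in this application.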