Visual tracking of an end-effector by adaptive kinematic prediction

A. Ruf, M. Tonko, R. Horaud, H. Nagel
{"title":"Visual tracking of an end-effector by adaptive kinematic prediction","authors":"A. Ruf, M. Tonko, R. Horaud, H. Nagel","doi":"10.1109/IROS.1997.655115","DOIUrl":null,"url":null,"abstract":"Presents results of a model-based approach to visual tracking and pose estimation for a moving polyhedral tool in position-based visual servoing. This enables the control of a robot in look-and-move mode to achieve six degree of freedom goal configurations. Robust solutions of the correspondence problem-known as \"matching\" in the static case and \"tracking\" in the dynamic one-are crucial to the feasibility of such an approach in real-world environments. The object's motion along an arbitrary trajectory in space is tracked using visual pose estimates through consecutive images. Subsequent positions are predicted from robot joint angle measurements. To deal with inaccurate models and to relax calibration requirements, adaptive online calibration of the kinematic chain is proposed. The kinematic predictions enable unambiguous feature matching by a pessimistic algorithm. The performance of the suggested algorithms and the robustness of the proposed system are evaluated on real image sequences of a moving gripper. The results fulfill the requirements of visual servoing, and the computational demands are sufficiently low to allow for real-time implementation.","PeriodicalId":408848,"journal":{"name":"Proceedings of the 1997 IEEE/RSJ International Conference on Intelligent Robot and Systems. Innovative Robotics for Real-World Applications. IROS '97","volume":"4 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1997-09-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"44","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 1997 IEEE/RSJ International Conference on Intelligent Robot and Systems. Innovative Robotics for Real-World Applications. IROS '97","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IROS.1997.655115","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 44

Abstract

This paper presents results of a model-based approach to visual tracking and pose estimation of a moving polyhedral tool for position-based visual servoing. This enables the control of a robot in look-and-move mode to achieve six-degree-of-freedom goal configurations. Robust solutions of the correspondence problem (known as "matching" in the static case and "tracking" in the dynamic one) are crucial to the feasibility of such an approach in real-world environments. The object's motion along an arbitrary trajectory in space is tracked using visual pose estimates through consecutive images. Subsequent positions are predicted from robot joint angle measurements. To deal with inaccurate models and to relax calibration requirements, adaptive online calibration of the kinematic chain is proposed. The kinematic predictions enable unambiguous feature matching by a pessimistic algorithm. The performance of the suggested algorithms and the robustness of the proposed system are evaluated on real image sequences of a moving gripper. The results fulfill the requirements of visual servoing, and the computational demands are sufficiently low to allow for real-time implementation.
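To make the prediction-and-adaptation idea in the abstract concrete, the sketch below illustrates one way such a step could look: the gripper pose in the camera frame is predicted by forward kinematics from measured joint angles, and a camera-to-base transform is nudged toward the latest visual pose estimate. This is not the authors' implementation (the paper adapts the kinematic chain itself); all names (`dh_params`, `T_cam_base`, `gain`) are hypothetical, and the blending step is only a crude stand-in for proper online calibration.

```python
# Illustrative sketch only, not the paper's algorithm.
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform for one Denavit-Hartenberg link (classic convention)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def forward_kinematics(joint_angles, dh_params):
    """Base-to-gripper transform composed link by link from measured joint angles."""
    T = np.eye(4)
    for q, (d, a, alpha) in zip(joint_angles, dh_params):
        T = T @ dh_transform(q, d, a, alpha)
    return T

def predict_pose_in_camera(joint_angles, dh_params, T_cam_base):
    """Kinematic prediction of the gripper pose as seen by the camera."""
    return T_cam_base @ forward_kinematics(joint_angles, dh_params)

def update_camera_base(T_cam_base, T_cam_gripper_measured, joint_angles,
                       dh_params, gain=0.1):
    """Crude online correction of the camera-to-base transform: pull it toward
    the value implied by the visual pose estimate and the current kinematics."""
    T_base_gripper = forward_kinematics(joint_angles, dh_params)
    T_cam_base_meas = T_cam_gripper_measured @ np.linalg.inv(T_base_gripper)
    # Blend elementwise, then re-orthonormalise the rotation block via SVD.
    T_new = (1.0 - gain) * T_cam_base + gain * T_cam_base_meas
    U, _, Vt = np.linalg.svd(T_new[:3, :3])
    T_new[:3, :3] = U @ Vt
    T_new[3, :] = [0.0, 0.0, 0.0, 1.0]
    return T_new
```

In a look-and-move loop of the kind described, such a prediction would be used to project the polyhedral model into the next image, restrict feature matching to unambiguous candidates, estimate the pose visually, and then feed the result back into the calibration update.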