A Robust Object Tracking and Visual Servo Method for Mobile Robot

Lu-yi Chen, Mingdi Niu, Sheng Wang, Peng Wu, Yuanhao Li
2022 IEEE International Conference on Real-time Computing and Robotics (RCAR), published 2022-07-17. DOI: 10.1109/RCAR54675.2022.9872244

Abstract

General Siamese-network-based object tracking methods tend to generate the final score map from high-level features and to treat features from every position equally, which can lead to a large search region and low efficiency. To address these problems, this paper proposes a fully-connected Siamese network tracking method based on histogram-of-gradient (HOG) feature-similarity computation and on feedback from a fading-memory Kalman filter. This strategy enables real-time correction and compensation, so the tracker can re-acquire the target even when it is occluded or temporarily lost. The bounding box produced by the tracker is used to generate the control command and realize image-based visual servoing. Comparative experiments against other methods on several public datasets demonstrate the method's effectiveness. In addition, we design a mobile robot tracking system to test the algorithm's performance in real-world scenarios. Experimental results show that the robot tracks the target accurately and continues to track it despite occlusion or temporary disappearance.
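The abstract's "feedback of the fading-memory Kalman filter" can be illustrated with a minimal sketch. The paper does not give its state model or parameters, so the following assumes a constant-velocity model over the bounding-box centre and a fading-memory factor `alpha` (a common formulation in which the predicted covariance is inflated by `alpha²`, discounting old data so the filter re-converges quickly after occlusion); all names and values here are illustrative, not from the paper.

```python
import numpy as np

class FadingMemoryKF:
    """Constant-velocity Kalman filter with a fading-memory factor.

    State: [x, y, vx, vy]; measurement: bounding-box centre [x, y].
    alpha > 1 discounts old observations so the filter adapts quickly
    when the target reappears; alpha = 1 recovers the standard KF.
    """

    def __init__(self, dt=1.0, alpha=1.05, q=1e-2, r=1.0):
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.Q = q * np.eye(4)        # process noise
        self.R = r * np.eye(2)        # measurement noise
        self.alpha2 = alpha ** 2      # fading-memory inflation factor
        self.x = np.zeros(4)
        self.P = np.eye(4)

    def predict(self):
        self.x = self.F @ self.x
        # Fading memory: inflate the predicted covariance by alpha^2.
        self.P = self.alpha2 * (self.F @ self.P @ self.F.T) + self.Q
        return self.x[:2]

    def update(self, z):
        # Standard Kalman correction with measurement z = [cx, cy].
        y = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

During occlusion one would call only `predict()` to extrapolate the search position, then resume `update()` once the similarity check (e.g. the HOG comparison) confirms the target has reappeared.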
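The last step, turning the tracked bounding box into a robot control command, can likewise be sketched. The paper does not specify its control law, so this is a minimal proportional image-based scheme under stated assumptions: steer to centre the box horizontally, and regulate forward speed to hold a desired box height as a proxy for range. All gains, the image width, and the desired height are hypothetical.

```python
def servo_command(bbox, img_w=640, target_h=200,
                  k_w=0.004, k_v=0.01, v_max=0.5, w_max=1.0):
    """Map a bounding box (x, y, w, h) in pixels to (v, wz) commands.

    v  : forward velocity (m/s), positive drives toward the target.
    wz : yaw rate (rad/s), positive turns toward a target left of centre.
    A proportional sketch only; gains are illustrative, not from the paper.
    """
    x, y, w, h = bbox
    cx = x + w / 2.0
    err_x = (img_w / 2.0) - cx      # > 0: target is left of image centre
    err_h = target_h - h            # > 0: box too small, target too far
    # Saturated proportional control on both errors.
    wz = max(-w_max, min(w_max, k_w * err_x))
    v = max(-v_max, min(v_max, k_v * err_h))
    return v, wz
```

For example, a box centred in the image at the desired height yields `(0.0, 0.0)` (hold position), while a small box on the left of the frame yields positive `v` and positive `wz` (drive forward while turning left).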