A Visual Leader-Following Approach With a T-D-R Framework for Quadruped Robots

Lei Pang, Zhiqiang Cao, Junzhi Yu, Peiyu Guan, Xuewen Rong, Hui Chai
{"title":"基于T-D-R框架的四足机器人视觉领导跟随方法","authors":"Lei Pang, Zhiqiang Cao, Junzhi Yu, Peiyu Guan, Xuewen Rong, Hui Chai","doi":"10.1109/TSMC.2019.2912715","DOIUrl":null,"url":null,"abstract":"The quadruped robot imitates the motions of four-legged animals with a superior flexibility and adaptability to complex terrains, compared with the wheeled and tracked robots. Its leader-following ability is unique to help a human to accomplish complex tasks in a more convenient way. However, long-term following is severely obstructed due to the high-frequency vibration of the quadruped robot and the unevenness of terrains. To solve this problem, a visual approach under a novel T-D-R framework is proposed. The proposed T-D-R framework is composed of a visual tracker based on correlation filter, a person detector with deep learning, and a person re-identification (re-ID) module. The result of the tracker is verified by the detector to improve tracking performance. Especially, the re-ID module is introduced to handle distractions and occlusion caused by other persons, where the convolutional correlation filter (CCF) is employed to discriminate the leader among multiple persons through recording the appearance information in the long run. By comparing the results of the tracker and the detector as well as their similarity scores with the leader identified by the re-ID module, a stable and real-time tracking of the leader can be guaranteed. Experiments reveal that our approach is effective in handling distractions, appearance changes, and illumination variations. A long-distance experiment on a quadruped robot indicates the validity of the proposed approach.","PeriodicalId":55007,"journal":{"name":"IEEE Transactions on Systems Man and Cybernetics Part A-Systems and Humans","volume":"21 1","pages":"2342-2354"},"PeriodicalIF":0.0000,"publicationDate":"2021-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":"{\"title\":\"A Visual Leader-Following Approach With a T-D-R Framework for Quadruped Robots\",\"authors\":\"Lei Pang, Zhiqiang Cao, Junzhi Yu, Peiyu Guan, Xuewen Rong, Hui Chai\",\"doi\":\"10.1109/TSMC.2019.2912715\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The quadruped robot imitates the motions of four-legged animals with a superior flexibility and adaptability to complex terrains, compared with the wheeled and tracked robots. Its leader-following ability is unique to help a human to accomplish complex tasks in a more convenient way. However, long-term following is severely obstructed due to the high-frequency vibration of the quadruped robot and the unevenness of terrains. To solve this problem, a visual approach under a novel T-D-R framework is proposed. The proposed T-D-R framework is composed of a visual tracker based on correlation filter, a person detector with deep learning, and a person re-identification (re-ID) module. The result of the tracker is verified by the detector to improve tracking performance. Especially, the re-ID module is introduced to handle distractions and occlusion caused by other persons, where the convolutional correlation filter (CCF) is employed to discriminate the leader among multiple persons through recording the appearance information in the long run. By comparing the results of the tracker and the detector as well as their similarity scores with the leader identified by the re-ID module, a stable and real-time tracking of the leader can be guaranteed. 
Experiments reveal that our approach is effective in handling distractions, appearance changes, and illumination variations. A long-distance experiment on a quadruped robot indicates the validity of the proposed approach.\",\"PeriodicalId\":55007,\"journal\":{\"name\":\"IEEE Transactions on Systems Man and Cybernetics Part A-Systems and Humans\",\"volume\":\"21 1\",\"pages\":\"2342-2354\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-04-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"9\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Systems Man and Cybernetics Part A-Systems and Humans\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/TSMC.2019.2912715\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Systems Man and Cybernetics Part A-Systems and Humans","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/TSMC.2019.2912715","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 9

Abstract

Compared with wheeled and tracked robots, the quadruped robot imitates the motions of four-legged animals and offers superior flexibility and adaptability to complex terrains. Its leader-following ability is unique in helping a human accomplish complex tasks more conveniently. However, long-term following is severely hindered by the high-frequency vibration of the quadruped robot and the unevenness of terrains. To solve this problem, a visual approach under a novel T-D-R framework is proposed. The framework is composed of a visual tracker based on a correlation filter, a person detector based on deep learning, and a person re-identification (re-ID) module. The tracker's result is verified by the detector to improve tracking performance. In particular, the re-ID module is introduced to handle distractions and occlusions caused by other persons, where a convolutional correlation filter (CCF) is employed to discriminate the leader among multiple persons by recording appearance information over the long run. By comparing the results of the tracker and the detector, as well as their similarity scores with respect to the leader identified by the re-ID module, stable and real-time tracking of the leader is guaranteed. Experiments show that the approach is effective in handling distractions, appearance changes, and illumination variations. A long-distance experiment on a quadruped robot confirms the validity of the proposed approach.
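The abstract describes how the tracker (T), detector (D), and re-ID (R) outputs are combined: the detector verifies the tracker's box, and the re-ID module's similarity scores arbitrate when other persons distract or occlude the leader. The Python sketch below illustrates one plausible reading of that per-frame decision; the function names, thresholds, and the IoU-based verification rule are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of the T-D-R fusion logic described in the abstract.
# Thresholds and the IoU-based verification rule are assumptions.

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def select_leader(track_box, det_boxes, reid_score,
                  iou_thresh=0.5, sim_thresh=0.6):
    """Choose the leader box for the current frame.

    track_box  -- box from the correlation-filter tracker (T)
    det_boxes  -- person boxes from the deep-learning detector (D)
    reid_score -- callable returning the similarity of a box to the leader
                  template kept by the re-ID module (R), e.g. the peak
                  response of a convolutional correlation filter
    """
    # Verify the tracker with the detector: accept its box only if some
    # detection overlaps it strongly and the re-ID module still rates it
    # as the leader.
    if det_boxes:
        best_iou = max(iou(track_box, d) for d in det_boxes)
        if best_iou >= iou_thresh and reid_score(track_box) >= sim_thresh:
            return track_box

    # Otherwise fall back to the detection most similar to the leader
    # template, which handles drift, distraction by other persons, and
    # re-acquisition after occlusion.
    candidates = [(reid_score(d), d) for d in det_boxes]
    if candidates:
        best_sim, best_box = max(candidates, key=lambda t: t[0])
        if best_sim >= sim_thresh:
            return best_box
    return None  # leader not found in this frame
```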
Journal description: The scope of the IEEE Transactions on Systems, Man, and Cybernetics: Systems includes the fields of systems engineering. It includes issue formulation, analysis and modeling, decision making, and issue interpretation for any of the systems engineering lifecycle phases associated with the definition, development, and deployment of large systems. In addition, it includes systems management, systems engineering processes, and a variety of systems engineering methods such as optimization, modeling, and simulation.