MotionGrasp: Long-Term Grasp Motion Tracking for Dynamic Grasping

IF 4.6 · CAS Zone 2 (Computer Science) · JCR Q2 (Robotics)
Nuo Chen;Xiao-Ming Wu;Guohao Xu;Jian-Jian Jiang;Zibo Chen;Wei-Shi Zheng
{"title":"MotionGrasp: Long-Term Grasp Motion Tracking for Dynamic Grasping","authors":"Nuo Chen;Xiao-Ming Wu;Guohao Xu;Jian-Jian Jiang;Zibo Chen;Wei-Shi Zheng","doi":"10.1109/LRA.2024.3504792","DOIUrl":null,"url":null,"abstract":"Dynamic grasping, which aims to grasp moving objects in unstructured environment, is crucial for robotics community. Previous methods propose to track the initial grasps or objects by matching between the latest two frames. However, this neighbour-frame matching strategy ignores the long-term historical trajectory in tracking, resulting in accumulated error. To address this, we present a novel dynamic grasping framework, delicately taking the long-term trajectory into account in grasp tracking. To model the long-term trajectory well, we introduce the concept of Grasp Motion, the changes of grasps between frames, endowing the model with the dynamic modeling ability. Benefiting from the Grasp Motion, we are able to conduct accurate motion association, which associates the grasp generated in current frame to the long-term grasp trajectory and mitigates accumulated error. Moreover, since the generated grasps in current frame may not precisely align with the ground-truth grasp for the trajectory, which results in deviation when we put it into the trajectory for future association, we further design a motion alignment module to compensate it. Our experiments show that the MotionGrasp achieves great grasping performance in dynamic grasping, obtaining 20% increase compared to the previous SOTA method in the large-scale GraspNet-1billion dataset. Our experiments also verify that Grasp Motion is a key to the success of long-term modeling. The real-world experiments further verify the effectiveness of our method.","PeriodicalId":13241,"journal":{"name":"IEEE Robotics and Automation Letters","volume":"10 1","pages":"796-803"},"PeriodicalIF":4.6000,"publicationDate":"2024-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Robotics and Automation Letters","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10764717/","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ROBOTICS","Score":null,"Total":0}
引用次数: 0

Abstract

Dynamic grasping, which aims to grasp moving objects in unstructured environments, is crucial for the robotics community. Previous methods track the initial grasps or objects by matching between the latest two frames. However, this neighbor-frame matching strategy ignores the long-term historical trajectory during tracking, resulting in accumulated error. To address this, we present a novel dynamic grasping framework that explicitly takes the long-term trajectory into account in grasp tracking. To model the long-term trajectory well, we introduce the concept of Grasp Motion, the change of grasps between frames, which endows the model with dynamic modeling ability. Benefiting from Grasp Motion, we are able to conduct accurate motion association, which associates the grasps generated in the current frame with the long-term grasp trajectory and mitigates accumulated error. Moreover, since a grasp generated in the current frame may not precisely align with the ground-truth grasp of the trajectory, which causes deviation when it is added to the trajectory for future association, we further design a motion alignment module to compensate for this. Our experiments show that MotionGrasp achieves strong performance in dynamic grasping, obtaining a 20% improvement over the previous state-of-the-art method on the large-scale GraspNet-1Billion dataset. Our experiments also verify that Grasp Motion is key to the success of long-term modeling. Real-world experiments further verify the effectiveness of our method.
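The abstract only sketches the idea in prose, so the following minimal Python/NumPy sketch is an illustration of the concept, not the authors' implementation: the class name `GraspTrajectory`, the helper `grasp_motion`, the constant-motion prediction, and the nearest-neighbor association are all our assumptions. It treats a grasp as a 6-DoF pose (a 4x4 SE(3) matrix) and models Grasp Motion as the relative transform between a grasp's poses in consecutive frames, accumulating these motions over the whole trajectory rather than matching only the latest two frames.

```python
# Hypothetical sketch of the Grasp Motion idea; not the paper's code.
import numpy as np

def grasp_motion(pose_prev: np.ndarray, pose_curr: np.ndarray) -> np.ndarray:
    """Relative SE(3) transform taking the previous grasp pose to the current one."""
    return pose_curr @ np.linalg.inv(pose_prev)

class GraspTrajectory:
    """Long-term history of one tracked grasp: its poses and inter-frame motions."""

    def __init__(self, initial_pose: np.ndarray):
        self.poses = [initial_pose]
        self.motions = []  # one SE(3) motion per frame transition

    def predict(self) -> np.ndarray:
        """Constant-motion prediction: re-apply the last observed Grasp Motion."""
        if not self.motions:
            return self.poses[-1]
        return self.motions[-1] @ self.poses[-1]

    def associate(self, candidate_poses: list[np.ndarray]) -> int:
        """Index of the current-frame grasp closest (in translation) to the prediction."""
        pred_t = self.predict()[:3, 3]
        dists = [np.linalg.norm(p[:3, 3] - pred_t) for p in candidate_poses]
        return int(np.argmin(dists))

    def update(self, new_pose: np.ndarray) -> None:
        """Append the associated grasp and record its Grasp Motion."""
        self.motions.append(grasp_motion(self.poses[-1], new_pose))
        self.poses.append(new_pose)
```

In the paper, both the association and the correction of the associated grasp before it enters the trajectory (the motion alignment module) are learned components; the nearest-neighbor matching and constant-motion heuristics above only stand in for them to make the data flow concrete.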
Source Journal

IEEE Robotics and Automation Letters (Computer Science: Computer Science Applications)
CiteScore: 9.60
Self-citation rate: 15.40%
Annual publications: 1428
Journal scope: The scope of this journal is to publish peer-reviewed articles that provide a timely and concise account of innovative research ideas and application results, reporting significant theoretical findings and application case studies in areas of robotics and automation.