MotionGrasp: Long-Term Grasp Motion Tracking for Dynamic Grasping
Nuo Chen; Xiao-Ming Wu; Guohao Xu; Jian-Jian Jiang; Zibo Chen; Wei-Shi Zheng
IEEE Robotics and Automation Letters, vol. 10, no. 1, pp. 796-803
DOI: 10.1109/LRA.2024.3504792 | Published: 2024-11-22
Citations: 0
Abstract
Dynamic grasping, which aims to grasp moving objects in unstructured environments, is crucial for the robotics community. Previous methods track the initial grasps or objects by matching between the latest two frames. However, this neighbour-frame matching strategy ignores the long-term historical trajectory during tracking, resulting in accumulated error. To address this, we present a novel dynamic grasping framework that explicitly takes the long-term trajectory into account during grasp tracking. To model the long-term trajectory well, we introduce the concept of Grasp Motion, the change of grasps between frames, endowing the model with dynamic modeling ability. Benefiting from Grasp Motion, we are able to conduct accurate motion association, which associates the grasp generated in the current frame with the long-term grasp trajectory and mitigates accumulated error. Moreover, since the grasps generated in the current frame may not precisely align with the ground-truth grasp of the trajectory, which causes deviation when they are added to the trajectory for future association, we further design a motion alignment module to compensate for this. Our experiments show that MotionGrasp achieves strong performance in dynamic grasping, obtaining a 20% increase over the previous state-of-the-art method on the large-scale GraspNet-1Billion dataset. Our experiments also verify that Grasp Motion is key to the success of long-term modeling. Real-world experiments further verify the effectiveness of our method.
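To make the Grasp Motion idea concrete, the sketch below illustrates one simple geometric reading of it: if each grasp is represented as an SE(3) pose (a 4x4 homogeneous matrix), the motion between two frames is the relative transform between consecutive grasp poses, and composing a pose with a motion propagates the trajectory forward. This is only an illustrative assumption; the function names are hypothetical, and in the paper Grasp Motion is modeled and associated by a learned network, not computed in this closed form.

```python
import numpy as np

def grasp_motion(T_prev: np.ndarray, T_curr: np.ndarray) -> np.ndarray:
    """Relative SE(3) transform taking the previous grasp pose to the
    current one. Illustrative only -- a geometric stand-in for the
    learned Grasp Motion described in the abstract."""
    return np.linalg.inv(T_prev) @ T_curr

def apply_motion(T: np.ndarray, motion: np.ndarray) -> np.ndarray:
    """Propagate a grasp pose forward by a motion, e.g. to extend the
    long-term trajectory for the next association step."""
    return T @ motion

# Hypothetical example: a tracked grasp that translates 10 cm along x
# between two consecutive frames.
T0 = np.eye(4)
T1 = np.eye(4)
T1[:3, 3] = [0.1, 0.0, 0.0]

m = grasp_motion(T0, T1)          # the 10 cm translation along x
T1_pred = apply_motion(T0, m)     # recovers the current pose exactly
```

Under this reading, accumulating per-frame motions along the whole trajectory, rather than matching only the latest two frames, is what gives the tracker its long-term history.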
Journal Description
The scope of this journal is to publish peer-reviewed articles that provide a timely and concise account of innovative research ideas and application results, reporting significant theoretical findings and application case studies in areas of robotics and automation.