{"title":"TossNet:学习如何利用运动感觉实时准确测量和预测机器人投掷任意物体的情况","authors":"Lipeng Chen;Weifeng Lu;Kun Zhang;Yizheng Zhang;Longfei Zhao;Yu Zheng","doi":"10.1109/TRO.2024.3416009","DOIUrl":null,"url":null,"abstract":"Accurate measuring and modeling of dynamic robot manipulation (e.g., tossing and catching) is particularly challenging, due to the inherent nonlinearity, complexity, and uncertainty in high-speed robot motions and highly dynamic robot–object interactions happening in very short distances and times. Most studies leverage extrinsic sensors such as visual and tactile feedback toward task or object-centric modeling of manipulation dynamics, which, however, may hit bottleneck due to the significant cost and complexity, e.g., the environmental restrictions. In this work, we investigate whether using solely the on-board proprioceptive sensory modalities can effectively capture and characterize dynamic manipulation processes. In particular, we present an object-agnostic strategy to learn the robot toss dynamics of arbitrary unknown objects from the spatio-temporal variations of robot toss movements and wrist-force/torque (F/T) observations. We then propose TossNet, an end-to-end formulation that jointly measures the robot toss dynamics and predicts the resulting flying trajectories of the tossed objects. Experimental results in both simulation and real-world scenarios demonstrate that our methods can accurately model the robot toss dynamics of both seen and unseen objects, and predict their flying trajectories with superior prediction accuracy in nearly real-time. Ablative results are also presented to demonstrate the effectiveness of each proprioceptive modality and their correlations in modeling the toss dynamics. Case studies show that TossNet can be applied on various real robot platforms for challenging tossing-centric robot applications, such as blind juggling and high-precise robot pitching.","PeriodicalId":50388,"journal":{"name":"IEEE Transactions on Robotics","volume":null,"pages":null},"PeriodicalIF":9.4000,"publicationDate":"2024-06-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"TossNet: Learning to Accurately Measure and Predict Robot Throwing of Arbitrary Objects in Real Time With Proprioceptive Sensing\",\"authors\":\"Lipeng Chen;Weifeng Lu;Kun Zhang;Yizheng Zhang;Longfei Zhao;Yu Zheng\",\"doi\":\"10.1109/TRO.2024.3416009\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Accurate measuring and modeling of dynamic robot manipulation (e.g., tossing and catching) is particularly challenging, due to the inherent nonlinearity, complexity, and uncertainty in high-speed robot motions and highly dynamic robot–object interactions happening in very short distances and times. Most studies leverage extrinsic sensors such as visual and tactile feedback toward task or object-centric modeling of manipulation dynamics, which, however, may hit bottleneck due to the significant cost and complexity, e.g., the environmental restrictions. In this work, we investigate whether using solely the on-board proprioceptive sensory modalities can effectively capture and characterize dynamic manipulation processes. In particular, we present an object-agnostic strategy to learn the robot toss dynamics of arbitrary unknown objects from the spatio-temporal variations of robot toss movements and wrist-force/torque (F/T) observations. 
We then propose TossNet, an end-to-end formulation that jointly measures the robot toss dynamics and predicts the resulting flying trajectories of the tossed objects. Experimental results in both simulation and real-world scenarios demonstrate that our methods can accurately model the robot toss dynamics of both seen and unseen objects, and predict their flying trajectories with superior prediction accuracy in nearly real-time. Ablative results are also presented to demonstrate the effectiveness of each proprioceptive modality and their correlations in modeling the toss dynamics. Case studies show that TossNet can be applied on various real robot platforms for challenging tossing-centric robot applications, such as blind juggling and high-precise robot pitching.\",\"PeriodicalId\":50388,\"journal\":{\"name\":\"IEEE Transactions on Robotics\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":9.4000,\"publicationDate\":\"2024-06-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Robotics\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10561530/\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ROBOTICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Robotics","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10561530/","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ROBOTICS","Score":null,"Total":0}
TossNet: Learning to Accurately Measure and Predict Robot Throwing of Arbitrary Objects in Real Time With Proprioceptive Sensing
Accurate measurement and modeling of dynamic robot manipulation (e.g., tossing and catching) is particularly challenging due to the inherent nonlinearity, complexity, and uncertainty of high-speed robot motions and of highly dynamic robot–object interactions that occur over very short distances and times. Most studies leverage extrinsic sensing such as visual and tactile feedback for task- or object-centric modeling of manipulation dynamics, which, however, can become a bottleneck due to significant cost and complexity, e.g., environmental restrictions. In this work, we investigate whether on-board proprioceptive sensing alone can effectively capture and characterize dynamic manipulation processes. In particular, we present an object-agnostic strategy to learn the robot toss dynamics of arbitrary unknown objects from the spatio-temporal variations of robot toss movements and wrist-force/torque (F/T) observations. We then propose TossNet, an end-to-end formulation that jointly measures the robot toss dynamics and predicts the resulting flying trajectories of the tossed objects. Experimental results in both simulation and real-world scenarios demonstrate that our methods can accurately model the robot toss dynamics of both seen and unseen objects, and predict their flying trajectories with superior accuracy in near real time. Ablation results are also presented to demonstrate the effectiveness of each proprioceptive modality, as well as their correlations, in modeling the toss dynamics. Case studies show that TossNet can be deployed on various real robot platforms for challenging tossing-centric robot applications, such as blind juggling and high-precision robot pitching.
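To make the end-to-end idea from the abstract concrete, below is a minimal, hypothetical PyTorch sketch, not the authors' implementation: it assumes a GRU encoder over a short window of proprioceptive signals (joint positions and velocities plus wrist F/T readings) and a small MLP decoder that regresses the object's 3-D flight trajectory. All module names, layer sizes, the input/horizon dimensions, and the choice of a GRU are illustrative assumptions.

```python
# Hypothetical sketch of a proprioception-to-trajectory model in the
# spirit of TossNet; architecture details are assumptions, not the paper's.
import torch
import torch.nn as nn

class TossDynamicsNet(nn.Module):
    def __init__(self, n_joints=7, ft_dim=6, hidden=128, horizon=50):
        super().__init__()
        # Per-timestep proprioceptive input: joint pos + joint vel + wrist F/T.
        in_dim = 2 * n_joints + ft_dim
        self.encoder = nn.GRU(in_dim, hidden, batch_first=True)
        # Decode the full 3-D flight trajectory (horizon x 3) in one shot.
        self.decoder = nn.Sequential(
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, horizon * 3),
        )
        self.horizon = horizon

    def forward(self, proprio_seq):
        # proprio_seq: (batch, T, 2*n_joints + ft_dim)
        _, h = self.encoder(proprio_seq)        # h: (1, batch, hidden)
        traj = self.decoder(h[-1])              # (batch, horizon*3)
        return traj.view(-1, self.horizon, 3)   # (batch, horizon, 3)

# Toy usage: a batch of 4 toss recordings, 100 proprioceptive timesteps each.
model = TossDynamicsNet()
x = torch.randn(4, 100, 2 * 7 + 6)
pred = model(x)
print(pred.shape)  # torch.Size([4, 50, 3])
```

One design point this sketch illustrates: because the input is only on-board proprioception, inference needs no external cameras or object models, which is what makes near-real-time, object-agnostic prediction plausible.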
Journal Introduction:
The IEEE Transactions on Robotics (T-RO) is dedicated to publishing fundamental papers covering all facets of robotics, drawing on interdisciplinary approaches from computer science, control systems, electrical engineering, mathematics, mechanical engineering, and beyond. From industrial applications to service and personal assistants, surgical operations to space, underwater, and remote exploration, robots and intelligent machines play pivotal roles across various domains, including entertainment, safety, search and rescue, military applications, agriculture, and intelligent vehicles.
Special emphasis is placed on intelligent machines and systems designed for unstructured environments, where a significant portion of the environment remains unknown and beyond direct sensing or control.