TossNet: Learning to Accurately Measure and Predict Robot Throwing of Arbitrary Objects in Real Time With Proprioceptive Sensing

Impact Factor 9.4 · Q1 (Robotics) · Tier 1 (Computer Science)
Lipeng Chen;Weifeng Lu;Kun Zhang;Yizheng Zhang;Longfei Zhao;Yu Zheng
{"title":"TossNet:学习如何利用运动感觉实时准确测量和预测机器人投掷任意物体的情况","authors":"Lipeng Chen;Weifeng Lu;Kun Zhang;Yizheng Zhang;Longfei Zhao;Yu Zheng","doi":"10.1109/TRO.2024.3416009","DOIUrl":null,"url":null,"abstract":"Accurate measuring and modeling of dynamic robot manipulation (e.g., tossing and catching) is particularly challenging, due to the inherent nonlinearity, complexity, and uncertainty in high-speed robot motions and highly dynamic robot–object interactions happening in very short distances and times. Most studies leverage extrinsic sensors such as visual and tactile feedback toward task or object-centric modeling of manipulation dynamics, which, however, may hit bottleneck due to the significant cost and complexity, e.g., the environmental restrictions. In this work, we investigate whether using solely the on-board proprioceptive sensory modalities can effectively capture and characterize dynamic manipulation processes. In particular, we present an object-agnostic strategy to learn the robot toss dynamics of arbitrary unknown objects from the spatio-temporal variations of robot toss movements and wrist-force/torque (F/T) observations. We then propose TossNet, an end-to-end formulation that jointly measures the robot toss dynamics and predicts the resulting flying trajectories of the tossed objects. Experimental results in both simulation and real-world scenarios demonstrate that our methods can accurately model the robot toss dynamics of both seen and unseen objects, and predict their flying trajectories with superior prediction accuracy in nearly real-time. Ablative results are also presented to demonstrate the effectiveness of each proprioceptive modality and their correlations in modeling the toss dynamics. Case studies show that TossNet can be applied on various real robot platforms for challenging tossing-centric robot applications, such as blind juggling and high-precise robot pitching.","PeriodicalId":50388,"journal":{"name":"IEEE Transactions on Robotics","volume":null,"pages":null},"PeriodicalIF":9.4000,"publicationDate":"2024-06-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"TossNet: Learning to Accurately Measure and Predict Robot Throwing of Arbitrary Objects in Real Time With Proprioceptive Sensing\",\"authors\":\"Lipeng Chen;Weifeng Lu;Kun Zhang;Yizheng Zhang;Longfei Zhao;Yu Zheng\",\"doi\":\"10.1109/TRO.2024.3416009\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Accurate measuring and modeling of dynamic robot manipulation (e.g., tossing and catching) is particularly challenging, due to the inherent nonlinearity, complexity, and uncertainty in high-speed robot motions and highly dynamic robot–object interactions happening in very short distances and times. Most studies leverage extrinsic sensors such as visual and tactile feedback toward task or object-centric modeling of manipulation dynamics, which, however, may hit bottleneck due to the significant cost and complexity, e.g., the environmental restrictions. In this work, we investigate whether using solely the on-board proprioceptive sensory modalities can effectively capture and characterize dynamic manipulation processes. In particular, we present an object-agnostic strategy to learn the robot toss dynamics of arbitrary unknown objects from the spatio-temporal variations of robot toss movements and wrist-force/torque (F/T) observations. 
We then propose TossNet, an end-to-end formulation that jointly measures the robot toss dynamics and predicts the resulting flying trajectories of the tossed objects. Experimental results in both simulation and real-world scenarios demonstrate that our methods can accurately model the robot toss dynamics of both seen and unseen objects, and predict their flying trajectories with superior prediction accuracy in nearly real-time. Ablative results are also presented to demonstrate the effectiveness of each proprioceptive modality and their correlations in modeling the toss dynamics. Case studies show that TossNet can be applied on various real robot platforms for challenging tossing-centric robot applications, such as blind juggling and high-precise robot pitching.\",\"PeriodicalId\":50388,\"journal\":{\"name\":\"IEEE Transactions on Robotics\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":9.4000,\"publicationDate\":\"2024-06-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Robotics\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10561530/\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ROBOTICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Robotics","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10561530/","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ROBOTICS","Score":null,"Total":0}
Citations: 0

Abstract

Accurately measuring and modeling dynamic robot manipulation (e.g., tossing and catching) is particularly challenging due to the inherent nonlinearity, complexity, and uncertainty of high-speed robot motions and the highly dynamic robot–object interactions that occur over very short distances and times. Most studies leverage extrinsic sensors such as visual and tactile feedback for task- or object-centric modeling of manipulation dynamics, which, however, may hit a bottleneck due to significant cost and complexity, e.g., environmental restrictions. In this work, we investigate whether onboard proprioceptive sensory modalities alone can effectively capture and characterize dynamic manipulation processes. In particular, we present an object-agnostic strategy to learn the robot toss dynamics of arbitrary unknown objects from the spatio-temporal variations of robot toss movements and wrist force/torque (F/T) observations. We then propose TossNet, an end-to-end formulation that jointly measures the robot toss dynamics and predicts the resulting flight trajectories of the tossed objects. Experimental results in both simulated and real-world scenarios demonstrate that our methods accurately model the robot toss dynamics of both seen and unseen objects, and predict their flight trajectories with superior accuracy in near real time. Ablation results are also presented to demonstrate the effectiveness of each proprioceptive modality, and their correlations, in modeling the toss dynamics. Case studies show that TossNet can be deployed on various real robot platforms for challenging tossing-centric applications such as blind juggling and high-precision robot pitching.
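To make the described pipeline concrete, here is a minimal, hypothetical sketch of how proprioceptive observations (joint states plus wrist F/T readings) recorded during a toss could be mapped to a release state and then rolled out into a predicted flight trajectory. The module name `TossPredictor`, the GRU encoders, all dimensions, and the drag-free ballistic rollout are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch (not the authors' architecture): a sequence model that maps
# proprioceptive observations (joint states + wrist F/T) recorded during a toss
# to a predicted release state, then rolls out a simple ballistic trajectory.
# All module names, dimensions, and the drag-free rollout are assumptions.
import torch
import torch.nn as nn


class TossPredictor(nn.Module):
    def __init__(self, joint_dim: int = 14, ft_dim: int = 6, hidden: int = 128):
        super().__init__()
        # One GRU encoder per proprioceptive modality (joint kinematics, wrist F/T).
        self.joint_enc = nn.GRU(joint_dim, hidden, batch_first=True)
        self.ft_enc = nn.GRU(ft_dim, hidden, batch_first=True)
        # Regress the object's release position (3) and release velocity (3).
        self.head = nn.Linear(2 * hidden, 6)

    def forward(self, joints: torch.Tensor, ft: torch.Tensor) -> torch.Tensor:
        # joints: (B, T, joint_dim), ft: (B, T, ft_dim)
        _, hj = self.joint_enc(joints)
        _, hf = self.ft_enc(ft)
        feat = torch.cat([hj[-1], hf[-1]], dim=-1)   # (B, 2 * hidden)
        return self.head(feat)                       # (B, 6): [p_release, v_release]


def ballistic_rollout(release: torch.Tensor, horizon: int = 50, dt: float = 0.01):
    """Roll out a drag-free ballistic trajectory from a predicted release state."""
    p, v = release[:, :3], release[:, 3:]
    g = torch.tensor([0.0, 0.0, -9.81], device=release.device)
    traj = []
    for k in range(horizon):
        t = k * dt
        traj.append(p + v * t + 0.5 * g * t ** 2)
    return torch.stack(traj, dim=1)                  # (B, horizon, 3)


if __name__ == "__main__":
    model = TossPredictor()
    joints = torch.randn(2, 100, 14)   # 100 timesteps of joint positions + velocities
    ft = torch.randn(2, 100, 6)        # synchronized wrist force/torque readings
    release = model(joints, ft)
    print(ballistic_rollout(release).shape)   # torch.Size([2, 50, 3])
```

In this sketch the learned component only regresses a release state and the flight path follows from simple projectile physics; the actual TossNet formulation is described as an end-to-end model and may differ substantially in how it predicts trajectories.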
Source journal: IEEE Transactions on Robotics (Engineering & Technology – Robotics)
CiteScore: 14.90
Self-citation rate: 5.10%
Publication volume: 259 articles/year
Review time: 6.0 months
Journal description: The IEEE Transactions on Robotics (T-RO) is dedicated to publishing fundamental papers covering all facets of robotics, drawing on interdisciplinary approaches from computer science, control systems, electrical engineering, mathematics, mechanical engineering, and beyond. From industrial applications to service and personal assistants, surgical operations to space, underwater, and remote exploration, robots and intelligent machines play pivotal roles across various domains, including entertainment, safety, search and rescue, military applications, agriculture, and intelligent vehicles. Special emphasis is placed on intelligent machines and systems designed for unstructured environments, where a significant portion of the environment remains unknown and beyond direct sensing or control.