MI-HGNN: Morphology-Informed Heterogeneous Graph Neural Network for Legged Robot Contact Perception

Daniel Butterfield, Sandilya Sai Garimella, Nai-Jen Cheng, Lu Gan
{"title":"MI-HGNN: Morphology-Informed Heterogeneous Graph Neural Network for Legged Robot Contact Perception","authors":"Daniel Butterfield, Sandilya Sai Garimella, Nai-Jen Cheng, Lu Gan","doi":"arxiv-2409.11146","DOIUrl":null,"url":null,"abstract":"We present a Morphology-Informed Heterogeneous Graph Neural Network (MI-HGNN)\nfor learning-based contact perception. The architecture and connectivity of the\nMI-HGNN are constructed from the robot morphology, in which nodes and edges are\nrobot joints and links, respectively. By incorporating the morphology-informed\nconstraints into a neural network, we improve a learning-based approach using\nmodel-based knowledge. We apply the proposed MI-HGNN to two contact perception\nproblems, and conduct extensive experiments using both real-world and simulated\ndata collected using two quadruped robots. Our experiments demonstrate the\nsuperiority of our method in terms of effectiveness, generalization ability,\nmodel efficiency, and sample efficiency. Our MI-HGNN improved the performance\nof a state-of-the-art model that leverages robot morphological symmetry by 8.4%\nwith only 0.21% of its parameters. Although MI-HGNN is applied to contact\nperception problems for legged robots in this work, it can be seamlessly\napplied to other types of multi-body dynamical systems and has the potential to\nimprove other robot learning frameworks. Our code is made publicly available at\nhttps://github.com/lunarlab-gatech/Morphology-Informed-HGNN.","PeriodicalId":501031,"journal":{"name":"arXiv - CS - Robotics","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Robotics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.11146","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

We present a Morphology-Informed Heterogeneous Graph Neural Network (MI-HGNN) for learning-based contact perception. The architecture and connectivity of the MI-HGNN are constructed from the robot morphology, in which nodes and edges are robot joints and links, respectively. By incorporating the morphology-informed constraints into a neural network, we improve a learning-based approach using model-based knowledge. We apply the proposed MI-HGNN to two contact perception problems, and conduct extensive experiments using both real-world and simulated data collected using two quadruped robots. Our experiments demonstrate the superiority of our method in terms of effectiveness, generalization ability, model efficiency, and sample efficiency. Our MI-HGNN improved the performance of a state-of-the-art model that leverages robot morphological symmetry by 8.4% with only 0.21% of its parameters. Although MI-HGNN is applied to contact perception problems for legged robots in this work, it can be seamlessly applied to other types of multi-body dynamical systems and has the potential to improve other robot learning frameworks. Our code is made publicly available at https://github.com/lunarlab-gatech/Morphology-Informed-HGNN.
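To make the graph construction concrete, below is a minimal illustrative sketch (not the authors' implementation; see the repository linked above for the actual code) of encoding one quadruped leg as a heterogeneous graph with PyTorch Geometric, where joints become typed nodes and the links of the kinematic chain become edges. The node types, feature dimensions, and edge naming are hypothetical placeholders chosen only for illustration.

    # Illustrative sketch: one quadruped leg as a heterogeneous graph.
    # Node/edge types and feature sizes here are assumed, not taken from the paper.
    import torch
    from torch_geometric.data import HeteroData

    data = HeteroData()

    # Hypothetical node types: a floating base, rotational joints, and a foot.
    data['base'].x = torch.zeros(1, 6)    # e.g. IMU features (placeholder dims)
    data['joint'].x = torch.zeros(3, 2)   # hip, thigh, calf: angle + velocity
    data['foot'].x = torch.zeros(1, 1)    # per-foot output node (e.g. contact state)

    # Edges follow the kinematic chain (links): base -> hip -> thigh -> calf -> foot.
    data['base', 'link', 'joint'].edge_index = torch.tensor([[0], [0]])
    data['joint', 'link', 'joint'].edge_index = torch.tensor([[0, 1], [1, 2]])
    data['joint', 'link', 'foot'].edge_index = torch.tensor([[2], [0]])

    print(data)

A heterogeneous GNN built on such a graph passes messages only along these morphology-defined edges, which is how the model-based structural prior described in the abstract constrains the learned contact estimator.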