Dynamics Modeling of Industrial Robots Using Transformer Networks

Minh Trinh, Mohamed H. Behery, Mahmoud Emara, G. Lakemeyer, S. Storms, C. Brecher
DOI: 10.1109/IRC55401.2022.00035
Published in: 2022 Sixth IEEE International Conference on Robotic Computing (IRC), December 2022
Citations: 1

Abstract

Dynamics modeling of industrial robots using analytical models requires the complex identification of relevant parameters such as masses, centers of gravity, and inertia tensors, which is often prone to error. Deep learning approaches have recently been used as an alternative. Here, the challenge lies not only in learning the temporal dependencies between data points but also the dependencies between the attributes of each point. Long Short-Term Memory networks (LSTMs) have been applied to this problem as the standard architecture for time-series processing. However, LSTMs cannot fully exploit the parallelization capabilities that have emerged in the past decade, leading to a time-consuming training process. Transformer networks (transformers) have recently been introduced to overcome these long training times while still learning temporal dependencies in the data. They can further be combined with convolutional layers to learn the dependencies between attributes in multivariate time-series problems. In this paper, we show that these transformers can be used to accurately learn the dynamics model of a robot. We train and test two variants of transformers, with and without convolutional layers, and compare their results to other models such as vector autoregression, extreme gradient boosting, and LSTM networks. The transformers, especially with convolution, outperformed the other models in terms of performance and prediction accuracy. Finally, the best-performing network is evaluated with respect to its prediction plausibility using a method from explainable artificial intelligence in order to increase the user's trust.
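The "transformer with convolutional layers" variant described in the abstract can be pictured as a 1-D convolution that mixes the per-timestep attributes (e.g. joint positions, velocities, accelerations) followed by a transformer encoder that captures temporal dependencies. The sketch below is a minimal, hypothetical PyTorch illustration of that idea; the layer sizes, the 6-joint torque-regression framing, and the class name are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch: convolutional front-end + transformer encoder for
# multivariate time-series regression, in the spirit of the paper's
# "transformer with convolution" variant. All dimensions are illustrative.
import torch
import torch.nn as nn

class ConvTransformerDynamics(nn.Module):
    def __init__(self, n_features=18, d_model=64, n_heads=4,
                 n_layers=2, n_outputs=6):
        super().__init__()
        # Conv1d mixes the attributes of each timestep (dependencies
        # between attributes) before self-attention sees the sequence.
        self.conv = nn.Conv1d(n_features, d_model, kernel_size=3, padding=1)
        enc_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        # Self-attention layers model the temporal dependencies in parallel,
        # avoiding the sequential bottleneck of an LSTM.
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=n_layers)
        # Regress one output vector (e.g. a torque per joint) per window.
        self.head = nn.Linear(d_model, n_outputs)

    def forward(self, x):  # x: (batch, time, n_features)
        z = self.conv(x.transpose(1, 2)).transpose(1, 2)  # (batch, time, d_model)
        z = self.encoder(z)
        return self.head(z[:, -1])  # predict from the last timestep

model = ConvTransformerDynamics()
out = model(torch.randn(8, 50, 18))  # 8 windows of 50 timesteps, 18 attributes
print(out.shape)  # torch.Size([8, 6])
```

Dropping the `Conv1d` (replacing it with a plain `nn.Linear` projection) yields the paper's second, convolution-free variant for comparison.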