Dynamics Modeling of Industrial Robots Using Transformer Networks

Minh Trinh, Mohamed H. Behery, Mahmoud Emara, G. Lakemeyer, S. Storms, C. Brecher

2022 Sixth IEEE International Conference on Robotic Computing (IRC), December 2022. DOI: 10.1109/IRC55401.2022.00035
Dynamics modeling of industrial robots with analytical models requires identifying parameters such as masses, centers of gravity, and inertia tensors, a complex and error-prone process. Deep learning approaches have recently been used as an alternative. Here, the challenge lies not only in learning the temporal dependencies between data points but also the dependencies between the attributes of each point. Long Short-Term Memory networks (LSTMs) have been applied to this problem as the standard architecture for time series processing. However, LSTMs cannot fully exploit the parallelization capabilities of modern hardware that have emerged in the past decade, leading to a time-consuming training process. Transformer networks (transformers) have recently been introduced to overcome the long training times while still learning temporal dependencies in the data. They can be further combined with convolutional layers to learn the dependencies between attributes in multivariate time series problems. In this paper, we show that these transformers can be used to accurately learn the dynamics model of a robot. We train and test two variants of transformers, with and without convolutional layers, and compare their results to other models such as vector autoregression, extreme gradient boosting, and LSTM networks. The transformers, especially the variant with convolution, outperformed the other models in terms of performance and prediction accuracy. Finally, the best-performing network is evaluated with respect to its prediction plausibility using a method from explainable artificial intelligence, in order to increase the user's trust.
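The core mechanism that lets transformers learn temporal dependencies in parallel, rather than sequentially as LSTMs do, is scaled dot-product self-attention. Below is a minimal NumPy sketch of that operation applied to an illustrative window of robot-state samples; the dimensions and the (position, velocity, acceleration) interpretation are assumptions for illustration, not the paper's actual setup.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V — the core transformer operation.

    Every time step attends to every other step at once, which is why
    the whole window can be processed in parallel on modern hardware.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (T, T) pairwise similarities
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # rows sum to 1
    return weights @ V, weights

# Hypothetical window: T time steps, each with one joint's
# position, velocity, and acceleration (d = 3 attributes).
rng = np.random.default_rng(0)
T, d = 8, 3
X = rng.standard_normal((T, d))

# Self-attention: the sequence attends to itself (Q = K = V = X).
out, attn = scaled_dot_product_attention(X, X, X)
```

In a full transformer, learned projection matrices would map the input to Q, K, and V, and convolutional layers (as in the paper's second variant) could be stacked in front to mix the per-step attributes before attention is applied.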