Liming Zhang, Xiaohua Wang, Haoyi Wang, Pengfei Li
{"title":"基于视觉的人体上肢缝纫机器人系统","authors":"Liming Zhang, Xiaohua Wang, Haoyi Wang, Pengfei Li","doi":"10.5194/ms-14-347-2023","DOIUrl":null,"url":null,"abstract":"Abstract. In human–robot collaborative sewing, the robot follows the sewing action of a worker to complete the corresponding sewing action, which can enhance production efficiency. When the robot follows the sewing action of the worker through interactive information, it still faces the problem of low accuracy. In order to improve the accuracy of the robot following the sewing action, a human upper-limb sewing-action-following system based on visual information is designed in this paper. The system is composed of an improved OpenPose model, Gaussian mixture model (GMM), and Gaussian mixture regression (GMR). In the system, an improved OpenPose model is used to identify the sewing action of the human upper limb, and the label fusion method is used to correct the joint point labels when the upper limb is covered by fabric. Then the GMM is used to encode each motion element and time to obtain the regression work of the Gaussian component. GMR is adopted to predict connections between moving elements and generate sewing motion trajectories. Finally, the experimental verification and simulation are carried out in the experimental platform and simulation environment of the collaborative robot. The experimental results show that the tracking error angle can be controlled within 0.04 rad in the first 2 s of robot movement. Therefore, it can be considered that the sewing-action-following system can realize higher precision and promote the development of human–robot collaboration technology to a certain extent.\n","PeriodicalId":18413,"journal":{"name":"Mechanical Sciences","volume":" ","pages":""},"PeriodicalIF":1.0000,"publicationDate":"2023-08-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A vision-based robotic system following the human upper-limb sewing action\",\"authors\":\"Liming Zhang, Xiaohua Wang, Haoyi Wang, Pengfei Li\",\"doi\":\"10.5194/ms-14-347-2023\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Abstract. In human–robot collaborative sewing, the robot follows the sewing action of a worker to complete the corresponding sewing action, which can enhance production efficiency. When the robot follows the sewing action of the worker through interactive information, it still faces the problem of low accuracy. In order to improve the accuracy of the robot following the sewing action, a human upper-limb sewing-action-following system based on visual information is designed in this paper. The system is composed of an improved OpenPose model, Gaussian mixture model (GMM), and Gaussian mixture regression (GMR). In the system, an improved OpenPose model is used to identify the sewing action of the human upper limb, and the label fusion method is used to correct the joint point labels when the upper limb is covered by fabric. Then the GMM is used to encode each motion element and time to obtain the regression work of the Gaussian component. GMR is adopted to predict connections between moving elements and generate sewing motion trajectories. Finally, the experimental verification and simulation are carried out in the experimental platform and simulation environment of the collaborative robot. The experimental results show that the tracking error angle can be controlled within 0.04 rad in the first 2 s of robot movement. 
Therefore, it can be considered that the sewing-action-following system can realize higher precision and promote the development of human–robot collaboration technology to a certain extent.\\n\",\"PeriodicalId\":18413,\"journal\":{\"name\":\"Mechanical Sciences\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":1.0000,\"publicationDate\":\"2023-08-31\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Mechanical Sciences\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.5194/ms-14-347-2023\",\"RegionNum\":4,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"ENGINEERING, MECHANICAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Mechanical Sciences","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.5194/ms-14-347-2023","RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"ENGINEERING, MECHANICAL","Score":null,"Total":0}
A vision-based robotic system following the human upper-limb sewing action
Abstract. In human–robot collaborative sewing, the robot follows a worker's sewing actions and completes the corresponding sewing operations, which can improve production efficiency. However, when the robot follows the worker's sewing action using interactive information, its following accuracy remains low. To improve this accuracy, a human upper-limb sewing-action-following system based on visual information is designed in this paper. The system combines an improved OpenPose model, a Gaussian mixture model (GMM), and Gaussian mixture regression (GMR). The improved OpenPose model identifies the sewing action of the human upper limb, and a label fusion method corrects the joint-point labels when the upper limb is occluded by fabric. The GMM then encodes each motion element together with time to obtain the regression function of the Gaussian components, and GMR predicts the connections between motion elements and generates the sewing motion trajectories. Finally, experimental verification and simulation are carried out on the collaborative-robot experimental platform and in a simulation environment. The results show that the tracking error angle can be kept within 0.04 rad during the first 2 s of robot motion. The sewing-action-following system can therefore achieve higher following precision and, to a certain extent, promote the development of human–robot collaboration technology.
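The GMM/GMR stage of the pipeline can be illustrated with a short sketch. The code below is not the authors' implementation; it is a minimal example, assuming scikit-learn, NumPy, and a single joint angle sampled over time, of how demonstration data can be encoded by a Gaussian mixture model over (time, angle) and how Gaussian mixture regression conditions on time to regenerate a smooth reference trajectory for the robot to follow. All variable names and the demonstration signal are illustrative.

```python
# Minimal GMM/GMR sketch (assumed setup, not the paper's code):
# encode (time, joint angle) demonstration samples with a GMM,
# then condition on time to predict the expected joint angle.
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical demonstration: one joint angle over 2 s with noise.
t = np.linspace(0.0, 2.0, 200)
q = 0.5 * np.sin(np.pi * t) + 0.02 * np.random.randn(t.size)
data = np.column_stack([t, q])  # rows are (time, angle) samples

# Encode the joint (t, q) distribution with K Gaussian components.
K = 5
gmm = GaussianMixture(n_components=K, covariance_type="full").fit(data)

def gmr(t_query):
    """Gaussian mixture regression: E[q | t] under the fitted GMM."""
    t_query = np.atleast_1d(t_query)
    q_hat = np.zeros_like(t_query, dtype=float)
    for i, tq in enumerate(t_query):
        # Responsibility of each component for this time instant,
        # from the marginal density of t under that component.
        var_t = gmm.covariances_[:, 0, 0]
        w = gmm.weights_ * np.exp(
            -0.5 * (tq - gmm.means_[:, 0]) ** 2 / var_t
        ) / np.sqrt(2.0 * np.pi * var_t)
        w /= w.sum()
        # Conditional mean of q given t for each component.
        cond = (
            gmm.means_[:, 1]
            + gmm.covariances_[:, 1, 0] / var_t * (tq - gmm.means_[:, 0])
        )
        q_hat[i] = np.dot(w, cond)
    return q_hat

# Regenerate a smooth following trajectory on a new time grid.
t_ref = np.linspace(0.0, 2.0, 100)
q_ref = gmr(t_ref)
```

In practice each upper-limb joint extracted from the pose-estimation step would be encoded in the same way, so the regression output can serve directly as the joint-space reference that the following controller tracks.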
About the journal:
The journal Mechanical Sciences (MS) is an international forum for the dissemination of original contributions in the field of theoretical and applied mechanics. Its main ambition is to provide a platform for young researchers to build up a portfolio of high-quality peer-reviewed journal articles. To this end we employ an open-access publication model with moderate page charges, aiming for fast publication and good citation opportunities. A large board of reputable editors makes this possible. The journal also publishes special issues dealing with the current state of the art and future research directions in mechanical sciences. While in-depth research articles are preferred, review articles and short communications will also be considered. We intend to provide a means of publication that complements established journals in the field.