Xiaohan Chen, Yihui Li, Y. Guan, Wenjing Shi, Jiajun Wu
{"title":"人机交互任务空间和关节空间的概率融合","authors":"Xiaohan Chen, Yihui Li, Y. Guan, Wenjing Shi, Jiajun Wu","doi":"10.1109/ROBIO55434.2022.10011691","DOIUrl":null,"url":null,"abstract":"As Human-Robot Interaction (HRI) develops, robots are expected to learn more complex and demanding interaction skills. Complex HRI tasks are often embodied in robots that are jointly constrained by task space and joint space. In the field of Imitation Learning, scholars have explored the topic of joint constraints from task space and joint space, but there are few relevant studies in Human-Robot Interaction. In this paper, based on the Interaction Primitives framework (a HRI framework), we propose an interaction inference method that first generalizes the robot's movements in two spaces synchronously and then probabilistically fuses the two movements based on Bayesian estimation. This work was validated in the task that the robot follows a human handheld object, and the inference errors (RMSE and MAE) of the method are smaller in both task space and joint space than in IP using only singlespace inference.","PeriodicalId":151112,"journal":{"name":"2022 IEEE International Conference on Robotics and Biomimetics (ROBIO)","volume":"3 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Probabilistic Fusion in Task Space and Joint Space for Human-Robot Interaction\",\"authors\":\"Xiaohan Chen, Yihui Li, Y. Guan, Wenjing Shi, Jiajun Wu\",\"doi\":\"10.1109/ROBIO55434.2022.10011691\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"As Human-Robot Interaction (HRI) develops, robots are expected to learn more complex and demanding interaction skills. Complex HRI tasks are often embodied in robots that are jointly constrained by task space and joint space. 
In the field of Imitation Learning, scholars have explored the topic of joint constraints from task space and joint space, but there are few relevant studies in Human-Robot Interaction. In this paper, based on the Interaction Primitives framework (a HRI framework), we propose an interaction inference method that first generalizes the robot's movements in two spaces synchronously and then probabilistically fuses the two movements based on Bayesian estimation. This work was validated in the task that the robot follows a human handheld object, and the inference errors (RMSE and MAE) of the method are smaller in both task space and joint space than in IP using only singlespace inference.\",\"PeriodicalId\":151112,\"journal\":{\"name\":\"2022 IEEE International Conference on Robotics and Biomimetics (ROBIO)\",\"volume\":\"3 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-12-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 IEEE International Conference on Robotics and Biomimetics (ROBIO)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ROBIO55434.2022.10011691\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE International Conference on Robotics and Biomimetics (ROBIO)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ROBIO55434.2022.10011691","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Probabilistic Fusion in Task Space and Joint Space for Human-Robot Interaction
As Human-Robot Interaction (HRI) develops, robots are expected to learn more complex and demanding interaction skills. Complex HRI tasks often require the robot to satisfy constraints in task space and joint space simultaneously. In the field of Imitation Learning, scholars have explored joint constraints from task space and joint space, but there are few relevant studies in Human-Robot Interaction. In this paper, based on the Interaction Primitives (IP) framework, an HRI framework, we propose an interaction inference method that first generalizes the robot's movements in the two spaces synchronously and then probabilistically fuses the two movements via Bayesian estimation. The method was validated on a task in which the robot follows a human-held object; its inference errors (RMSE and MAE) are smaller in both task space and joint space than those of IP using single-space inference.
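The abstract states that the two generalized movements are fused probabilistically via Bayesian estimation, but does not give the formulation. A minimal illustrative sketch of the standard approach, precision-weighted fusion of two independent Gaussian estimates (the function name and the numeric values below are hypothetical, not from the paper), is:

```python
import numpy as np

def fuse_gaussians(mu_a, var_a, mu_b, var_b):
    """Precision-weighted fusion of two Gaussian estimates.

    Returns the mean and variance of the (normalized) product of the
    two Gaussian densities -- the standard Bayesian way to combine
    independent estimates of the same quantity.
    """
    prec_a = 1.0 / var_a  # precision = inverse variance
    prec_b = 1.0 / var_b
    var_f = 1.0 / (prec_a + prec_b)           # fused variance shrinks
    mu_f = var_f * (prec_a * mu_a + prec_b * mu_b)  # precision-weighted mean
    return mu_f, var_f

# Hypothetical example: a task-space-derived estimate of a joint angle
# (mean 0.50 rad, variance 0.04) vs. a joint-space-derived estimate
# (mean 0.60 rad, variance 0.01). The fused mean lies between the two,
# pulled toward the lower-variance estimate.
mu, var = fuse_gaussians(0.50, 0.04, 0.60, 0.01)
```

The fused variance is always smaller than either input variance, which is consistent with the reported result that fusing both spaces yields smaller inference errors than single-space inference.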