Probabilistic Fusion in Task Space and Joint Space for Human-Robot Interaction
Xiaohan Chen, Yihui Li, Y. Guan, Wenjing Shi, Jiajun Wu
2022 IEEE International Conference on Robotics and Biomimetics (ROBIO), 5 December 2022
DOI: 10.1109/ROBIO55434.2022.10011691
Citations: 1
Abstract
As Human-Robot Interaction (HRI) develops, robots are expected to learn more complex and demanding interaction skills. Complex HRI tasks often require robots to satisfy constraints in both task space and joint space. In the field of Imitation Learning, scholars have explored joint constraints arising from task space and joint space, but few such studies exist in Human-Robot Interaction. In this paper, based on the Interaction Primitives (IP) framework, an HRI framework, we propose an interaction inference method that first generalizes the robot's movements in the two spaces synchronously and then probabilistically fuses the two movements using Bayesian estimation. The method was validated on a task in which the robot follows a human-handheld object; its inference errors (RMSE and MAE) are smaller in both task space and joint space than those of IP using only single-space inference.
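The abstract does not give the paper's exact fusion equations, but Bayesian fusion of two Gaussian movement estimates is commonly realized as a precision-weighted product of Gaussians. The sketch below illustrates that general technique only; the joint-angle values, covariances, and the `fuse_gaussians` helper are hypothetical and not taken from the paper.

```python
import numpy as np

def fuse_gaussians(mu_a, cov_a, mu_b, cov_b):
    """Precision-weighted fusion of two Gaussian estimates.

    A standard Bayesian product-of-Gaussians step; an illustrative
    sketch, not the paper's actual fusion rule.
    """
    prec_a = np.linalg.inv(cov_a)  # precision = inverse covariance
    prec_b = np.linalg.inv(cov_b)
    cov_f = np.linalg.inv(prec_a + prec_b)          # fused covariance
    mu_f = cov_f @ (prec_a @ mu_a + prec_b @ mu_b)  # precision-weighted mean
    return mu_f, cov_f

# Hypothetical example: fuse a joint-configuration estimate derived from
# task-space generalization with one generalized directly in joint space.
mu_task = np.array([0.10, 0.50])   # joint angles implied by task space (made up)
cov_task = np.diag([0.04, 0.01])   # more uncertain about the first joint
mu_joint = np.array([0.20, 0.40])  # joint-space estimate (made up)
cov_joint = np.diag([0.01, 0.04])  # more uncertain about the second joint

mu, cov = fuse_gaussians(mu_task, cov_task, mu_joint, cov_joint)
# Each fused coordinate is pulled toward the more confident source.
```

The fused estimate trusts whichever space is more certain about each coordinate, which matches the intuition behind combining task-space and joint-space inference.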