Robot Grasp Planning from Human Demonstration
Kaimeng Wang, Yongxiang Fan, I. Sakuma
2023 15th International Conference on Computer and Automation Engineering (ICCAE), published 2023-03-03
DOI: 10.1109/ICCAE56788.2023.10111294
Abstract
Robot grasping is an essential capability for meeting the requirements of complex industrial tasks, and numerous studies have addressed it to serve various practical needs. However, generating a stable grasp remains challenging due to object geometry constraints and the varied purposes of tasks. In this work, we propose a novel Programming-by-Demonstration-based grasp planning framework that extracts human grasp skills (contact region and approach direction) from a single human demonstration and then formulates an optimization problem to generate a stable grasp using the extracted skills. Instead of learning implicit synergies from human demonstration or mapping the dissimilar kinematics between the human hand and the robot gripper, the proposed approach learns an intuitive human intention comprising the potential contact region and the grasping approach direction. Furthermore, the introduced optimization formulation searches for the optimal grasp by minimizing the surface-fitting error between the demonstrated contact regions on the object and the gripper finger surface, while penalizing misalignment between the demonstrated approach direction and the approach direction of the gripper. A series of experiments in both simulation and the real world verifies the effectiveness of the proposed algorithm.
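The objective described above can be sketched as a single cost over a candidate gripper pose: a surface-fitting term (distance from the demonstrated contact points to the transformed finger surface) plus a misalignment penalty between the two approach directions. The following is a minimal illustrative sketch, not the paper's actual formulation; all function and parameter names (`grasp_cost`, `lam`, the brute-force nearest-neighbor step) are hypothetical assumptions for exposition.

```python
import numpy as np

def grasp_cost(R, t, contact_pts, finger_pts, approach_demo, approach_gripper, lam=1.0):
    """Illustrative grasp cost (hypothetical names, not the paper's code).

    R, t            : candidate gripper rotation (3x3) and translation (3,)
    contact_pts     : (N, 3) demonstrated contact-region points on the object
    finger_pts      : (M, 3) sampled gripper finger-surface points (gripper frame)
    approach_demo   : (3,) demonstrated approach direction (object frame)
    approach_gripper: (3,) gripper approach axis (gripper frame)
    lam             : weight on the misalignment penalty
    """
    # Transform the finger-surface samples into the object frame.
    finger_world = finger_pts @ R.T + t
    # Surface-fitting error: mean nearest-neighbor distance from each
    # demonstrated contact point to the transformed finger surface
    # (brute-force here; a k-d tree would be used at scale).
    d = np.linalg.norm(contact_pts[:, None, :] - finger_world[None, :, :], axis=2)
    fit_err = d.min(axis=1).mean()
    # Misalignment penalty: 1 - cos(angle) between the demonstrated approach
    # direction and the gripper approach axis after rotation (0 when aligned).
    a_g = R @ approach_gripper
    cos_angle = float(approach_demo @ a_g) / (
        np.linalg.norm(approach_demo) * np.linalg.norm(a_g))
    misalign = 1.0 - cos_angle
    return fit_err + lam * misalign
```

An outer optimizer (e.g. gradient-free search over SE(3) poses) would then minimize this cost; when the finger surface coincides with the demonstrated contact region and the approach axes align, the cost drops to zero.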