{"title":"An Autonomous Grasping Control System Based on Visual Object Recognition and Tactile Perception","authors":"Jing An, Tong Li, Gang Chen, Q. Jia, Junpei Yu","doi":"10.1109/ICoSR57188.2022.00023","DOIUrl":null,"url":null,"abstract":"Despite the impressive progress of vision guidance in robot grasping, robots are not proficient in fine manipulation tasks, especially applying algorithms to real robot grasping systems where a single vision data does not solve the problem of contact perception of the target surface in fine grasping. So we propose a real-time robot grasping method based on position and contact force control, where vision is used for identify and locate the grasping target, and haptics is used to perform stable grasping. Using ZED camera and robotiq three-finger hand claw equipped with uskin haptic sensor, we trained the grasping targets with cylindrical or spherical shapes such as oranges and water bottles in CoCo dataset, and the results show that the grasping position error of the robot grasping system does not exceed 0.04m, the relative error not more than 6%, and the PID-based force control makes the grasping more stable, which proves the effectiveness of the grasping control method with visual-touch information integration.","PeriodicalId":234590,"journal":{"name":"2022 International Conference on Service Robotics (ICoSR)","volume":"102 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 International Conference on Service Robotics (ICoSR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICoSR57188.2022.00023","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Despite the impressive progress of vision-guided robot grasping, robots remain poor at fine manipulation tasks. In particular, when algorithms are deployed on real grasping systems, vision data alone cannot solve the problem of perceiving contact with the target surface during fine grasping. We therefore propose a real-time robot grasping method based on position and contact-force control, in which vision is used to identify and locate the grasping target and tactile sensing is used to perform a stable grasp. Using a ZED camera and a Robotiq three-finger gripper equipped with uSkin tactile sensors, we trained on grasping targets with cylindrical or spherical shapes, such as oranges and water bottles, from the COCO dataset. The results show that the grasping position error of the system does not exceed 0.04 m, with a relative error of no more than 6%, and that PID-based force control makes the grasp more stable, demonstrating the effectiveness of the proposed grasping control method with visual-tactile information integration.
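
The abstract does not give implementation details, so as a rough illustration of the PID-based contact-force control it describes, the following is a minimal sketch. All names, gains, and the sensor/gripper interfaces here are hypothetical assumptions for illustration, not the authors' code.

```python
# Minimal sketch of a PID contact-force regulator for a gripper finger.
# The interfaces (read_contact_force, set_finger_increment) and the gain
# values are hypothetical; the paper does not publish its controller.

import time


class PIDForceController:
    def __init__(self, kp, ki, kd, target_force_n):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.target = target_force_n  # desired contact force in newtons
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measured_force_n, dt):
        """Return a finger-displacement command from the force error."""
        error = self.target - measured_force_n
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def grasp_loop(read_contact_force, set_finger_increment, hz=100):
    """Close the fingers until the tactile sensor reports the target force.

    read_contact_force: callable returning the measured normal force (N),
        e.g. aggregated from a uSkin taxel array (hypothetical interface).
    set_finger_increment: callable commanding a small finger displacement.
    """
    pid = PIDForceController(kp=0.8, ki=0.1, kd=0.05, target_force_n=2.0)
    dt = 1.0 / hz
    while True:
        force = read_contact_force()
        step = pid.update(force, dt)
        set_finger_increment(step * dt)  # scale output to a per-cycle move
        time.sleep(dt)
```

In a setup like the one described, the vision stage would supply the pre-grasp pose and a loop of this kind would then regulate finger closure against the tactile reading, holding the contact force near a setpoint rather than squeezing to a fixed position.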