{"title":"仿人机器人探测和攀爬梯子的仿真","authors":"Prashanta Gyawali, J. McGough","doi":"10.1109/EIT.2013.6632710","DOIUrl":null,"url":null,"abstract":"The most recent DARPA challenge presents an industrial accident scenerio for a humanoid robot to traverse and then perform human oriented tasks. In the fifth stage of the challenge, the robot must climb a wall mounted metal rung ladder. To accomplish this task, the robot must first recognize and localize the ladder prior to grasping and climbing. This paper presents the localization of the rungs using point cloud data from a simulated Microsoft Kinect sensor. It also presents grasping and climbing manuveur using PR2 Robot. The basic approach is to first segment out the background planes. We apply a voxel grid filter to make the computation faster. Then using the RANSAC algorithm, lines that represent the legs and the interior rung midline are extracted. Vertical lines are thrown away and only the lines that represent the rungs are retained. The center of the computed line is our estimated location for the rung centroid. We then can use the centroid information for the PR2 grasping.","PeriodicalId":201202,"journal":{"name":"IEEE International Conference on Electro-Information Technology , EIT 2013","volume":"30 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-05-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":"{\"title\":\"Simulation of detecting and climbing a ladder for a humanoid robot\",\"authors\":\"Prashanta Gyawali, J. McGough\",\"doi\":\"10.1109/EIT.2013.6632710\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The most recent DARPA challenge presents an industrial accident scenerio for a humanoid robot to traverse and then perform human oriented tasks. In the fifth stage of the challenge, the robot must climb a wall mounted metal rung ladder. To accomplish this task, the robot must first recognize and localize the ladder prior to grasping and climbing. This paper presents the localization of the rungs using point cloud data from a simulated Microsoft Kinect sensor. It also presents grasping and climbing manuveur using PR2 Robot. The basic approach is to first segment out the background planes. We apply a voxel grid filter to make the computation faster. Then using the RANSAC algorithm, lines that represent the legs and the interior rung midline are extracted. Vertical lines are thrown away and only the lines that represent the rungs are retained. The center of the computed line is our estimated location for the rung centroid. 
We then can use the centroid information for the PR2 grasping.\",\"PeriodicalId\":201202,\"journal\":{\"name\":\"IEEE International Conference on Electro-Information Technology , EIT 2013\",\"volume\":\"30 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2013-05-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"8\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE International Conference on Electro-Information Technology , EIT 2013\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/EIT.2013.6632710\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE International Conference on Electro-Information Technology , EIT 2013","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/EIT.2013.6632710","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Simulation of detecting and climbing a ladder for a humanoid robot
The most recent DARPA challenge presents an industrial accident scenario for a humanoid robot to traverse and then perform human-oriented tasks. In the fifth stage of the challenge, the robot must climb a wall-mounted metal rung ladder. To accomplish this task, the robot must first recognize and localize the ladder prior to grasping and climbing. This paper presents the localization of the rungs using point cloud data from a simulated Microsoft Kinect sensor, as well as grasping and climbing maneuvers using the PR2 robot. The basic approach is to first segment out the background planes. We apply a voxel grid filter to speed up the computation. Then, using the RANSAC algorithm, lines that represent the legs and the interior rung midlines are extracted. Vertical lines are discarded and only the lines that represent the rungs are retained. The center of each computed line is our estimated location for the rung centroid. We can then use the centroid information for PR2 grasping.
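The pipeline described in the abstract (voxel-grid downsampling, RANSAC plane removal, RANSAC line fitting, rejection of vertical lines, and taking the center of each remaining line as the rung centroid) maps naturally onto the Point Cloud Library. The sketch below is a minimal illustration of that flow under stated assumptions, not the authors' implementation; the leaf size, distance thresholds, inlier counts, iteration cap, up-axis convention, and the detectRungs function name are all assumptions made for the example.

// Hypothetical rung-detection sketch using PCL. Assumes the input cloud has
// already been transformed into a frame where z points up (the raw Kinect
// optical frame would need a different verticality test).
#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
#include <pcl/PointIndices.h>
#include <pcl/ModelCoefficients.h>
#include <pcl/filters/voxel_grid.h>
#include <pcl/filters/extract_indices.h>
#include <pcl/segmentation/sac_segmentation.h>
#include <pcl/common/centroid.h>
#include <Eigen/Core>
#include <vector>
#include <cmath>

using PointT = pcl::PointXYZ;
using CloudT = pcl::PointCloud<PointT>;

// Returns estimated rung centroids from a Kinect-style depth cloud.
std::vector<Eigen::Vector4f> detectRungs(const CloudT::Ptr& raw)
{
  // 1. Voxel-grid filter to thin the cloud and speed up the RANSAC passes.
  CloudT::Ptr cloud(new CloudT);
  pcl::VoxelGrid<PointT> vg;
  vg.setInputCloud(raw);
  vg.setLeafSize(0.01f, 0.01f, 0.01f);   // 1 cm leaves (illustrative value)
  vg.filter(*cloud);

  pcl::SACSegmentation<PointT> seg;
  seg.setOptimizeCoefficients(true);
  seg.setMethodType(pcl::SAC_RANSAC);
  seg.setDistanceThreshold(0.02);        // 2 cm inlier band (illustrative value)

  pcl::PointIndices::Ptr inliers(new pcl::PointIndices);
  pcl::ModelCoefficients::Ptr coeffs(new pcl::ModelCoefficients);
  pcl::ExtractIndices<PointT> extract;

  // 2. Segment out the dominant background plane (e.g. the wall behind the ladder).
  seg.setModelType(pcl::SACMODEL_PLANE);
  seg.setInputCloud(cloud);
  seg.segment(*inliers, *coeffs);
  if (!inliers->indices.empty()) {
    CloudT::Ptr no_plane(new CloudT);
    extract.setInputCloud(cloud);
    extract.setIndices(inliers);
    extract.setNegative(true);           // keep everything that is NOT the plane
    extract.filter(*no_plane);
    cloud.swap(no_plane);
  }

  // 3. Repeatedly fit lines; keep near-horizontal ones (rungs), discard vertical ones (legs).
  std::vector<Eigen::Vector4f> rung_centroids;
  seg.setModelType(pcl::SACMODEL_LINE);
  for (int i = 0; i < 10 && cloud->size() > 50; ++i) {   // caps are assumptions
    seg.setInputCloud(cloud);
    seg.segment(*inliers, *coeffs);
    if (inliers->indices.size() < 30) break;

    // Line model: values[0..2] = point on the line, values[3..5] = direction.
    Eigen::Vector3f dir(coeffs->values[3], coeffs->values[4], coeffs->values[5]);
    dir.normalize();
    const bool vertical = std::fabs(dir.z()) > 0.8f;  // z-up assumption

    if (!vertical) {
      // 4. Rung centroid = centroid of the line's inlier points (center of the segment).
      CloudT::Ptr line_pts(new CloudT);
      extract.setInputCloud(cloud);
      extract.setIndices(inliers);
      extract.setNegative(false);
      extract.filter(*line_pts);
      Eigen::Vector4f c;
      pcl::compute3DCentroid(*line_pts, c);
      rung_centroids.push_back(c);       // these would feed the grasp planner
    }

    // Remove this line's inliers before searching for the next line.
    CloudT::Ptr remaining(new CloudT);
    extract.setInputCloud(cloud);
    extract.setIndices(inliers);
    extract.setNegative(true);
    extract.filter(*remaining);
    cloud.swap(remaining);
  }
  return rung_centroids;
}

In a simulated setup, the returned centroids (expressed in the sensor frame) would be transformed into the robot's base frame and passed to the grasp planner one rung at a time; the specific thresholds above would need tuning to the simulated Kinect's noise and the ladder geometry.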