{"title":"基于视觉手部运动预测的人机交互无碰撞轨迹规划","authors":"Yiwei Wang, Xin Ye, Yezhou Yang, Wenlong Zhang","doi":"10.1109/HUMANOIDS.2017.8246890","DOIUrl":null,"url":null,"abstract":"We present a framework from vision based hand movement prediction in a real-world human-robot collaborative scenario for safety guarantee. We first propose a perception submodule that takes in visual data solely and predicts human collaborator's hand movement. Then a robot trajectory adaptive planning submodule is developed that takes the noisy movement prediction signal into consideration for optimization. To validate the proposed systems, we first collect a new human manipulation dataset that can supplement the previous publicly available dataset with motion capture data to serve as the ground truth of hand location. We then integrate the algorithm with a six degree-of-freedom robot manipulator that can collaborate with human workers on a set of trained manipulation actions, and it is shown that such a robot system outperforms the one without movement prediction in terms of collision avoidance. We verify the effectiveness of the proposed motion prediction and robot trajectory planning approaches in both simulated and physical experiments. 
To the best of the authors' knowledge, this is the first time a deep-model-based movement prediction system has been used and proven effective in a human-robot collaboration scenario for enhanced safety.

Published in: 2017 IEEE-RAS 17th International Conference on Humanoid Robotics (Humanoids), November 2017
Citation count: 21
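The abstract describes a planning submodule that folds a noisy hand-movement prediction into trajectory optimization for collision avoidance. A minimal, hypothetical sketch of that idea follows; the function names, the quadratic cost form, and the noise-inflated safety margin are all illustrative assumptions, not the paper's actual formulation:

```python
import math

def collision_cost(waypoints, hand_pred, sigma, safe_dist=0.2):
    """Sum of penalties for waypoints inside an inflated safety zone.

    waypoints: list of (x, y, z) robot end-effector positions (meters)
    hand_pred: predicted (x, y, z) hand position
    sigma:     std. dev. of the prediction error -- noisier predictions
               widen the zone the robot must keep clear of
    """
    margin = safe_dist + 2.0 * sigma  # inflate margin for prediction noise
    cost = 0.0
    for wp in waypoints:
        d = math.dist(wp, hand_pred)  # Euclidean distance to predicted hand
        if d < margin:
            cost += (margin - d) ** 2  # quadratic penalty inside the zone
    return cost

def safer_of(candidates, hand_pred, sigma):
    """Pick the candidate trajectory with the lowest collision cost."""
    return min(candidates, key=lambda t: collision_cost(t, hand_pred, sigma))
```

In this sketch, a more uncertain prediction (larger sigma) simply enlarges the keep-out region around the predicted hand, so the planner prefers trajectories that detour further from the human.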