{"title":"人机协作中移交自然指令的人类预测","authors":"Jens Lambrecht, Sebastian Nimpsch","doi":"10.1109/RO-MAN46459.2019.8956379","DOIUrl":null,"url":null,"abstract":"Human robot collaboration is aspiring to establish hybrid work environments in accordance with specific strengths of humans and robots. We present an approach of flexibly integrating robotic handover assistance into collaborative assembly tasks through the use of natural communication. For flexibly instructed handovers, we implement recent Convolutional Neural Networks in terms of object detection and grasping of arbitrary objects based on an RGB-D camera equipped to a robot following the eye-in-hand principle. In order to increase fluency and efficiency of the overall assembly process, we investigate the human ability to instruct the robot predictively with voice commands. We conduct a user study quantitatively and qualitatively evaluating the predictive instruction in order to achieve just-in-time handovers of tools needed for following subtasks. We compare our predictive strategy with a pure manual assembly having all tools in direct reach and a stepby-step reactive handover. The results reveal that the human is able to predict the handover comparable to algorithmbased predictors. Nevertheless, human prediction does not rely on extensive prior knowledge and is thus suitable for more flexible usage. However, the cognitive workload for the worker is increased compared to manual or reactive assembly.","PeriodicalId":286478,"journal":{"name":"2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)","volume":"65 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"Human Prediction for the Natural Instruction of Handovers in Human Robot Collaboration\",\"authors\":\"Jens Lambrecht, Sebastian Nimpsch\",\"doi\":\"10.1109/RO-MAN46459.2019.8956379\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Human robot collaboration is aspiring to establish hybrid work environments in accordance with specific strengths of humans and robots. We present an approach of flexibly integrating robotic handover assistance into collaborative assembly tasks through the use of natural communication. For flexibly instructed handovers, we implement recent Convolutional Neural Networks in terms of object detection and grasping of arbitrary objects based on an RGB-D camera equipped to a robot following the eye-in-hand principle. In order to increase fluency and efficiency of the overall assembly process, we investigate the human ability to instruct the robot predictively with voice commands. We conduct a user study quantitatively and qualitatively evaluating the predictive instruction in order to achieve just-in-time handovers of tools needed for following subtasks. We compare our predictive strategy with a pure manual assembly having all tools in direct reach and a stepby-step reactive handover. The results reveal that the human is able to predict the handover comparable to algorithmbased predictors. Nevertheless, human prediction does not rely on extensive prior knowledge and is thus suitable for more flexible usage. 
However, the cognitive workload for the worker is increased compared to manual or reactive assembly.\",\"PeriodicalId\":286478,\"journal\":{\"name\":\"2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)\",\"volume\":\"65 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/RO-MAN46459.2019.8956379\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/RO-MAN46459.2019.8956379","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Human-robot collaboration aims to establish hybrid work environments that exploit the specific strengths of humans and robots. We present an approach for flexibly integrating robotic handover assistance into collaborative assembly tasks by means of natural communication. For flexibly instructed handovers, we employ recent convolutional neural networks for object detection and for grasping arbitrary objects, based on an RGB-D camera mounted on the robot following the eye-in-hand principle. To increase the fluency and efficiency of the overall assembly process, we investigate the human ability to instruct the robot predictively via voice commands. We conduct a user study that quantitatively and qualitatively evaluates predictive instruction aimed at achieving just-in-time handovers of the tools needed for subsequent subtasks. We compare our predictive strategy with purely manual assembly, in which all tools are within direct reach, and with a step-by-step reactive handover. The results reveal that humans are able to predict handovers comparably to algorithm-based predictors. Unlike algorithm-based predictors, however, human prediction does not rely on extensive prior knowledge and is thus suited to more flexible use. Nevertheless, the cognitive workload for the worker is increased compared to manual or reactive assembly.
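
The abstract does not include code; the following is a minimal, self-contained sketch of the predictive (just-in-time) instruction idea it describes: the worker voices the tool needed for the next subtask while still working on the current one, so the robot's fetch overlaps with manual work. All names here (recognize_command, fetch_and_handover, the subtask list, the timing constants) are hypothetical stand-ins and not the authors' implementation; speech recognition, object detection, and grasping are only simulated.

```python
# Minimal sketch of predictive, voice-instructed handovers.
# Assumptions: all functions below are hypothetical stubs; perception,
# grasping, and speech recognition are simulated with sleeps and strings.
import queue
import threading
import time


def recognize_command(utterance: str) -> str:
    """Hypothetical speech-to-intent stub: map a voice command to a tool label."""
    return utterance.strip().lower().replace("bring me the ", "")


def fetch_and_handover(tool: str) -> None:
    """Hypothetical robot routine: detect the tool in the RGB-D view,
    grasp it, and move to the handover pose (simulated by sleeps)."""
    time.sleep(1.0)  # stands in for object detection and grasp planning
    time.sleep(2.0)  # stands in for motion to the handover pose
    print(f"[robot] {tool} ready for handover")


def run_predictive_assembly(subtasks) -> None:
    """Worker announces the tool for the *next* subtask while still working
    on the current one, so the robot fetch overlaps with manual work."""
    requests: "queue.Queue[str | None]" = queue.Queue()

    def robot_loop() -> None:
        while True:
            tool = requests.get()
            if tool is None:
                break
            fetch_and_handover(tool)

    robot = threading.Thread(target=robot_loop)
    robot.start()

    for name, duration, next_tool in subtasks:
        if next_tool:
            # Predictive instruction: request one subtask ahead.
            requests.put(recognize_command(f"Bring me the {next_tool}"))
        print(f"[worker] working on: {name}")
        time.sleep(duration)  # manual work on the current subtask

    requests.put(None)
    robot.join()


if __name__ == "__main__":
    # (subtask name, manual duration in s, tool needed for the *following* subtask)
    run_predictive_assembly([
        ("mount base plate", 3.0, "screwdriver"),
        ("fasten screws", 3.0, "wrench"),
        ("tighten bolts", 3.0, None),
    ])
```

The reactive variant the paper compares against corresponds, in this sketch, to issuing the request at the start of the subtask that needs the tool rather than one step ahead, so the worker waits for the fetch instead of overlapping it with manual work.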