Clemens Pohlt, Sebastian Hell, T. Schlegl, S. Wachsmuth
{"title":"基于手势的交互过程中人类自发输入对现实世界制造场景的影响","authors":"Clemens Pohlt, Sebastian Hell, T. Schlegl, S. Wachsmuth","doi":"10.1145/3125739.3132590","DOIUrl":null,"url":null,"abstract":"Seamless human-robot collaboration depends on high non-verbal behaviour recognition rates. To realize that in real-world manufacturing scenarios with an ecological valid setup, a lot of effort has to be invested. In this paper, we evaluate the impact of spontaneous inputs on the robustness of human-robot collaboration during gesture-based interaction. A high share of these spontaneous inputs lead to a reduced capability to predict behaviour and subsequently to a loss of robustness. We observe body and hand behaviour during interactive manufacturing of a collaborative task within two experiments. First, we analyse the occurrence frequency, reason and manner of human inputs in specific situations during a human-human experiment. We show the high impact of spontaneous inputs, especially in situations that differ from the typical working procedure. Second, we concentrate on implicit inputs during a real-world Wizard of Oz experiment using our human-robot working cell. We show that hand positions can be used to anticipate user needs in a semi-structured environment by applying knowledge about the semi-structured human behaviour which is distributed over working space and time in a typical manner.","PeriodicalId":346669,"journal":{"name":"Proceedings of the 5th International Conference on Human Agent Interaction","volume":"41 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"Impact of Spontaneous Human Inputs during Gesture based Interaction on a Real-World Manufacturing Scenario\",\"authors\":\"Clemens Pohlt, Sebastian Hell, T. Schlegl, S. 
Wachsmuth\",\"doi\":\"10.1145/3125739.3132590\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Seamless human-robot collaboration depends on high non-verbal behaviour recognition rates. To realize that in real-world manufacturing scenarios with an ecological valid setup, a lot of effort has to be invested. In this paper, we evaluate the impact of spontaneous inputs on the robustness of human-robot collaboration during gesture-based interaction. A high share of these spontaneous inputs lead to a reduced capability to predict behaviour and subsequently to a loss of robustness. We observe body and hand behaviour during interactive manufacturing of a collaborative task within two experiments. First, we analyse the occurrence frequency, reason and manner of human inputs in specific situations during a human-human experiment. We show the high impact of spontaneous inputs, especially in situations that differ from the typical working procedure. Second, we concentrate on implicit inputs during a real-world Wizard of Oz experiment using our human-robot working cell. 
We show that hand positions can be used to anticipate user needs in a semi-structured environment by applying knowledge about the semi-structured human behaviour which is distributed over working space and time in a typical manner.\",\"PeriodicalId\":346669,\"journal\":{\"name\":\"Proceedings of the 5th International Conference on Human Agent Interaction\",\"volume\":\"41 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-10-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 5th International Conference on Human Agent Interaction\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3125739.3132590\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 5th International Conference on Human Agent Interaction","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3125739.3132590","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Impact of Spontaneous Human Inputs during Gesture based Interaction on a Real-World Manufacturing Scenario
Seamless human-robot collaboration depends on high non-verbal behaviour recognition rates. Achieving this in real-world manufacturing scenarios with an ecologically valid setup requires considerable effort. In this paper, we evaluate the impact of spontaneous inputs on the robustness of human-robot collaboration during gesture-based interaction. A high share of these spontaneous inputs leads to a reduced capability to predict behaviour and, subsequently, to a loss of robustness. We observe body and hand behaviour during the interactive manufacturing of a collaborative task in two experiments. First, we analyse the occurrence frequency, reasons, and manner of human inputs in specific situations during a human-human experiment. We show the high impact of spontaneous inputs, especially in situations that differ from the typical working procedure. Second, we concentrate on implicit inputs during a real-world Wizard of Oz experiment using our human-robot working cell. We show that hand positions can be used to anticipate user needs in a semi-structured environment by applying knowledge about semi-structured human behaviour, which is distributed over working space and time in a typical manner.
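The paper does not publish its anticipation algorithm, but the idea of using hand positions together with knowledge about behaviour distributed over working space and time can be illustrated with a toy sketch. Here, the workspace is divided into hypothetical named zones (the zone names and coordinates are invented for illustration, not taken from the paper), and observed hand trajectories are reduced to zone-to-zone transition frequencies that let the system guess the worker's most likely next zone — a minimal stand-in for anticipating user needs from spatially distributed behaviour:

```python
import math
from collections import Counter, defaultdict

# Hypothetical workspace zones (names and 2D coordinates are assumptions
# for this sketch, not values from the paper's working cell).
ZONES = {
    "parts_bin": (0.0, 0.0),
    "assembly_area": (0.5, 0.3),
    "tool_rack": (1.0, 0.0),
}

def nearest_zone(hand_xy):
    """Classify a 2D hand position into the closest workspace zone."""
    return min(ZONES, key=lambda z: math.dist(hand_xy, ZONES[z]))

class NeedAnticipator:
    """Learns zone-to-zone transition frequencies from observed hand
    trajectories and predicts the most likely next zone."""

    def __init__(self):
        # transitions[current_zone] counts which zone tends to follow it.
        self.transitions = defaultdict(Counter)

    def observe(self, trajectory):
        """Record one hand trajectory (a list of (x, y) positions)."""
        zones = [nearest_zone(p) for p in trajectory]
        for cur, nxt in zip(zones, zones[1:]):
            if cur != nxt:  # only count actual zone changes
                self.transitions[cur][nxt] += 1

    def predict_next(self, hand_xy):
        """Anticipate the next zone from the current hand position."""
        counts = self.transitions[nearest_zone(hand_xy)]
        return counts.most_common(1)[0][0] if counts else None

# Usage: train on one recorded trajectory, then anticipate.
model = NeedAnticipator()
model.observe([(0.0, 0.1), (0.4, 0.3), (0.9, 0.1), (0.5, 0.3)])
print(model.predict_next((0.05, 0.0)))  # hand near parts_bin -> assembly_area
```

A real system would of course work with richer features (hand pose over time, task step, body posture) and a probabilistic model, but the same principle applies: regularities in where hands go, and in what order, carry predictive information about upcoming needs.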