{"title":"二元协同操作的好坏:识别人类合作中冲突驱动的互动行为","authors":"Illimar Issak, Ayse Kucukyilmaz","doi":"10.31256/fv3gn1l","DOIUrl":null,"url":null,"abstract":"—One of the challenges in collaborative human-robot object transfer is the robot’s ability to infer about the interaction state and adapt to it in real time. During joint object transfer humans communicate about the interaction states through mul-tiple modalities and adapt to one another’s actions such that the interaction is successful. Knowledge of the current interaction state (i.e. harmonious, conflicting or passive interaction) can help us adjust our behaviour to carry out the task successfully. This study investigates the effectiveness of physical Human- Human Interaction (pHHI) forces for predicting interaction states during ongoing object co-manipulation. We use a sliding-window method for extracting features and perform online classification to infer the interaction states. Our dataset consists of haptic data from 40 subjects who are partnered to form 20 dyads. The dyads performed collaborative object transfer tasks in a haptics- enabled virtual environment to move an object to predefined goal configurations in different harmonious and conflicting scenarios. 
We evaluate our approach using multi-class Support Vector Machine classifier (SVMc) and Gaussian Process classifier (GPc) and achieve 80% accuracy for classifying general interaction types.","PeriodicalId":393014,"journal":{"name":"UKRAS20 Conference: \"Robots into the real world\" Proceedings","volume":"32 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"The Goods and Bads in Dyadic Co-Manipulation: Identifying Conflict-Driven Interaction Behaviours in Human-Human Collaboration\",\"authors\":\"Illimar Issak, Ayse Kucukyilmaz\",\"doi\":\"10.31256/fv3gn1l\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"—One of the challenges in collaborative human-robot object transfer is the robot’s ability to infer about the interaction state and adapt to it in real time. During joint object transfer humans communicate about the interaction states through mul-tiple modalities and adapt to one another’s actions such that the interaction is successful. Knowledge of the current interaction state (i.e. harmonious, conflicting or passive interaction) can help us adjust our behaviour to carry out the task successfully. This study investigates the effectiveness of physical Human- Human Interaction (pHHI) forces for predicting interaction states during ongoing object co-manipulation. We use a sliding-window method for extracting features and perform online classification to infer the interaction states. Our dataset consists of haptic data from 40 subjects who are partnered to form 20 dyads. The dyads performed collaborative object transfer tasks in a haptics- enabled virtual environment to move an object to predefined goal configurations in different harmonious and conflicting scenarios. 
We evaluate our approach using multi-class Support Vector Machine classifier (SVMc) and Gaussian Process classifier (GPc) and achieve 80% accuracy for classifying general interaction types.\",\"PeriodicalId\":393014,\"journal\":{\"name\":\"UKRAS20 Conference: \\\"Robots into the real world\\\" Proceedings\",\"volume\":\"32 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-05-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"UKRAS20 Conference: \\\"Robots into the real world\\\" Proceedings\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.31256/fv3gn1l\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"UKRAS20 Conference: \"Robots into the real world\" Proceedings","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.31256/fv3gn1l","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
The Goods and Bads in Dyadic Co-Manipulation: Identifying Conflict-Driven Interaction Behaviours in Human-Human Collaboration
One of the challenges in collaborative human-robot object transfer is the robot's ability to infer the interaction state and adapt to it in real time. During joint object transfer, humans communicate about the interaction state through multiple modalities and adapt to one another's actions so that the interaction succeeds. Knowledge of the current interaction state (i.e., harmonious, conflicting, or passive interaction) can help us adjust our behaviour to carry out the task successfully. This study investigates the effectiveness of physical Human-Human Interaction (pHHI) forces for predicting interaction states during ongoing object co-manipulation. We use a sliding-window method to extract features and perform online classification to infer the interaction states. Our dataset consists of haptic data from 40 subjects partnered to form 20 dyads. The dyads performed collaborative object transfer tasks in a haptics-enabled virtual environment, moving an object to predefined goal configurations in different harmonious and conflicting scenarios. We evaluate our approach using a multi-class Support Vector Machine classifier (SVMc) and a Gaussian Process classifier (GPc), achieving 80% accuracy in classifying general interaction types.
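The pipeline the abstract describes — sliding-window feature extraction over force signals followed by multi-class classification — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the window sizes, the per-window statistics, the synthetic force traces, and the three-state labelling are all assumptions made for the example.

```python
import numpy as np
from sklearn.svm import SVC

def window_features(signal, win=50, step=25):
    """Slide a fixed-size window over a 1-D force signal and
    extract simple per-window statistics (mean, std, min, max).
    Window length and step are illustrative choices."""
    feats = []
    for start in range(0, len(signal) - win + 1, step):
        w = signal[start:start + win]
        feats.append([w.mean(), w.std(), w.min(), w.max()])
    return np.array(feats)

rng = np.random.default_rng(0)
# Hypothetical synthetic force traces standing in for the three
# interaction states; real pHHI data would replace these.
harmonious  = rng.normal(0.0, 0.2,  1000)  # moderate, steady forces
conflicting = rng.normal(0.0, 1.5,  1000)  # large opposing forces
passive     = rng.normal(0.0, 0.05, 1000)  # near-zero forces

X = np.vstack([window_features(s) for s in (harmonious, conflicting, passive)])
y = np.repeat([0, 1, 2], len(X) // 3)  # one label per window

# scikit-learn's SVC handles the multi-class case (one-vs-one) by default.
clf = SVC(kernel="rbf")
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```

In an online setting, each newly completed window would be passed through `window_features` and `clf.predict` as the data streams in, yielding a running estimate of the current interaction state.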