E. Delaherche, M. Chetouani
J-HGBU '11, December 2011
DOI: 10.1145/2072572.2072582
Automatic recognition of coordination level in an imitation task
Automatic analysis of the degree of coordination between human partners raises challenging questions. In this paper, we propose to automatically predict the degree of coordination between dyadic partners performing an imitation task. A subjective evaluation of their coordination was obtained via a questionnaire addressed to human judges. We extracted features from speech, gesture segmentation, and synchronized movements to predict the coordination status of each dyad. Several features perfectly discriminated the examples from the low- and high-coordination classes.
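The abstract's final claim, that some individual features perfectly separate the low- and high-coordination classes, amounts to saying a single threshold on such a feature splits the two groups. The paper does not give its features or classifier here, so the following is only a hypothetical sketch with invented synchrony scores, illustrating what a "perfectly discriminating" feature would look like:

```python
# Hypothetical sketch: test whether one extracted feature (e.g., a
# movement-synchrony score per dyad) perfectly separates dyads labeled
# "low" vs. "high" coordination. All values below are invented.

def separating_threshold(values_low, values_high):
    """Return a threshold splitting the two classes, or None if the
    feature's ranges overlap (i.e., no perfect separation)."""
    if max(values_low) < min(values_high):
        return (max(values_low) + min(values_high)) / 2
    if max(values_high) < min(values_low):
        return (max(values_high) + min(values_low)) / 2
    return None

# Invented per-dyad scores for illustration only.
low_coord = [0.12, 0.18, 0.25]
high_coord = [0.61, 0.70, 0.84]
print(separating_threshold(low_coord, high_coord))  # midpoint of the gap
```

A feature for which `separating_threshold` returns a value (rather than `None`) on the annotated dyads is one that, in the paper's terms, discriminates the two coordination classes perfectly.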