Speaker Motion Patterns during Self-repairs in Natural Dialogue
Elif Ecem Ozkan, Tom Gurion, J. Hough, P. Healey, L. Jamone
Companion Publication of the 2022 International Conference on Multimodal Interaction, 7 November 2022. DOI: 10.1145/3536220.3563684
Citations: 2
Abstract
An important milestone for any agent that interacts with humans on a regular basis is achieving natural and efficient communication. Such strategies should be derived from the hallmarks of human-human interaction. So far, work on embodied conversational agents (ECAs) implementing such signals has predominantly imitated human-like positive back-channels, such as nodding, rather than engaging in active interaction. The field of Conversation Analysis (CA), which focuses on natural human dialogue, suggests that people continuously collaborate on achieving mutual understanding by frequently repairing misunderstandings as they happen. Detecting repairs from speech in real time is challenging, even with state-of-the-art Natural Language Processing (NLP) models. We present specific human motion patterns during key moments of interaction, namely self-initiated self-repairs, which could help agents recognise and collaboratively resolve speaker trouble. The features we present in this paper are the pairwise joint distances of the head and hands, which are more discriminative than the joint positions themselves.
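To make the feature concrete, the sketch below shows one way to compute pairwise joint distances from per-frame 3D head and hand positions. It is a minimal illustration, not the authors' pipeline: the joint names, array layout, and use of NumPy are assumptions introduced here for clarity.

```python
# Minimal sketch (not the authors' code): pairwise joint-distance features
# from per-frame 3D positions of the head and both hands.
import numpy as np
from itertools import combinations

# Hypothetical joint order; real motion-capture data may use different names/axes.
JOINTS = ["head", "left_hand", "right_hand"]

def pairwise_joint_distances(positions: np.ndarray) -> np.ndarray:
    """positions: (n_frames, n_joints, 3) array of 3D coordinates.

    Returns an (n_frames, n_pairs) array of Euclidean distances between
    joint pairs, e.g. head-left_hand, head-right_hand, left_hand-right_hand.
    """
    pairs = list(combinations(range(positions.shape[1]), 2))
    return np.stack(
        [np.linalg.norm(positions[:, i] - positions[:, j], axis=-1) for i, j in pairs],
        axis=-1,
    )

if __name__ == "__main__":
    # Toy example: 100 frames of random head/hand positions.
    rng = np.random.default_rng(0)
    pos = rng.normal(size=(100, len(JOINTS), 3))
    feats = pairwise_joint_distances(pos)
    print(feats.shape)  # (100, 3): one distance per joint pair per frame
```

One plausible reason such relative distances can be more discriminative than raw positions is that they are invariant to where the speaker sits or faces in the room, so they capture the body configuration itself rather than its location in the capture space.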