Tracking and fusion for multiparty interaction with a virtual character and a social robot

Zerrin Yumak, Jianfeng Ren, N. Magnenat-Thalmann, Junsong Yuan
{"title":"跟踪和融合与虚拟角色和社交机器人的多方互动","authors":"Zerrin Yumak, Jianfeng Ren, N. Magnenat-Thalmann, Junsong Yuan","doi":"10.1145/2668956.2668958","DOIUrl":null,"url":null,"abstract":"To give human-like capabilities to artificial characters, we should equip them with the ability of inferring user states. These artificial characters should understand the users' behaviors through various sensors and respond back using multimodal output. Besides natural multimodal interaction, they should also be able to communicate with multiple users and among each other in multiparty interactions. Previous work on interactive virtual humans and social robots mainly focuses on one-to-one interactions. In this paper, we study tracking and fusion aspects of multiparty interactions. We first give a general overview of our proposed multiparty interaction system and mention how it is different from previous work. Then, we provide the details of the tracking and fusion component including speaker identification, addressee detection and a dynamic user entrance/leave mechanism based on user re-identification using a Kinect sensor. Finally, we present a case study with the system and provide a discussion on the current capabilities, limitations and future work.","PeriodicalId":220010,"journal":{"name":"SIGGRAPH Asia 2014 Autonomous Virtual Humans and Social Robot for Telepresence","volume":"28 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-11-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"13","resultStr":"{\"title\":\"Tracking and fusion for multiparty interaction with a virtual character and a social robot\",\"authors\":\"Zerrin Yumak, Jianfeng Ren, N. Magnenat-Thalmann, Junsong Yuan\",\"doi\":\"10.1145/2668956.2668958\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"To give human-like capabilities to artificial characters, we should equip them with the ability of inferring user states. These artificial characters should understand the users' behaviors through various sensors and respond back using multimodal output. Besides natural multimodal interaction, they should also be able to communicate with multiple users and among each other in multiparty interactions. Previous work on interactive virtual humans and social robots mainly focuses on one-to-one interactions. In this paper, we study tracking and fusion aspects of multiparty interactions. We first give a general overview of our proposed multiparty interaction system and mention how it is different from previous work. Then, we provide the details of the tracking and fusion component including speaker identification, addressee detection and a dynamic user entrance/leave mechanism based on user re-identification using a Kinect sensor. 
Finally, we present a case study with the system and provide a discussion on the current capabilities, limitations and future work.\",\"PeriodicalId\":220010,\"journal\":{\"name\":\"SIGGRAPH Asia 2014 Autonomous Virtual Humans and Social Robot for Telepresence\",\"volume\":\"28 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2014-11-24\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"13\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"SIGGRAPH Asia 2014 Autonomous Virtual Humans and Social Robot for Telepresence\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/2668956.2668958\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"SIGGRAPH Asia 2014 Autonomous Virtual Humans and Social Robot for Telepresence","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2668956.2668958","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 13

Abstract

To give human-like capabilities to artificial characters, we should equip them with the ability to infer user states. These artificial characters should understand the users' behaviors through various sensors and respond using multimodal output. Besides natural multimodal interaction, they should also be able to communicate with multiple users and with each other in multiparty interactions. Previous work on interactive virtual humans and social robots mainly focuses on one-to-one interactions. In this paper, we study the tracking and fusion aspects of multiparty interactions. We first give a general overview of our proposed multiparty interaction system and explain how it differs from previous work. Then, we provide the details of the tracking and fusion component, including speaker identification, addressee detection, and a dynamic user entrance/leave mechanism based on user re-identification using a Kinect sensor. Finally, we present a case study with the system and discuss its current capabilities, limitations, and future work.
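The abstract only names the tracking and fusion components (speaker identification, addressee detection, and entrance/leave handling via re-identification). As a rough illustration of how such a fusion step could be organized, the minimal Python sketch below keeps a registry of tracked users, re-identifies returning users by an appearance feature, and fuses body positions with the microphone-array beam angle to pick the likely speaker. All class and method names, thresholds, and the appearance-matching step are assumptions made for illustration, not the authors' implementation; the Kinect inputs (positions, beam angle, features) are stubbed rather than read from the sensor API.

```python
import math
from dataclasses import dataclass

@dataclass
class TrackedUser:
    user_id: int        # persistent ID kept across re-entries
    position: tuple     # (x, z) location in sensor space, metres; None when absent
    appearance: list    # hypothetical appearance feature used for re-identification

class MultipartyTracker:
    """Hypothetical fusion of Kinect-style body tracking and audio beam angle."""

    def __init__(self, reid_threshold=0.6, beam_tolerance_deg=15.0):
        self.users = {}                        # user_id -> TrackedUser
        self.reid_threshold = reid_threshold
        self.beam_tolerance_deg = beam_tolerance_deg
        self._next_id = 0

    def _similarity(self, feat_a, feat_b):
        # Cosine similarity between appearance features (stand-in for a real re-id model).
        dot = sum(a * b for a, b in zip(feat_a, feat_b))
        norm = math.sqrt(sum(a * a for a in feat_a)) * math.sqrt(sum(b * b for b in feat_b))
        return dot / norm if norm else 0.0

    def on_user_entered(self, position, appearance):
        # Re-identification: try to match the new body against users seen before.
        for user in self.users.values():
            if self._similarity(appearance, user.appearance) > self.reid_threshold:
                user.position = position
                return user.user_id            # returning user keeps the old ID
        user = TrackedUser(self._next_id, position, appearance)
        self.users[user.user_id] = user
        self._next_id += 1
        return user.user_id

    def on_user_left(self, user_id):
        # Keep the appearance feature so the user can be re-identified later.
        self.users[user_id].position = None

    def identify_speaker(self, audio_beam_angle_deg):
        # Fuse the audio beam angle with tracked body positions:
        # the speaker is the present user whose direction best matches the beam.
        best_id, best_diff = None, self.beam_tolerance_deg
        for user in self.users.values():
            if user.position is None:
                continue
            x, z = user.position
            user_angle = math.degrees(math.atan2(x, z))
            diff = abs(user_angle - audio_beam_angle_deg)
            if diff < best_diff:
                best_id, best_diff = user.user_id, diff
        return best_id
```

In this sketch, `identify_speaker(beam_angle)` returns the ID of the tracked user whose direction best matches the reported beam angle, or None if no present user is within the tolerance; entrance and leave events only update the registry, so identities persist across re-entries.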