Analyzing Eye Movements in Interview Communication with Virtual Reality Agents

Fuhui Tian, S. Okada, K. Nitta
{"title":"Analyzing Eye Movements in Interview Communication with Virtual Reality Agents","authors":"Fuhui Tian, S. Okada, K. Nitta","doi":"10.1145/3349537.3351889","DOIUrl":null,"url":null,"abstract":"In human-agent interactions, human emotions and gestures expressed when interacting with agents is a high-level personally trait that quantifies human attitudes, intentions, motivations, and behaviors. The virtual reality space provides a chance to interact with virtual agents in a more immersive way. In this paper, we present a computational framework to analyze human eye movements by using a virtual reality system in a job interview scene. First, we developed a remote interview system using virtual agents and implemented the system into a virtual reality headset. Second, by tracking eye movements and collecting other multimodal data, the system could better analyze human personality traits in interview communication with virtual agents, and it could better support training in people's communication skills. In experiments, we analyzed the relationship between eye gaze feature and interview performance annotated by human experts. Experimental results showed acceptable accuracy value for the single modality of eye movement in the prediction of eye contact and total performance in job interviews.","PeriodicalId":188834,"journal":{"name":"Proceedings of the 7th International Conference on Human-Agent Interaction","volume":"67 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-09-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 7th International Conference on Human-Agent Interaction","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3349537.3351889","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 5

Abstract

In human-agent interaction, the emotions and gestures people express when interacting with agents reflect high-level personality traits that quantify human attitudes, intentions, motivations, and behaviors. Virtual reality offers an opportunity to interact with virtual agents in a more immersive way. In this paper, we present a computational framework for analyzing human eye movements using a virtual reality system in a job interview scenario. First, we developed a remote interview system with virtual agents and deployed it on a virtual reality headset. Second, by tracking eye movements and collecting other multimodal data, the system can better analyze human personality traits in interview communication with virtual agents and better support training of people's communication skills. In experiments, we analyzed the relationship between eye gaze features and interview performance annotated by human experts. The results showed acceptable accuracy for the single eye-movement modality in predicting eye contact and overall performance in job interviews.
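The abstract describes a pipeline of per-session gaze features fed into a predictor of expert-annotated interview scores, but no implementation details are given here. The sketch below is a minimal, hypothetical illustration of that kind of pipeline, assuming gaze logs of the form (timestamp, yaw, pitch, on_agent) exported from the headset's eye tracker and a scikit-learn regressor; the feature definitions, thresholds, and function names are illustrative assumptions, not the authors' method.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

def gaze_features(samples):
    """Simple per-interview gaze statistics.

    `samples` is an (N, 4) array of [t, yaw, pitch, on_agent], where
    on_agent is 1 when the gaze ray hits the virtual interviewer
    (a hypothetical logging format, not the paper's).
    """
    t, yaw, pitch, on_agent = samples.T
    dt = np.diff(t, prepend=t[0])
    # Share of interview time spent looking at the agent (eye-contact proxy).
    contact_ratio = np.average(on_agent, weights=dt)
    # Angular gaze speed between consecutive samples, in deg/s.
    speed = np.hypot(np.diff(yaw), np.diff(pitch)) / (np.diff(t) + 1e-9)
    # Fraction of samples above a velocity threshold: crude saccade proxy.
    saccade_frac = np.mean(speed > 30.0)
    # Overall spread of gaze directions.
    dispersion = np.std(yaw) + np.std(pitch)
    return np.array([contact_ratio, saccade_frac, dispersion])

def evaluate(sessions, scores):
    """Cross-validated R^2 of gaze features vs. expert-annotated scores."""
    X = np.vstack([gaze_features(s) for s in sessions])
    y = np.asarray(scores)
    model = SVR(kernel="rbf", C=1.0)
    return cross_val_score(model, X, y, cv=5, scoring="r2")
```

In this sketch, `sessions` would be a list of per-interview gaze sample arrays and `scores` the corresponding expert ratings (e.g., eye-contact or total-performance scores); any real reproduction would need the paper's actual feature set and evaluation protocol.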