A Social Interaction Interface Supporting Affective Augmentation Based on Neuronal Data

D. Roth, Larissa Brübach, Franziska Westermeier, Christian Schell, Tobias Feigl, Marc Erich Latoschik
{"title":"A Social Interaction Interface Supporting Affective Augmentation Based on Neuronal Data","authors":"D. Roth, Larrissa Brübach, Franziska Westermeier, Christian Schell, Tobias Feigl, Marc Erich Latoschik","doi":"10.1145/3357251.3360018","DOIUrl":null,"url":null,"abstract":"In this demonstration we present a prototype for an avatar-mediated social interaction interface that supports the replication of head- and eye movement in distributed virtual environments. In addition to the retargeting of these natural behaviors, the system is capable of augmenting the interaction based on the visual presentation of affective states. We derive those states using neuronal data captured by electroencephalographic (EEG) sensing in combination with a machine learning driven classification of emotional states.","PeriodicalId":370782,"journal":{"name":"Symposium on Spatial User Interaction","volume":"18 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-10-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Symposium on Spatial User Interaction","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3357251.3360018","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

In this demonstration we present a prototype for an avatar-mediated social interaction interface that supports the replication of head and eye movements in distributed virtual environments. In addition to retargeting these natural behaviors, the system can augment the interaction through the visual presentation of affective states. We derive those states from neuronal data captured by electroencephalographic (EEG) sensing, combined with a machine-learning-driven classification of emotional states.
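To illustrate the kind of pipeline the abstract describes (EEG sensing followed by machine-learning classification of emotional states), the following is a minimal sketch, not the authors' implementation. The sampling rate, frequency bands, label set, and use of band-power features with an SVM are illustrative assumptions.

```python
# Hedged sketch: classify affective states from EEG band-power features.
# Not the authors' system; channel layout, bands, and labels are assumptions.
import numpy as np
from scipy.signal import welch
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 256  # assumed EEG sampling rate in Hz
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # illustrative bands


def band_power_features(window: np.ndarray) -> np.ndarray:
    """Log band power per channel for one EEG window (shape: channels x samples)."""
    freqs, psd = welch(window, fs=FS, nperseg=FS)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(np.log(psd[:, mask].mean(axis=1) + 1e-12))
    return np.concatenate(feats)


def train_classifier(windows, labels):
    """Fit a classifier on labeled EEG windows (labels e.g. 'neutral', 'positive', 'negative')."""
    X = np.stack([band_power_features(w) for w in windows])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
    clf.fit(X, labels)
    return clf


def predict_state(clf, window: np.ndarray) -> str:
    """Predict an affective state for a new window; the result could drive the
    avatar's visual affect augmentation (e.g., an expression or color cue)."""
    return clf.predict(band_power_features(window).reshape(1, -1))[0]
```

In such a setup the predicted label would be streamed alongside the retargeted head and eye movements to the remote client, which maps it to a visual cue on the avatar; how the original prototype encodes and transmits this state is not specified in the abstract.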