Speech, Head, and Eye-based Cues for Continuous Affect Prediction

J. O'Dwyer
{"title":"Speech, Head, and Eye-based Cues for Continuous Affect Prediction","authors":"J. O'Dwyer","doi":"10.1109/ACIIW.2019.8925042","DOIUrl":null,"url":null,"abstract":"Continuous affect prediction involves the discrete time-continuous regression of affect dimensions. Researchers in this domain are currently embracing multimodal model input. This provides motivation for researchers to investigate previously unexplored affective cues. Speech-based cues have traditionally received the most attention for affect prediction, however, nonverbal inputs have significant potential to increase the performance of affective computing systems and enable affect modelling in the absence of speech. Non-verbal inputs that have received little attention for continuous affect prediction include head and eye-based cues. Both head and eye-based cues are involved in emotion displays and perception. Additionally, these cues can be estimated non-intrusively from video, using computer vision tools. This work exploits this gap by comprehensively investigating head and eye-based features and their combination with speech for continuous affect prediction. Hand-crafted, automatically generated and convolutional neural network (CNN) learned features from these modalities will be investigated for continuous affect prediction. The highest performing feature set combinations will answer how effective these features are for the prediction of an individual's affective state.","PeriodicalId":193568,"journal":{"name":"2019 8th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW)","volume":"12 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-07-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 8th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ACIIW.2019.8925042","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 4

Abstract

Continuous affect prediction involves the regression of time-continuous affect dimensions at discrete time steps. Researchers in this domain are currently embracing multimodal model input, which motivates the investigation of previously unexplored affective cues. Speech-based cues have traditionally received the most attention for affect prediction; however, non-verbal inputs have significant potential to increase the performance of affective computing systems and to enable affect modelling in the absence of speech. Non-verbal inputs that have received little attention for continuous affect prediction include head and eye-based cues. Both head and eye-based cues are involved in emotion display and perception, and both can be estimated non-intrusively from video using computer vision tools. This work addresses this gap by comprehensively investigating head and eye-based features, and their combination with speech, for continuous affect prediction. Hand-crafted, automatically generated, and convolutional neural network (CNN)-learned features from these modalities will be investigated for continuous affect prediction. The highest-performing feature-set combinations will indicate how effective these features are for predicting an individual's affective state.
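
Since the abstract frames the task as regression of time-continuous affect dimensions from combined speech, head, and eye features, the sketch below illustrates one simple combination strategy, feature-level (early) fusion. The feature dimensions, the synthetic data, and the ridge-regression model are illustrative assumptions, not the feature sets or models used in the paper; the concordance correlation coefficient (CCC) is a metric commonly used for continuous affect prediction, although the abstract does not name an evaluation metric.

```python
# Illustrative sketch only: early (feature-level) fusion of per-frame speech,
# head, and eye features for continuous regression of one affect dimension.
# Feature sizes, the synthetic data, and ridge regression are assumptions for
# demonstration, not the paper's actual features or models.
import numpy as np
from sklearn.linear_model import Ridge

def ccc(x, y):
    """Concordance correlation coefficient, commonly used for continuous affect prediction."""
    x_mean, y_mean = x.mean(), y.mean()
    cov_xy = np.mean((x - x_mean) * (y - y_mean))
    return 2.0 * cov_xy / (x.var() + y.var() + (x_mean - y_mean) ** 2)

rng = np.random.default_rng(0)
n_frames = 1000

# Hypothetical per-frame feature matrices (one row per synchronised frame).
speech_feats = rng.normal(size=(n_frames, 40))  # e.g. acoustic low-level descriptors
head_feats = rng.normal(size=(n_frames, 6))     # e.g. head pose angles and their deltas
eye_feats = rng.normal(size=(n_frames, 8))      # e.g. gaze direction, eye-closure cues

# Gold-standard time-continuous annotation for one affect dimension (e.g. arousal).
arousal = rng.uniform(-1.0, 1.0, size=n_frames)

# Early fusion: concatenate the modality features frame by frame.
fused = np.concatenate([speech_feats, head_feats, eye_feats], axis=1)

# Chronological split, since the frames form a time series.
split = int(0.8 * n_frames)
model = Ridge(alpha=1.0).fit(fused[:split], arousal[:split])
pred = model.predict(fused[split:])
print(f"Test CCC: {ccc(pred, arousal[split:]):.3f}")
```

Late fusion, in which a separate model is trained per modality and their predictions are combined, is the usual alternative to the early fusion shown here; the abstract leaves open which combination strategies are evaluated.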