Emotion and attention: predicting electrodermal activity through video visual descriptors

Alex Hernández-García, F. Martínez, F. Díaz-de-María
DOI: 10.1145/3106426.3109418
Published in: Proceedings of the 7th International Conference on Web Intelligence, Mining and Semantics
Publication date: 2017-08-23
Citations: 8

Abstract

This paper contributes to the field of affective video content analysis through the novel use of electrodermal activity (EDA) measurements as ground truth for machine learning algorithms. The variation of the electrical properties of the skin, known as EDA, is a psychophysiological indicator widely used in medicine, psychology and neuroscience, which can be considered a somatic marker of subjects' emotional and attentional reactions to stimuli. One of its main advantages is that the recorded information is not biased by the cognitive process of giving an opinion or a score to characterize the subjective perception. In this work, we predict the levels of emotion and attention, derived from EDA records, by means of a small set of low-level visual descriptors computed from the video stimuli. Linear regression experiments show that our descriptors predict the sum of emotion and attention levels significantly well, reaching a coefficient of determination R² = 0.25. This result sets a promising path for further research on the prediction of emotion and attention from videos using EDA.
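The modeling setup described above (ordinary least squares from a few low-level visual descriptors to an EDA-derived target, evaluated with the coefficient of determination R²) can be sketched as follows. This is a minimal illustration with synthetic data; the descriptor names and the data-generating process are assumptions for the example, not the authors' actual features or dataset.

```python
import numpy as np

# Synthetic stand-ins for per-clip low-level visual descriptors,
# e.g. luminance, motion activity, cut rate (hypothetical names).
rng = np.random.default_rng(0)
n_clips = 200
X = rng.normal(size=(n_clips, 3))

# Synthetic target: a combined emotion + attention level, modeled here
# as a linear function of the descriptors plus noise.
true_w = np.array([0.4, -0.2, 0.3])
y = X @ true_w + rng.normal(scale=1.0, size=n_clips)

# Fit linear regression with an intercept term via least squares.
X1 = np.column_stack([np.ones(n_clips), X])
w, *_ = np.linalg.lstsq(X1, y, rcond=None)

# Coefficient of determination: R² = 1 - SS_res / SS_tot.
y_hat = X1 @ w
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"R² = {r2:.3f}")
```

With three weak predictors and substantial noise, the sketch lands in the same regime as the paper's reported R² = 0.25: the descriptors explain a modest but significant fraction of the variance in the EDA-derived signal.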