Categorizing Autonomic Nervous System (ANS) Emotional Signals using Bio-Sensors for HRI within the MAUI Paradigm

Christine L. Lisetti, Fatma Nasoz
DOI: 10.1109/ROMAN.2006.314430
Published in: ROMAN 2006 - The 15th IEEE International Symposium on Robot and Human Interactive Communication
Publication date: 2006-09-01
Citations: 5

Abstract

In this article, we discuss the strong relationship between affect and cognition and the importance of emotions in multimodal human-computer interaction (HCI) and user modeling. We introduce the overall paradigm for our multimodal system, which aims to recognize its users' emotions and to respond to them accordingly, depending on the current context or application. We then describe the design of the emotion elicitation experiment we conducted by collecting, via wearable computers, physiological signals from the autonomic nervous system (galvanic skin response, heart rate, temperature) and mapping them to certain emotions (sadness, anger, fear, surprise, frustration, and amusement). We show the results of three different supervised learning algorithms that categorize these collected signals in terms of emotions, and generalize their learning to recognize emotions from new collections of signals. We finally discuss the possible broader impact and possible applications of emotion recognition for multimodal intelligent systems.
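The supervised-learning step described in the abstract — mapping a vector of ANS measurements to a discrete emotion label — can be illustrated with a minimal sketch. The code below is a hypothetical example, not the authors' implementation: it uses a simple k-nearest-neighbour classifier (one common supervised approach; the paper evaluates three algorithms not reproduced here), and the feature values are synthetic placeholders, not data from the study.

```python
import math

# Hypothetical illustration only: k-NN classification of ANS feature vectors
# (GSR, heart rate, skin temperature) into emotion labels. All numbers are
# made-up placeholders, not measurements from the Lisetti & Nasoz experiment.

def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train, query, k=3):
    """Majority vote over the k training examples nearest to `query`.

    train: list of ((gsr, heart_rate, temp), emotion_label) pairs.
    """
    neighbours = sorted(train, key=lambda item: euclidean(item[0], query))[:k]
    votes = {}
    for _, label in neighbours:
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)

# Synthetic training set: (GSR in microsiemens, heart rate in bpm, skin temp in °C)
train = [
    ((2.1, 95, 33.0), "fear"),
    ((2.3, 98, 32.8), "fear"),
    ((1.2, 72, 34.5), "sadness"),
    ((1.1, 70, 34.6), "sadness"),
    ((1.8, 88, 33.9), "amusement"),
    ((1.7, 85, 34.0), "amusement"),
]

print(knn_predict(train, (2.2, 96, 32.9)))  # two of the three nearest are "fear"
```

In practice the features would need normalization before computing distances (here heart rate, with the largest numeric range, dominates), and per-subject baseline correction is typical for physiological signals.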