2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops: Latest Publications

Emotional brain-computer interfaces
G. G. Molina, T. Tsoneva, A. Nijholt
DOI: 10.1504/IJAACS.2013.050687
Abstract: Research in Brain-Computer Interfaces (BCI) has increased significantly during the last few years. Beyond their initial role as assistive devices for the physically challenged, BCIs are now proposed for a wider range of applications. As in any HCI application, BCIs can benefit from adapting their operation to the emotional state of the user. BCIs have the advantage of access to brain activity, which can provide significant insight into the user's emotional state. This information can be utilized in two ways. 1) Knowledge of how the emotional state influences brain activity patterns allows the BCI to adapt its recognition algorithms, so that the user's intention is correctly interpreted in spite of signal deviations induced by the subject's emotional state. 2) The ability to recognize emotions can be used to provide the user with more natural ways of controlling the BCI through affective modulation; controlling a BCI by recollecting a pleasant memory thus becomes possible and can potentially lead to higher information transfer rates. These two approaches to emotion utilization in BCI are elaborated in detail in this paper in the framework of non-invasive EEG-based BCIs.
Citations: 146
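The abstract's claim about "higher information transfer rates" can be made concrete with the Wolpaw ITR formula, the standard throughput metric in the BCI literature (the formula is a well-known benchmark, not taken from this paper; the function name and parameters below are illustrative):

```python
import math

def wolpaw_itr(n_targets: int, accuracy: float, selections_per_min: float) -> float:
    """Information transfer rate in bits per minute (Wolpaw formula).

    n_targets: number of selectable targets N
    accuracy: probability P of a correct selection
    selections_per_min: selection rate of the interface
    """
    n, p = n_targets, accuracy
    if p >= 1.0:
        bits = math.log2(n)                      # perfect accuracy: full log2(N) bits
    elif p <= 0.0:
        bits = 0.0                               # degenerate case; no information
    else:
        bits = (math.log2(n)
                + p * math.log2(p)
                + (1 - p) * math.log2((1 - p) / (n - 1)))
    return bits * selections_per_min
```

For a 4-target BCI at 100% accuracy and 10 selections per minute this yields 20 bits/min; dropping accuracy to 90% cuts the per-selection yield from 2 bits to roughly 1.37 bits, which is why affective modulation that improves recognition accuracy translates directly into higher transfer rates.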
Simulation of the dynamics of virtual characters' emotions and social relations
M. Ochs, N. Sabouret
DOI: 10.1109/ACII.2009.5349428
Abstract: One of the main challenges is to give life to believable virtual characters. Research shows that emotions and social relations, which are closely related, play a key role in determining the behavior of individuals. To improve the believability of virtual characters' behavior, we propose in this article a method to compute virtual characters' emotions based on attitudes, together with a model of their influence on the dynamics of social relations. Based on this work, a tool for simulating the evolution of the emotions and social relations of virtual characters has been implemented.
Citations: 6
Differentiated semantic analysis in lexical affect sensing
Alexander Osherenko, E. André
DOI: 10.1109/ACII.2009.5349560
Abstract: There has been considerable interest in the recognition of affect from written and spoken language. In this paper, we describe an approach to lexical affect sensing that performs a semantic analysis of texts using comprehensive grammatical information. The proposed approach differentiates among many affect classes. In addition, the paper reports the obtained results and discusses them.
Citations: 10
Musicology's dialogue with emotion studies: Analysing musical structure
M. Spitzer
DOI: 10.1109/ACII.2009.5349502
Abstract: This paper discusses the relationship of musicology with the field of emotion studies. In doing so, it proposes four methods of analysis that enable the mediation of musical emotion in the analysis of musical pieces.
Citations: 1
"I can feel it too!": Emergent empathic reactions between synthetic characters
Sérgio Hortas Rodrigues, S. Mascarenhas, João Dias, Ana Paiva
DOI: 10.1109/ACII.2009.5349570
Abstract: Empathy is often seen as the capacity to perceive, understand and experience others' emotions. This concept has been incorporated into virtual agents to achieve better believability, social interaction and user engagement; however, this has mostly been done to establish empathic relations with the users. In this article, we focus instead on empathy between synthetic characters and propose an analytical approach consisting of a generic computational model of empathy, supported by recent neuropsychological studies. The proposed model of empathy was implemented in an affective agent architecture. To evaluate the implementation, we defined a small scenario and asked one group of users to view it with the empathy model and another group to view it without the model. The results confirmed that our model was capable of producing significant effects on the perception of the emergent empathic responses.
Citations: 50
How about laughter? Perceived naturalness of two laughing humanoid robots
C. Becker-Asano, T. Kanda, C. Ishi, H. Ishiguro
DOI: 10.1109/ACII.2009.5349371
Abstract: As humanoid robots will have to behave in socially adequate ways in a future society, we have started to investigate laughter as an important para-verbal signal that readily influences relationships among humans. As a first step, we investigate how humanoid robots might laugh within a situation suitable for laughter. Given the variety of human laughter, do people prefer a certain style for a robot's laughter? And if so, how, if at all, does a robot's outer appearance affect this preference? Accordingly, we combined six recordings of female laughter with body movements for two different humanoid robots, with the aim of evaluating their perceived naturalness using two types of video-based surveys. We found not only that people indeed prefer one type of laughter when forced to choose; the results also suggest significant differences in the perceived naturalness of laughter with regard to the participants' cultural background. The outer appearance seems to change the perceived naturalness of a humanoid robot's laughter only at a global level. It is evident, however, that further research on this rather unexplored topic is needed, as it promises to provide valuable means of supporting the development of social robots.
Citations: 16
OpenEAR — Introducing the Munich open-source emotion and affect recognition toolkit
F. Eyben, M. Wöllmer, Björn Schuller
DOI: 10.1109/ACII.2009.5349350
Abstract: Various open-source toolkits exist for speech recognition and speech processing. These toolkits have brought great benefit to the research community by speeding up research. Yet no such freely available toolkit exists for automatic affect recognition from speech. We introduce a novel open-source affect and emotion recognition engine that integrates all necessary components in one highly efficient software package. The components include audio recording and audio file reading, state-of-the-art paralinguistic feature extraction, and pluggable classification modules. In this paper we introduce the engine and extensive baseline results. Pre-trained models for four affect recognition tasks are included in the openEAR distribution. The engine is tailored for multi-threaded, incremental online processing of live input in real time, but it can also be used for batch processing of databases.
Citations: 421
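The "pluggable classification modules" design the abstract describes can be illustrated with a minimal interface sketch. This is an assumption-laden illustration, not openEAR's actual API (openEAR is a C++ toolkit); the class names, the toy nearest-centroid model, and the crude log-energy feature extractor below are all hypothetical stand-ins:

```python
import math
from abc import ABC, abstractmethod

class Classifier(ABC):
    """Pluggable back-end: any model exposing train/predict can be swapped in."""
    @abstractmethod
    def train(self, features, labels): ...
    @abstractmethod
    def predict(self, features): ...

class NearestCentroid(Classifier):
    """Toy stand-in model: classify by nearest per-class mean feature vector."""
    def train(self, features, labels):
        sums, counts = {}, {}
        for f, y in zip(features, labels):
            acc = sums.setdefault(y, [0.0] * len(f))
            for i, v in enumerate(f):
                acc[i] += v
            counts[y] = counts.get(y, 0) + 1
        self.centroids = {y: [v / counts[y] for v in acc]
                          for y, acc in sums.items()}
    def predict(self, features):
        return [min(self.centroids,
                    key=lambda y: math.dist(f, self.centroids[y]))
                for f in features]

def frame_energy_features(samples, frame=160):
    """Crude per-frame log-energy features (illustrative only; openEAR's
    real feature set is far richer, e.g. MFCCs, pitch, voice quality)."""
    feats = []
    for s in range(0, len(samples) - frame + 1, frame):
        energy = sum(x * x for x in samples[s:s + frame]) / frame
        feats.append([math.log(energy + 1e-10)])
    return feats
```

The point of the abstract base class is the design choice the abstract highlights: feature extraction and classification are decoupled, so a different model can be plugged in without touching the audio pipeline.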
The action synergies: Building blocks for understanding human behavior
Yi Li, Y. Aloimonos
DOI: 10.1109/ACII.2009.5349506
Abstract: Social signal processing is an emerging field that is gaining increasing attention. As a key element in the field, visual perception of human motion is important for understanding human behavior in social intelligence. Motivated by the hypothesis of muscle synergies, we propose action synergies for automatically partitioning human motion in videos into individual action segments. Assuming the size of the human subject is reasonable and the background changes smoothly, the video sequence is represented by six latent variables, which we obtain using Gaussian Process Dynamical Models (GPDM). For each variable, the third-order derivative and its local maxima are computed. Then, by finding the local maxima that are consistent across all variables, the video is partitioned into action segments. We demonstrate the usefulness of the algorithm for periodic as well as non-periodic motion patterns, using videos of various qualities. Results show that the proposed algorithm partitions videos into meaningful action segments.
Citations: 1
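The segmentation step described in the abstract (third-order derivatives of the latent trajectories, then local maxima consistent across variables) can be sketched as follows. This is a minimal sketch, not the authors' implementation: the GPDM embedding is assumed to be given, third-order differences stand in for derivatives, and the `window` and `vote_ratio` parameters are illustrative additions for tolerating frame jitter:

```python
import numpy as np

def _local_maxima(col):
    # Strict rise then non-increase; strictness avoids flagging flat zero runs.
    return {t for t in range(1, len(col) - 1)
            if col[t] > col[t - 1] and col[t] >= col[t + 1]}

def segment_boundaries(latent, window=2, vote_ratio=0.5):
    """Candidate action-boundary frames from latent trajectories.

    latent: (T, D) array of per-frame latent coordinates (the paper uses
    six GPDM dimensions; any smooth embedding works for this sketch).
    A frame is a boundary when at least vote_ratio of the dimensions have
    a local maximum of the |third-order difference| within `window` frames.
    Indices refer to the differenced sequence, offset ~3 frames from input.
    """
    latent = np.asarray(latent, dtype=float)
    jerk = np.abs(np.diff(latent, n=3, axis=0))          # shape (T-3, D)
    maxima = [_local_maxima(jerk[:, d]) for d in range(jerk.shape[1])]
    need = vote_ratio * jerk.shape[1]
    boundaries = []
    for t in range(jerk.shape[0]):
        votes = sum(any(abs(t - m) <= window for m in mx) for mx in maxima)
        if votes >= need:
            boundaries.append(t)
    return boundaries
```

On a piecewise-linear trajectory the third-order difference is zero everywhere except around a slope change, so the returned boundaries cluster at the change point; consecutive boundary indices would be merged into a single cut in practice.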
Relations between facial display, eye gaze and head tilt: Dominance perception variations of virtual agents
Nikolaus Bee, Stefan Franke, E. André
DOI: 10.1109/ACII.2009.5349573
Abstract: In this paper, we focus on facial displays, eye gaze and head tilts as expressions of social dominance. In particular, we are interested in the interaction of different non-verbal cues. We present a study that systematically varies eye gaze and head tilt for five basic emotions and a neutral state, using our own graphics and animation engine. The resulting images were presented via a Web-based interface to a large number of subjects, who were asked to attribute dominance values to the character shown in the images. First, we analyze how dominance ratings are influenced by the conveyed emotional facial expression. Further, we investigate how gaze direction and head pose influence dominance perception depending on the displayed emotional state.
Citations: 36
Game adaptivity impact on affective physical interaction
Georgios N. Yannakakis
DOI: 10.1109/ACII.2009.5349384
Abstract: Adaptive human-computer interaction is necessary for successfully closing the affective loop within intelligent interactive systems. This paper investigates the impact of adaptivity on the physiological state and the expressed emotional preferences of users. A physical interactive game is used as a test-bed system, and its real-time adaptation mechanism is evaluated in a survey experiment. Results reveal that the entertainment preferences expressed are consistent with the constructed affective model, and that adaptation generates dissimilar physiological responses with respect to preferences.
Citations: 10