2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops: Latest Publications

You make me happy: Using an adaptive affective interface to investigate the effect of social presence on positive emotion induction
S. Shahid, E. Krahmer, M. Swerts, Willem A. Melder, Mark Antonius Neerincx
DOI: 10.1109/ACII.2009.5349355
Abstract: Affective user interfaces are usually characterized as interfaces that try to recognize, interpret and respond to human emotions. In this paper, we take a different perspective and investigate if and how a digital, interactive adaptive mirror, which is a game-like affective interface, can induce positive emotions in participants, and how the social presence of a friend affects the emotion induction. Results show that participants systematically feel more positive after an affective mirror session, and the co-presence of a friend is shown to boost this effect.
Citations: 14

Rapport and facial expression
Ning Wang, J. Gratch
DOI: 10.1109/ACII.2009.5349514
Abstract: How can we build virtual agents that establish rapport with humans? According to Tickle-Degnen and Rosenthal [4], the three essential components of rapport are mutual attentiveness, positivity and coordination. In our previous work, we designed an embodied virtual agent to establish rapport with a human speaker by providing rapid and contingent nonverbal feedback [13][22]. How do we know that a human speaker is feeling a sense of rapport? In this paper, we focus on the positivity component of rapport by investigating the relationship between human speakers' facial expressions and the establishment of rapport. We used an automatic facial expression coding tool called CERT to analyze the human dyad interactions and human-virtual human interactions. Results show that recognizing positive facial displays alone may be insufficient and that recognized negative facial displays were more diagnostic in assessing the level of rapport between participants.
Citations: 32

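The finding above, that recognized negative facial displays were more diagnostic of rapport, can be illustrated with a minimal sketch of the kind of analysis involved. This is not the authors' code: the per-frame score format, the 0.5 threshold, and the toy data are assumptions, and a Pearson correlation stands in for whatever statistics the paper actually used.

```python
# Hypothetical sketch: relate the frequency of negative facial displays
# (derived from per-frame expression scores) to self-reported rapport ratings.
import numpy as np
from scipy.stats import pearsonr

def negative_display_rate(frame_scores, threshold=0.5):
    """Fraction of frames whose negative-expression score exceeds a threshold.

    frame_scores: 1-D array of per-frame scores for a negative expression
    channel; the format and the threshold are assumptions for illustration.
    """
    frame_scores = np.asarray(frame_scores)
    return float(np.mean(frame_scores > threshold))

# Toy data: (per-frame negative scores, rapport rating) per participant.
participants = {
    "p01": ([0.1, 0.7, 0.6, 0.2], 3.0),
    "p02": ([0.0, 0.1, 0.2, 0.1], 6.5),
    "p03": ([0.8, 0.9, 0.4, 0.7], 2.0),
    "p04": ([0.2, 0.3, 0.1, 0.0], 5.5),
}

rates = [negative_display_rate(scores) for scores, _ in participants.values()]
rapport = [rating for _, rating in participants.values()]
r, p = pearsonr(rates, rapport)
print(f"negative-display rate vs rapport: r = {r:.2f}, p = {p:.3f}")
```
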
Emotion and music: A view from the cultural psychology of music
Nicola Dibben
DOI: 10.1109/ACII.2009.5349474
Abstract: This paper provides an overview of current thinking in music cognition regarding the perception of emotion in music. A componential view of emotion is adopted, and a variety of routes by which music expresses emotion are presented. Two main questions for future research are identified: first, the extent to which the perception and induction of emotion through music is shared cross-culturally, and second, the identification of the factors that contribute to the cross-cultural perception of emotion in music. By drawing upon a biologically and ecologically informed perspective, this paper aims to identify routes for future research that would enable music cognition research to shed light on the socio-historical variability of emotion perception through music.
Citations: 3

Transmission of vocal emotion: Do we have to care about the listener? The case of the Italian speech corpus EMOVO
C. Giovannella, Davide Conflitti, R. Santoboni, A. Paoloni
DOI: 10.1109/ACII.2009.5349564
Abstract: The evaluation of emotionally colored nonsense sentences contained in the Italian vocal database EMOVO has been performed by means of a new testing tool based on Plutchik's finite-state model of emotions. The validation of the corpus also takes into account the listeners' ability to recognize a given emotion. This detailed analysis allowed us to identify the unreliable listeners and to perform a more accurate assessment of the vocal database and of the speakers.
Citations: 11

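A minimal sketch of the listener-screening idea mentioned above: compute each listener's recognition accuracy over the labelled stimuli and flag those below a cutoff as unreliable. The data layout, the 0.5 cutoff, and the toy responses are assumptions, not details from the paper.

```python
# Hypothetical sketch: flag unreliable listeners by per-listener accuracy.
from collections import defaultdict

# Each response: (listener_id, intended_emotion, perceived_emotion); toy data.
responses = [
    ("l01", "anger", "anger"), ("l01", "joy", "joy"),  ("l01", "fear", "sadness"),
    ("l02", "anger", "joy"),   ("l02", "joy", "fear"), ("l02", "fear", "joy"),
    ("l03", "anger", "anger"), ("l03", "joy", "joy"),  ("l03", "fear", "fear"),
]

def listener_accuracy(responses):
    """Return {listener_id: proportion of stimuli whose intended emotion was recognized}."""
    hits, totals = defaultdict(int), defaultdict(int)
    for listener, intended, perceived in responses:
        totals[listener] += 1
        hits[listener] += int(intended == perceived)
    return {listener: hits[listener] / totals[listener] for listener in totals}

ACCURACY_CUTOFF = 0.5  # assumed threshold, not taken from the paper
accuracy = listener_accuracy(responses)
unreliable = [listener for listener, acc in accuracy.items() if acc < ACCURACY_CUTOFF]
print("per-listener accuracy:", accuracy)
print("flagged as unreliable:", unreliable)
```
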
The importance of the body in affect-modulated action selection: A case study comparing proximal versus distal perception in a prey-predator scenario
C. O'Bryne, L. Cañamero, J. Murray
DOI: 10.1109/ACII.2009.5349596
Abstract: In the context of the animat approach, we investigate the effect of an emotion-like hormonal mechanism, acting as a modulator of perception and as a second-order controller on top of an underlying motivation-based action selection architecture, on brain-body-environment interactions within a prey-predator scenario. We are particularly interested in the effects that affective modulation of different perceptual capabilities has on the dynamics of interactions between predator and prey, as part of a broader study of the adaptive value of emotional states such as "fear" and "aggression" in the context of action selection. In this paper we present experiments in which we modulated the architecture of a prey robot using two different types of sensory capabilities, proximal and distal, effectively creating combinations of different prey "brains" and "bodies".
Citations: 4

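As a rough, generic illustration of affect-like hormonal modulation of perception (not the authors' architecture), the sketch below lets a "fear" hormone rise when a predator is sensed, decay over time, and scale the prey's effective sensor range; all function names and parameters are invented for illustration.

```python
# Hypothetical sketch of hormone-modulated perception in a simple prey controller.

def update_hormone(hormone, predator_signal, release_gain=0.8, decay=0.1):
    """One time step of hormone dynamics: stimulus-driven release plus decay."""
    hormone += release_gain * predator_signal
    hormone *= (1.0 - decay)
    return min(max(hormone, 0.0), 1.0)  # keep the level in [0, 1]

def modulated_sensor_range(base_range, hormone, max_boost=2.0):
    """Higher 'fear' widens the effective perceptual range (assumed relationship)."""
    return base_range * (1.0 + (max_boost - 1.0) * hormone)

hormone = 0.0
for step, predator_signal in enumerate([0.0, 0.0, 1.0, 0.5, 0.0, 0.0]):
    hormone = update_hormone(hormone, predator_signal)
    print(step, round(hormone, 2), round(modulated_sensor_range(1.0, hormone), 2))
```
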
Creating emotional communication with interactive artwork
M. Iacobini, T. Gonsalves, N. Bianchi-Berthouze, C. Frith
DOI: 10.1109/ACII.2009.5349546
Abstract: This pilot study contributes to the building of an art installation that aims to establish an emotional communication loop with the audience. It does so by reacting to or mimicking the audience's changes in emotional expression according to emotional contagion dynamics. The study aims to inform the project by gaining a better understanding of emotional contagion patterns and the factors that may affect how people emotionally engage in this art context. The analysis of our early experiments shows reflex mechanisms of facial expression mimicry and counter-mimicry that follow patterns similar to those reported in the psychology literature. In fact, automatic mimicry and counter-mimicry correlated to some extent with whether or not the audience felt they were interacting with a real person. Furthermore, the results indicate that individual differences play a role in how people emotionally engage with this type of artwork. However, irrespective of these differences, the interaction led the audience to introspect and reflect on emotions.
Citations: 5

Understanding affective interaction: Emotion, engagement, and internet videos
Shaowen Bardzell, Jeffrey Bardzell, Tyler M. Pace
DOI: 10.1109/ACII.2009.5349551
Abstract: As interest in experience and affect in HCI continues to grow, particularly with regard to social media and Web 2.0 technologies, research on techniques for evaluating user engagement is needed. This paper presents a study of popular Internet videos involving a mixed-method approach to user engagement. Instruments included physiological measures, emotional self-report measures, and personally expressive techniques such as open-ended prose reviews. Using triangulation to interpret the results, we express relationships among perceived emotion, experienced emotion, video preference, and contextual factors.
Citations: 36

Gesture and emotion: Can basic gestural form features discriminate emotions?
Michael Kipp, Jean-Claude Martin
DOI: 10.1109/ACII.2009.5349544
Abstract: The question of how exactly gesture and emotion are interrelated is still sparsely covered in research, yet highly relevant for building affective artificial agents. In our study, we investigate how basic gestural form features (handedness, hand shape, palm orientation and motion direction) are related to components of emotion. We argue that material produced by actors in filmed theater stagings is particularly well suited for such analyses. Our results indicate that there may be a universal association of gesture handedness with the emotional dimensions of pleasure and arousal. We discuss this and more specific findings, and conclude with possible implications and applications of our study.
Citations: 122

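One simple way to probe an association between a binary form feature such as handedness and continuous pleasure/arousal annotations is a point-biserial correlation; the sketch below uses toy data and is not the statistical procedure reported in the paper.

```python
# Hypothetical sketch: test whether handedness (one- vs two-handed gestures)
# is associated with annotated pleasure and arousal values.
from scipy.stats import pointbiserialr

# Toy annotations: 1 = two-handed, 0 = one-handed; pleasure/arousal in [-1, 1].
handedness = [1, 0, 1, 1, 0, 0, 1, 0]
pleasure   = [0.6, -0.2, 0.4, 0.7, -0.5, -0.1, 0.3, 0.0]
arousal    = [0.8, 0.1, 0.6, 0.9, -0.3, 0.2, 0.7, -0.1]

for name, values in [("pleasure", pleasure), ("arousal", arousal)]:
    r, p = pointbiserialr(handedness, values)
    print(f"handedness vs {name}: r = {r:.2f}, p = {p:.3f}")
```
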
The emotion slider: A self-report device for the continuous measurement of emotion
Gaël Laurans, P. Desmet, P. Hekkert
DOI: 10.1109/ACII.2009.5349539
Abstract: The emotion slider is a device developed to collect self-reports of the valence of users' experiences with interactive systems, based on recent theories on the embodiment of emotion and basic approach/avoidance behavioral tendencies. To test it, participants (N = 51) watched 10 positive and 10 negative slides from the International Affective Picture System while using the emotion slider in two different ways: pushing the handle to report positive feelings and pulling it for negative feelings in one condition (incongruent condition), and pushing the handle to report negative feelings and pulling it for positive feelings in the other (congruent condition). Response times were significantly different between the two usage conditions, but the direction of this difference did not conform to the prediction. Shorter response times were associated with fewer errors. The conclusion describes some implications for human-computer interaction research.
Citations: 32

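The response-time comparison between the congruent and incongruent mappings could, for example, be run as a two-sample t-test; the sketch below uses invented per-participant means and Welch's test, which may differ from the analysis actually reported.

```python
# Hypothetical sketch: compare response times between mapping conditions.
from scipy.stats import ttest_ind

# Toy per-participant mean response times in milliseconds (assumed data).
rt_congruent   = [612, 655, 701, 598, 640, 670, 625]
rt_incongruent = [690, 720, 705, 660, 745, 698, 712]

t, p = ttest_ind(rt_congruent, rt_incongruent, equal_var=False)  # Welch's t-test
print(f"congruent vs incongruent RT: t = {t:.2f}, p = {p:.3f}")
```
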
Demo: Recording emotions with “MyInnerLife”
Elisabeth Eichhorn, Reto Wettach, Boris Müller
DOI: 10.1109/ACII.2009.5349520
Abstract: Our project is a system for expressing emotions and recording them on a long-term basis. In contrast to much of the research in the field of affective computing, our project is not dedicated to enabling machines to detect human emotions but to providing new input methods. This demo presents 'MyInnerLife', a physical input device to express and record emotions non-verbally.
Citations: 0