2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops: Latest Publications

Perception of synthetic emotion expressions in speech: Categorical and dimensional annotations
J. Kessens, Mark Antonius Neerincx, R. Looije, M. Kroes, G. Bloothooft
{"title":"Perception of synthetic emotion expressions in speech: Categorical and dimensional annotations","authors":"J. Kessens, Mark Antonius Neerincx, R. Looije, M. Kroes, G. Bloothooft","doi":"10.1109/ACII.2009.5349594","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349594","url":null,"abstract":"In this paper, both categorical and dimensional annotations have been made of neutral and emotional speech synthesis (anger, fear, sad, happy and relaxed). With various prosodic emotion manipulation techniques we found emotion classification rates of 40%, which is significantly above chance level (17%). The classification rates are higher for sentences that have a semantics matching the synthetic emotion. By manipulating the pitch and duration, differences in arousal were perceived whereas differences in valence were hardly perceived. Of the investigated emotion manipulation methods, EmoFilt and EmoSpeak performed very similar, except for the emotion fear. Copy synthesis did not perform well, probably caused by suboptimal alignments and the use of multiple speakers.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123679981","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 3
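The 17% chance level quoted by Kessens et al. follows from guessing uniformly among the six categories (neutral plus five emotions), i.e. 1/6 ≈ 16.7%. Below is a minimal sketch of how one could check that an observed 40% rate is significantly above that chance level with a one-sided binomial test; the trial counts are hypothetical placeholders, not figures from the paper.

```python
from scipy.stats import binomtest

n_classes = 6            # neutral + anger, fear, sadness, happiness, relaxation
chance = 1 / n_classes   # ~0.167, the 17% chance level quoted in the abstract

# Hypothetical trial counts -- the paper does not report these here.
n_trials = 300
n_correct = 120          # 40% observed classification rate

result = binomtest(n_correct, n_trials, p=chance, alternative="greater")
print(f"chance = {chance:.1%}, observed = {n_correct / n_trials:.1%}, "
      f"p = {result.pvalue:.2e}")
```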
Cross-modal elicitation of affective experience
C. Mühl, D. Heylen
{"title":"Cross-modal elicitation of affective experience","authors":"C. Mühl, D. Heylen","doi":"10.1109/ACII.2009.5349455","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349455","url":null,"abstract":"In the field of Affective Computing the affective experience (AX) of the user during the interaction with computers is of great interest. Physiological and neurophysiological sensors assess the state of the peripheral and central nervous system. Their analysis can provide information about the state of a user. We introduce an approach to elicit emotions by audiovisual stimuli for the exploration of (neuro-)physiological correlates of affective experience. Thereby we are able to control for the affect-eliciting modality, enabling the study of general and modality-specific correlates of affective responses. We present evidence from self-reports, physiological, and neu-rophysiological data for the successful induction of the affective experiences aimed for, and thus for the validity of the elicitation approach.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122660240","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 11
Does the mood matter?
Irene Lopatovska
{"title":"Does the mood matter?","authors":"Irene Lopatovska","doi":"10.1109/ACII.2009.5349588","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349588","url":null,"abstract":"We report the results of the experiment that examined effects of mood on search performance. Participants were asked to use Google search engine to find answers to two questions. Searchers' mood was measured using the Positive Affect and Negative Affect Scale (PANAS). Search performance was measured by number of websites visited, time spent reading search results, quality of answers and other similar measures. Analysis of relationship between the mood and search performance indicated that positive mood prior to the search affected certain search behaviors, but neither positive nor negative moods had significant effect on the quality of search results.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"35 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128575841","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 14
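Lopatovska's mood measure, the PANAS, yields two subscale scores. Here is a minimal scoring sketch, assuming the standard 20-item form with 1-5 ratings (the administration details in the study itself may differ): ten items load on Positive Affect and ten on Negative Affect, so each subscale ranges from 10 to 50.

```python
# Items of the two PANAS subscales (Watson, Clark & Tellegen, 1988).
PA_ITEMS = ["interested", "excited", "strong", "enthusiastic", "proud",
            "alert", "inspired", "determined", "attentive", "active"]
NA_ITEMS = ["distressed", "upset", "guilty", "scared", "hostile",
            "irritable", "ashamed", "nervous", "jittery", "afraid"]

def panas_scores(responses: dict[str, int]) -> tuple[int, int]:
    """Sum the 1-5 ratings for the positive and negative subscales."""
    pa = sum(responses[item] for item in PA_ITEMS)
    na = sum(responses[item] for item in NA_ITEMS)
    return pa, na

# Example: a respondent rating every item 3 scores 30 on both subscales.
pa, na = panas_scores({item: 3 for item in PA_ITEMS + NA_ITEMS})
```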
Affect detection in the real world: Recording and processing physiological signals
J. Healey
{"title":"Affect detection in the real world: Recording and processing physiological signals","authors":"J. Healey","doi":"10.1109/ACII.2009.5349496","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349496","url":null,"abstract":"Recording and processing physiological signals from real life for the purpose of affect detection presents many challenges beyond those encountered in the laboratory. Issues such as finding the proper baseline and normalization take on a time dependent meaning. Physical motion also becomes an important factor as these physiological signals often overwhelm those caused by affect. Motion also has an effect on the sensors themselves and precautions must be taken to minimize noise due to changes in placement and loss of connectivity. Ground truth collection is also discussed so that sudden events such as unexpected sounds, bumping into someone in the hallway or having a sneeze are not confused with traumatic affect. In particular, this paper focuses on these issues with respect to recording and processing: galvanic skin response; blood volume pulse; electrocardiogram; electromyogram; respiration and accelerometer signals.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129136056","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 28
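Healey's point that baseline and normalization "take on a time-dependent meaning" suggests normalizing against a local rather than a session-wide baseline. A minimal sketch of one such approach follows: a sliding-window z-score over a skin-conductance trace. The window length and the pandas-based implementation are illustrative choices, not the paper's method.

```python
import numpy as np
import pandas as pd

def sliding_zscore(signal: np.ndarray, fs: float,
                   window_s: float = 60.0) -> np.ndarray:
    """Z-score each sample against a trailing window, so slow drifts
    (sensor placement, temperature, time of day) are factored out."""
    s = pd.Series(signal)
    win = int(window_s * fs)
    mu = s.rolling(win, min_periods=win // 2).mean()
    sd = s.rolling(win, min_periods=win // 2).std()
    return ((s - mu) / sd).to_numpy()

# Example: 10 minutes of synthetic skin-conductance data at 32 Hz,
# with a slow upward drift plus noise.
fs = 32.0
t = np.arange(0, 600, 1 / fs)
gsr = 5 + 0.002 * t + 0.1 * np.random.randn(t.size)
normalized = sliding_zscore(gsr, fs)
```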
EEG analysis for implicit tagging of video data
Sander Koelstra, C. Mühl, I. Patras
{"title":"EEG analysis for implicit tagging of video data","authors":"Sander Koelstra, C. Mühl, I. Patras","doi":"10.1109/ACII.2009.5349482","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349482","url":null,"abstract":"In this work, we aim to find neuro-physiological indicators to validate tags attached to video content. Subjects are shown a video and a tag and we aim to determine whether the shown tag was congruent with the presented video by detecting the occurrence of an N400 event-related potential. Tag validation could be used in conjunction with a vision-based recognition system as a feedback mechanism to improve the classification accuracy for multimedia indexing and retrieval. An advantage of using the EEG modality for tag validation is that it is a way of performing implicit tagging. This means it can be performed while the user is passively watching the video. Independent component analysis and repeated measures ANOVA are used for analysis. Our experimental results show a clear occurrence of the N400 and a significant difference in N400 activation between matching and non-matching tags.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128016449","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 53
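A minimal sketch of the epoch-analysis step behind an N400 comparison like the one Koelstra et al. describe: compute each subject's mean amplitude in a 300-500 ms post-stimulus window and compare matching against non-matching tags. A paired t-test stands in here for the repeated-measures ANOVA used in the paper; the array shapes, sampling rate, and placeholder data are assumptions.

```python
import numpy as np
from scipy.stats import ttest_rel

fs = 256  # Hz, assumed sampling rate; 1-second epochs time-locked to tag onset

def n400_mean_amplitude(erp: np.ndarray, fs: int,
                        t_start: float = 0.3, t_end: float = 0.5) -> np.ndarray:
    """Mean amplitude in the 300-500 ms window where the N400 peaks.
    `erp` has shape (n_subjects, n_samples), one averaged ERP per subject."""
    i0, i1 = int(t_start * fs), int(t_end * fs)
    return erp[:, i0:i1].mean(axis=1)

# Placeholder per-subject ERPs at a centro-parietal channel, one row per
# subject; mismatching tags made slightly more negative to mimic an N400.
rng = np.random.default_rng(0)
match = rng.normal(0.0, 1.0, (20, fs))
mismatch = rng.normal(-0.5, 1.0, (20, fs))

t_stat, p_val = ttest_rel(n400_mean_amplitude(mismatch, fs),
                          n400_mean_amplitude(match, fs))
print(f"t = {t_stat:.2f}, p = {p_val:.3f}")
```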
Personality differences in the multimodal perception and expression of cultural attitudes and emotions
C. Clavel, A. Rilliard, Takaaki Shochi, Jean-Claude Martin
{"title":"Personality differences in the multimodal perception and expression of cultural attitudes and emotions","authors":"C. Clavel, A. Rilliard, Takaaki Shochi, Jean-Claude Martin","doi":"10.1109/ACII.2009.5349504","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349504","url":null,"abstract":"Individual differences have been reported in the literature on nonverbal communication. Recent development in the collection and evaluation of audiovisual databases of social behaviors brings new insight on these matters by exploring other types of social behaviors and other approaches to individual differences. This paper summarizes two experimental studies about personality differences in the audiovisual perception and expression of social affects. We conclude on the potential of such audiovisual database and experimental approaches for the design of personalized affective computing systems.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121597161","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 4
Social signals and the action — Cognition loop. The case of overhelp and evaluation
I. Poggi, Francesca D’Errico
{"title":"Social signals and the action — Cognition loop. The case of overhelp and evaluation","authors":"I. Poggi, Francesca D’Errico","doi":"10.1109/ACII.2009.5349468","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349468","url":null,"abstract":"The paper explores the action — cognition loop by investigating the relation between overhelp and evaluation. It presents a study on the helping and overhelping behaviors of teachers with students of their own vs. of a stigmatized culture, and analyses them in terms of a taxonomy of helping behavior, and adopting an annotation scheme to assess the multimodal behavior of teachers and pupils. Results show that overhelping teachers induce more negative evaluations, more often concerning general capacities, and frequently expressed indirectly. This seems to show that the overhelp offered blocks a child's striving for autonomy since it generates a negative evaluation, in particular the belief of an inability of the receiver.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133603373","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 6
Measurement of motion and emotion during musical performance
R. Benjamin Knapp, J. Jaimovich, N. Coghlan
{"title":"Measurement of motion and emotion during musical performance","authors":"R. Benjamin Knapp, J. Jaimovich, N. Coghlan","doi":"10.1109/ACII.2009.5349469","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349469","url":null,"abstract":"This paper describes the use of physiological and kinematic sensors for the direct measurement of physical gesture and emotional changes in live musical performance. Initial studies on the measurement of performer and audience emotional state in controlled environments serve as the foundation for three pieces using the BioMuse system in live performance. By using both motion and emotion to control sound generation, the concept of integral music control has been achieved.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127873374","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 24
Which ostensive stimuli can be used for a robot to detect and maintain tutoring situations?
K. Lohan, Anna-Lisa Vollmer, J. Fritsch, K. Rohlfing, B. Wrede
{"title":"Which ostensive stimuli can be used for a robot to detect and maintain tutoring situations?","authors":"K. Lohan, Anna-Lisa Vollmer, J. Fritsch, K. Rohlfing, B. Wrede","doi":"10.1109/ACII.2009.5349507","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349507","url":null,"abstract":"In developmental research, tutoring behavior has been identified as scaffolding infants' learning processes. Infants seem sensitive to tutoring situations and they detect these by ostensive cues [4]. Some social signals such as eye-gaze, child-directed speech (Motherese), child-directed motion (Motionese), and contingency have been shown to serve as ostensive cues. The concept of contingency describes exchanges in which two agents interact with each other reciprocally. Csibra and Gergely argued that contingency is a characteristic ostensive stimulus of a tutoring situation [4]. In order for a robot to be treated similar to an infant, it has to both, be sensitive to the ostensive stimuli on the one hand and induce tutoring behavior by its feedback about its capabilities on the other hand. In this paper, we raise the question whether a robot can be treated similar to an infant in an interaction. We present results concerning the acceptance of a robotic agent in a social learning scenario, which we obtained via comparison to interactions with 8–11 months old infants and adults in equal conditions. We applied measurements for motion modifications (Motionese) and eye-gaze behavior. Our results reveal significant differences between Adult-Child Interaction (ACI), Adult-Adult Interaction (AAI) and Adult-Robot Interaction (ARI) suggesting that in ARI, robot-directed tutoring behavior is even more accentuated in terms of Motionese, but contingent responsivity is impaired. Our results confirm previous findings [14] concerning the differences between ACI, AAI, and ARI and constitute an important empirical basis for making use of ostensive stimuli as social signals for tutoring behavior in social robotics.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"79 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131410921","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 11
PAD-based multimodal affective fusion
Stephen W. Gilroy, M. Cavazza, Markus Niiranen, E. André, Thurid Vogt, J. Urbain, M. Benayoun, H. Seichter, M. Billinghurst
{"title":"PAD-based multimodal affective fusion","authors":"Stephen W. Gilroy, M. Cavazza, Markus Niiranen, E. André, Thurid Vogt, J. Urbain, M. Benayoun, H. Seichter, M. Billinghurst","doi":"10.1109/ACII.2009.5349552","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349552","url":null,"abstract":"The study of multimodality is comparatively less developed for affective interfaces than for their traditional counterparts. However, one condition for the successful development of affective interface technologies is the development of frameworks for the real-time multimodal fusion. In this paper, we describe an approach to multimodal affective fusion, which relies on a dimensional model, Pleasure-Arousal-Dominance (PAD) to support the fusion of affective modalities, each input modality being represented as a PAD vector. We describe how this model supports both affective content fusion and temporal fusion within a unified approach. We report results from early user studies which confirm the existence of a correlation between measured affective input and user temperament scores.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"410 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131660494","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 29
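A minimal sketch of the kind of fusion the PAD model enables in Gilroy et al.'s framework: each modality contributes a Pleasure-Arousal-Dominance vector, so fusion reduces to combining vectors in a common three-dimensional space. The confidence-weighted average and exponential time decay below are illustrative assumptions standing in for the paper's content and temporal fusion, not its exact scheme.

```python
import math
from dataclasses import dataclass

@dataclass
class PADSample:
    pleasure: float    # each dimension in [-1, 1]
    arousal: float
    dominance: float
    confidence: float  # recognizer confidence in [0, 1] (assumed)
    timestamp: float   # seconds

def fuse_pad(samples: list[PADSample], now: float,
             half_life: float = 5.0) -> tuple[float, float, float]:
    """Confidence-weighted average of modality PAD vectors, with older
    samples discounted exponentially (a simple form of temporal fusion)."""
    num = [0.0, 0.0, 0.0]
    den = 0.0
    for s in samples:
        w = s.confidence * math.exp(-(now - s.timestamp) * math.log(2) / half_life)
        num[0] += w * s.pleasure
        num[1] += w * s.arousal
        num[2] += w * s.dominance
        den += w
    return tuple(v / den for v in num) if den else (0.0, 0.0, 0.0)

# Example: speech prosody reports an aroused, mildly negative state while
# facial expression reports a calmer, mildly positive one.
fused = fuse_pad([PADSample(-0.4, 0.7, 0.1, 0.8, 10.0),
                  PADSample(0.2, 0.3, 0.0, 0.6, 11.5)], now=12.0)
```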