"Perception of synthetic emotion expressions in speech: Categorical and dimensional annotations"
J. Kessens, Mark Antonius Neerincx, R. Looije, M. Kroes, G. Bloothooft
DOI: 10.1109/ACII.2009.5349594
Abstract: In this paper, both categorical and dimensional annotations were made of neutral and emotional speech synthesis (anger, fear, sadness, happiness, and relaxation). With various prosodic emotion manipulation techniques we found emotion classification rates of 40%, which is significantly above the chance level of 17%. Classification rates are higher for sentences whose semantics match the synthetic emotion. By manipulating pitch and duration, differences in arousal were perceived, whereas differences in valence were hardly perceived. Of the investigated emotion manipulation methods, EmoFilt and EmoSpeak performed very similarly, except for fear. Copy synthesis did not perform well, probably because of suboptimal alignments and the use of multiple speakers.

{"title":"Cross-modal elicitation of affective experience","authors":"C. Mühl, D. Heylen","doi":"10.1109/ACII.2009.5349455","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349455","url":null,"abstract":"In the field of Affective Computing the affective experience (AX) of the user during the interaction with computers is of great interest. Physiological and neurophysiological sensors assess the state of the peripheral and central nervous system. Their analysis can provide information about the state of a user. We introduce an approach to elicit emotions by audiovisual stimuli for the exploration of (neuro-)physiological correlates of affective experience. Thereby we are able to control for the affect-eliciting modality, enabling the study of general and modality-specific correlates of affective responses. We present evidence from self-reports, physiological, and neu-rophysiological data for the successful induction of the affective experiences aimed for, and thus for the validity of the elicitation approach.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122660240","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Does the mood matter?","authors":"Irene Lopatovska","doi":"10.1109/ACII.2009.5349588","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349588","url":null,"abstract":"We report the results of the experiment that examined effects of mood on search performance. Participants were asked to use Google search engine to find answers to two questions. Searchers' mood was measured using the Positive Affect and Negative Affect Scale (PANAS). Search performance was measured by number of websites visited, time spent reading search results, quality of answers and other similar measures. Analysis of relationship between the mood and search performance indicated that positive mood prior to the search affected certain search behaviors, but neither positive nor negative moods had significant effect on the quality of search results.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"35 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128575841","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Affect detection in the real world: Recording and processing physiological signals","authors":"J. Healey","doi":"10.1109/ACII.2009.5349496","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349496","url":null,"abstract":"Recording and processing physiological signals from real life for the purpose of affect detection presents many challenges beyond those encountered in the laboratory. Issues such as finding the proper baseline and normalization take on a time dependent meaning. Physical motion also becomes an important factor as these physiological signals often overwhelm those caused by affect. Motion also has an effect on the sensors themselves and precautions must be taken to minimize noise due to changes in placement and loss of connectivity. Ground truth collection is also discussed so that sudden events such as unexpected sounds, bumping into someone in the hallway or having a sneeze are not confused with traumatic affect. In particular, this paper focuses on these issues with respect to recording and processing: galvanic skin response; blood volume pulse; electrocardiogram; electromyogram; respiration and accelerometer signals.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129136056","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"EEG analysis for implicit tagging of video data","authors":"Sander Koelstra, C. Mühl, I. Patras","doi":"10.1109/ACII.2009.5349482","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349482","url":null,"abstract":"In this work, we aim to find neuro-physiological indicators to validate tags attached to video content. Subjects are shown a video and a tag and we aim to determine whether the shown tag was congruent with the presented video by detecting the occurrence of an N400 event-related potential. Tag validation could be used in conjunction with a vision-based recognition system as a feedback mechanism to improve the classification accuracy for multimedia indexing and retrieval. An advantage of using the EEG modality for tag validation is that it is a way of performing implicit tagging. This means it can be performed while the user is passively watching the video. Independent component analysis and repeated measures ANOVA are used for analysis. Our experimental results show a clear occurrence of the N400 and a significant difference in N400 activation between matching and non-matching tags.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128016449","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
"Personality differences in the multimodal perception and expression of cultural attitudes and emotions"
C. Clavel, A. Rilliard, Takaaki Shochi, Jean-Claude Martin
DOI: 10.1109/ACII.2009.5349504
Abstract: Individual differences have been reported in the literature on nonverbal communication. Recent developments in the collection and evaluation of audiovisual databases of social behaviors bring new insight into these matters by exploring other types of social behaviors and other approaches to individual differences. This paper summarizes two experimental studies of personality differences in the audiovisual perception and expression of social affects. We conclude with the potential of such audiovisual databases and experimental approaches for the design of personalized affective computing systems.

{"title":"Social signals and the action — Cognition loop. The case of overhelp and evaluation","authors":"I. Poggi, Francesca D’Errico","doi":"10.1109/ACII.2009.5349468","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349468","url":null,"abstract":"The paper explores the action — cognition loop by investigating the relation between overhelp and evaluation. It presents a study on the helping and overhelping behaviors of teachers with students of their own vs. of a stigmatized culture, and analyses them in terms of a taxonomy of helping behavior, and adopting an annotation scheme to assess the multimodal behavior of teachers and pupils. Results show that overhelping teachers induce more negative evaluations, more often concerning general capacities, and frequently expressed indirectly. This seems to show that the overhelp offered blocks a child's striving for autonomy since it generates a negative evaluation, in particular the belief of an inability of the receiver.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133603373","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Measurement of motion and emotion during musical performance","authors":"R. Benjamin Knapp, J. Jaimovich, N. Coghlan","doi":"10.1109/ACII.2009.5349469","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349469","url":null,"abstract":"This paper describes the use of physiological and kinematic sensors for the direct measurement of physical gesture and emotional changes in live musical performance. Initial studies on the measurement of performer and audience emotional state in controlled environments serve as the foundation for three pieces using the BioMuse system in live performance. By using both motion and emotion to control sound generation, the concept of integral music control has been achieved.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127873374","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
"Which ostensive stimuli can be used for a robot to detect and maintain tutoring situations?"
K. Lohan, Anna-Lisa Vollmer, J. Fritsch, K. Rohlfing, B. Wrede
DOI: 10.1109/ACII.2009.5349507
Abstract: In developmental research, tutoring behavior has been identified as scaffolding infants' learning processes. Infants seem sensitive to tutoring situations and detect them through ostensive cues [4]. Social signals such as eye gaze, child-directed speech (Motherese), child-directed motion (Motionese), and contingency have been shown to serve as ostensive cues. The concept of contingency describes exchanges in which two agents interact with each other reciprocally; Csibra and Gergely argued that contingency is a characteristic ostensive stimulus of a tutoring situation [4]. For a robot to be treated similarly to an infant, it has to both be sensitive to ostensive stimuli and induce tutoring behavior through feedback about its capabilities. In this paper, we ask whether a robot can be treated similarly to an infant in an interaction. We present results on the acceptance of a robotic agent in a social learning scenario, obtained by comparison with interactions with 8–11-month-old infants and with adults under equal conditions. We applied measurements of motion modification (Motionese) and eye-gaze behavior. Our results reveal significant differences between adult-child interaction (ACI), adult-adult interaction (AAI), and adult-robot interaction (ARI), suggesting that in ARI, robot-directed tutoring behavior is even more accentuated in terms of Motionese, but contingent responsivity is impaired. Our results confirm previous findings [14] concerning the differences between ACI, AAI, and ARI and constitute an important empirical basis for making use of ostensive stimuli as social signals for tutoring behavior in social robotics.

"PAD-based multimodal affective fusion"
Stephen W. Gilroy, M. Cavazza, Markus Niiranen, E. André, Thurid Vogt, J. Urbain, M. Benayoun, H. Seichter, M. Billinghurst
DOI: 10.1109/ACII.2009.5349552
Abstract: The study of multimodality is comparatively less developed for affective interfaces than for their traditional counterparts. However, one condition for the successful development of affective interface technologies is the availability of frameworks for real-time multimodal fusion. In this paper, we describe an approach to multimodal affective fusion that relies on a dimensional model, Pleasure-Arousal-Dominance (PAD), to support the fusion of affective modalities, with each input modality represented as a PAD vector. We describe how this model supports both affective content fusion and temporal fusion within a unified approach. We report results from early user studies that confirm a correlation between measured affective input and user temperament scores.