{"title":"The hinterland of emotions: Facing the open-microphone challenge","authors":"S. Steidl, A. Batliner, Björn Schuller, Dino Seppi","doi":"10.1109/ACII.2009.5349499","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349499","url":null,"abstract":"We first describe the challenge of addressing all non-prototypical varieties of emotional states signalled in speech in an open-microphone setting, i.e., using all recorded data. In the remainder of the article, we illustrate promising strategies on the FAU Aibo emotion corpus by showing different degrees of classification performance for different degrees of prototypicality, and by elaborating on the use of ROC curves, classification confidences, and correlation-based analyses.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115245615","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Bayesian framework for video affective representation","authors":"M. Soleymani, Joep J. M. Kierkels, G. Chanel, T. Pun","doi":"10.1109/ACII.2009.5349563","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349563","url":null,"abstract":"Emotions that are elicited in response to a video scene contain valuable information for multimedia tagging and indexing. The novelty of this paper is the introduction of a Bayesian classification framework for affective video tagging that allows contextual information to be taken into account. A set of 21 full-length movies was first segmented, and informative content-based features were extracted from each shot and scene. Shots were then emotionally annotated, providing ground-truth affect. The arousal of shots was computed using a linear regression on the content-based features. Bayesian classification based on the shots' arousal and content-based features allowed tagging these scenes into three affective classes, namely calm, positive excited, and negative excited. To improve classification accuracy, two contextual priors were proposed: a movie genre prior, and a temporal prior consisting of the probability of transition between emotions in consecutive scenes. The F1 measure of 54.9% obtained on the three emotional classes with a naïve Bayes classifier improved to 63.4% after all priors were applied.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116658086","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
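The combination of per-scene likelihoods with a genre prior and a temporal transition prior, as described in the abstract above, can be sketched as a simple posterior update. This is a minimal pure-Python illustration under an assumed naïve factorization: the class names follow the abstract, but all function names, numbers, and the exact combination rule are hypothetical, not the paper's implementation.

```python
# Affective classes used in the paper.
CLASSES = ["calm", "positive_excited", "negative_excited"]

def tag_scene(likelihood, genre_prior, transition, prev_posterior):
    """Tag one scene by combining feature likelihoods with two contextual priors.

    likelihood[c]     ~ P(content features | class c)
    genre_prior[c]    ~ P(class c | movie genre)
    transition[p][c]  ~ P(class c at scene t | class p at scene t-1)
    prev_posterior[p] ~ posterior over classes for the previous scene
    """
    n = len(CLASSES)
    # Temporal prior: marginalize the transition matrix over the previous scene.
    temporal = [sum(prev_posterior[p] * transition[p][c] for p in range(n))
                for c in range(n)]
    # Combine likelihood with both priors, then normalize to a posterior.
    post = [likelihood[c] * genre_prior[c] * temporal[c] for c in range(n)]
    z = sum(post)
    post = [p / z for p in post]
    best = max(range(n), key=lambda c: post[c])
    return CLASSES[best], post
```

With a uniform previous posterior the temporal prior is flat, so the genre prior can tip an ambiguous scene: features that slightly favor "calm" can still be tagged "positive_excited" if the genre makes excitement more likely.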
{"title":"Estimation of user interest using time delay features between proactive content presentation and eye movements","authors":"Jean-Baptiste Dodane, Takatsugu Hirayama, H. Kawashima, T. Matsuyama","doi":"10.1109/ACII.2009.5349258","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349258","url":null,"abstract":"Human-machine interaction still lacks smoothness and naturalness despite the widespread use of intelligent systems and emotive agents. To improve the interaction, this work proposes an approach to estimating the user's interest based on the relationship between the dynamics of the user's eye movements, more precisely the endogenous control mode of saccades, and the machine's proactive visual content presentation. Using a specially designed presentation phase to make the user express endogenous saccades, we analyzed the delays between the saccades and the presentation events. As a result, we confirmed that the delay during which the user's gaze stays on the previously presented content regardless of the next event, called resistance, is a good indicator for interest estimation (70% success over 20 experiments). It showed higher accuracy than conventional interest estimation based on gaze duration.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"62 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131104871","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
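The "resistance" indicator in the abstract above boils down to a delay measurement with a threshold: how long gaze lingers on the previous content after the next presentation event. A toy sketch of that idea, where the threshold value and all names are assumptions rather than the authors' implementation:

```python
def resistance_delay(event_time, saccade_times):
    """Delay from a presentation event to the first saccade at or after it.

    event_time    : timestamp (s) at which the next content was presented
    saccade_times : timestamps (s) of the user's saccades
    Returns None if no saccade follows the event.
    """
    later = [t for t in saccade_times if t >= event_time]
    return (min(later) - event_time) if later else None

def is_interested(event_time, saccade_times, threshold=0.8):
    """Long resistance (gaze held on the previous content) is read as interest.

    The 0.8 s threshold is an arbitrary illustration, not the paper's value.
    """
    d = resistance_delay(event_time, saccade_times)
    return d is not None and d > threshold
```

In practice the decision threshold would be calibrated per user from the distribution of observed delays, which is presumably where the reported 70% accuracy comes from.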
{"title":"Understanding behavioral problems in text-based communication using neuroscientific perspective","authors":"D. Gokcay, S. Arikan, Gülsen Yildirim","doi":"10.1109/ACII.2009.5349572","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349572","url":null,"abstract":"In face-to-face communication, humans handle a variety of inputs in addition to the target content. Many affective clues such as facial expressions, body postures, and characteristics of speech, environmental sensory inputs, and even the mood of the interacting parties influence the overall meaning extracted from communication. However, text-based computer-mediated communication (i.e., instant messaging, email, chat) generally exhibits poor media content in terms of these inputs. In particular, peers communicating through computer-mediated communication (CMC) are usually prone to making wrong emotional judgments. Because of the tight connectivity of emotion and cognition, emotional judgment errors cause errors in the perception of the received message and shift behavioral preference toward fearless, disinhibited, aggressive, and deceptive content in the responses. In this study, we put forward a cognitive neuroscience perspective to show the similarity between the behavioral problems brought about by text-based CMC platforms and the cognitive and emotional behavioral problems exhibited by brain-damaged patient populations. We present brief examples of behavioral deficits observed in patients with amygdala and/or orbito-frontal cortex (OFC) damage and show that these deficits bear striking similarities to those seen on text-based CMC platforms. While we consider ourselves to communicate similarly in face-to-face and computerized text-based environments, our brains produce dissimilar cognitive input and output in these two environments. Our conclusion is that once the communication problems introduced by the limited social cues of email and chat are seen from a neurological perspective, developing solutions for them will become a priority.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128592042","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Assessing the validity of a computational model of emotional coping","authors":"S. Marsella, J. Gratch, Ning Wang, B. Stankovic","doi":"10.1109/ACII.2009.5349584","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349584","url":null,"abstract":"In this paper we describe the results of a rigorous empirical study evaluating the coping responses of a computational model of emotion. We discuss three key kinds of coping (Wishful Thinking, Resignation, and Distancing) that impact an agent's beliefs, intentions, and desires, and compare these coping responses to related work in the attitude change literature. We discuss the EMA computational model of emotion and identify several hypotheses it makes concerning these coping processes. We assess these hypotheses against the behavior of human subjects playing a competitive board game, using monetary gains and losses to induce emotion and coping. Subjects' appraisals, emotional states, and coping responses were indexed at key points throughout a game, revealing a pattern of subjects altering their beliefs, desires, and intentions as the game unfolds. The results clearly support several of the hypotheses on coping responses but also identify (a) extensions to how EMA models Wishful Thinking as well as (b) individual differences in subjects' coping responses.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122081185","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Real-time vocal emotion recognition in artistic installations and interactive storytelling: Experiences and lessons learnt from CALLAS and IRIS","authors":"Thurid Vogt, E. André, J. Wagner, Stephen W. Gilroy, Fred Charles, M. Cavazza","doi":"10.1109/ACII.2009.5349501","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349501","url":null,"abstract":"Most emotion recognition systems still rely exclusively on prototypical emotional vocal expressions that may be uniquely assigned to a particular class. In realistic applications, there is, however, no guarantee that emotions are expressed in a prototypical manner. In this paper, we report on challenges that arise when coping with non-prototypical emotions in the context of the CALLAS project and the IRIS network. CALLAS aims to develop interactive art installations that respond to the multimodal emotional input of performers and spectators in real-time. IRIS is concerned with the development of novel technologies for interactive storytelling. Both research initiatives represent an extreme case of non-prototypicality since neither the stimuli nor the emotional responses to stimuli may be considered as prototypical.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"293 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122822424","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"GraphLaugh: A tool for the interactive generation of humorous puns","authors":"A. Valitutti, O. Stock, C. Strapparava","doi":"10.1109/ACII.2009.5349529","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349529","url":null,"abstract":"While automatic generation of funny texts delivers incrementally better results, semi-automatic generation can already provide something useful. In particular, we present an interactive system for producing humorous puns obtained through variation (i.e., word substitution) performed on familiar expressions. The replacement word is selected according to phonetic similarity and semantic constraints expressing semantic opposition or evoking ridiculous traits of people. Examples of such puns are Chaste makes waste (variation on a proverb) and Genital Hospital (variation on a soap opera title). Lexical substitution is the humorous core on which the funniness of a pun is based. We implemented an interactive tool (called GraphLaugh) that can automatically generate different types of lexical associations and visualize them through a dynamic graph. By interacting with the network's nodes and arcs, the user can control the selection of words, semantic associations, and familiar expressions. In this way, a restricted set of familiar expressions is filtered, the best word substitutions to apply to them are easily identified, and finally a list of funny puns is created.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114338234","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
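The core mechanism in the record above, lexical substitution in a familiar expression driven by phonetic similarity, can be sketched in a few lines. This toy version substitutes Levenshtein distance for real phonetic distance and omits the semantic constraints entirely; the distance threshold and all names are illustrative, not GraphLaugh's actual design.

```python
def edit_distance(a, b):
    """Classic Levenshtein distance, used here as a crude proxy for
    phonetic similarity between two words."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,            # deletion
                           cur[j - 1] + 1,         # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def pun_variants(expression, substitutes, max_dist=2):
    """Generate pun candidates by swapping each word of a familiar
    expression for a 'phonetically' close substitute."""
    words = expression.split()
    puns = []
    for i, w in enumerate(words):
        for s in substitutes:
            if s.lower() != w.lower() and edit_distance(w.lower(), s.lower()) <= max_dist:
                puns.append(" ".join(words[:i] + [s] + words[i + 1:]))
    return puns
```

With the proverb from the abstract, "Haste makes waste" plus the candidate "Chaste" yields the example pun "Chaste makes waste"; the real system additionally requires the substitute to carry semantic opposition or a ridiculing connotation.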
{"title":"AffectButton: Towards a standard for dynamic affective user feedback","authors":"J. Broekens, Willem-Paul Brinkman","doi":"10.1109/ACII.2009.5349347","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349347","url":null,"abstract":"Emotions are an important aspect of life. Persons have emotions while using products and technology. It is becoming more and more important to be able to assess these emotions for multiple reasons: (a) to develop better products, (b) to better understand how the user interacts with products, and (c) because the affective state of the user is of importance to the product itself (e.g., in the case of social software, persuasive computing, recommendation). In general there are two ways of extracting affective user feedback: explicit and implicit. Here we present a new interface component that enables users to give explicit affective feedback in a flexible and dynamic way. We call this component the AffectButton. Based on statistical analysis of affective user input gathered with the AffectButton in three user studies, we present evidence that users can use the button effectively to enter affective feedback. Furthermore, the feedback is reliable and valid.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116044849","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Natural affect data — Collection & annotation in a learning context","authors":"S. Afzal, P. Robinson","doi":"10.1109/ACII.2009.5349537","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349537","url":null,"abstract":"Automatic inference of affect relies on representative data. For viable applications of such technology the use of naturalistic over posed data has been increasingly emphasised. Creating a repository of naturalistic data is however a massively challenging task. We report results from a data collection exercise in one of the most significant application areas of affective computing, namely computer-based learning environments. The conceptual and methodological issues encountered during the process are discussed, and problems with labelling and annotation are identified. A comparison of the compiled database with some standard databases is also presented.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121642537","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Error-related EEG patterns during tactile human-machine interaction","authors":"Moritz Lehne, K. Ihme, A. Brouwer, J. V. van Erp, T. Zander","doi":"10.1109/ACII.2009.5349480","DOIUrl":"https://doi.org/10.1109/ACII.2009.5349480","url":null,"abstract":"Recently, the use of brain-computer interfaces (BCIs) has been extended from active control to passive detection of cognitive user states. These passive BCI systems can be especially useful for automatic error detection in human-machine systems by recording EEG potentials related to human error processing. Up to now, these so-called error potentials have only been observed in the visual and auditory modalities. However, new interfaces making use of the tactile sensory modality for conveying information to the user are on the rise. The present study investigates the feasibility of BCI error detection during tactile human-machine interaction. To this end, an experiment was conducted in which EEG was measured while participants interacted with a tactile interface. During this interaction, errors of the user as well as of the interface were induced. It was shown that EEG patterns after erroneous behavior, whether of the user or of the interface, significantly differed from patterns after correct responses.","PeriodicalId":330737,"journal":{"name":"2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops","volume":"36 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134012723","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
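The comparison described in that last record, EEG patterns after errors versus after correct responses, rests on event-locked epoch averaging: average the single-channel epochs within each condition, then inspect the difference waveform. A toy illustration of just that averaging step; real error-potential detection uses many channels, proper statistics, and a trained classifier, and these function names are illustrative only.

```python
def average_epoch(epochs):
    """Pointwise mean of equally long single-channel EEG epochs
    (each epoch is a list of samples time-locked to an event)."""
    n = len(epochs)
    return [sum(e[t] for e in epochs) / n for t in range(len(epochs[0]))]

def erp_difference(error_epochs, correct_epochs):
    """Difference waveform between the average error-locked and the
    average correct-locked response; a pronounced deflection here is
    what an error potential would look like."""
    err = average_epoch(error_epochs)
    cor = average_epoch(correct_epochs)
    return [e - c for e, c in zip(err, cor)]
```

A classifier for single-trial detection would then be trained on features of the individual epochs rather than on these grand averages.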