{"title":"Towards affect detection during human-technology interaction: An empirical study using a combined EEG and fNIRS approach","authors":"K. Pollmann, Mathias Vukelić, M. Peissner","doi":"10.1109/ACII.2015.7344649","DOIUrl":"https://doi.org/10.1109/ACII.2015.7344649","url":null,"abstract":"The present Ph. D. project explores possibilities to apply neurophysiological methods for affect detection during human-technology interaction (HTI). Portable neurophysio-logical methods such as electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) offer an objective, ecologically valid and rather convenient way to infer the user's affective state through the monitoring of brain activity. To identify neural signatures for positive and negative affective user reactions an empirical study is proposed. The experimental design of this study enables synchronous data acquisition for EEG, fNIRS and psychophysiological measurements while the user is interacting with an adaptive web-interface. During the interaction process positive and negative affective states are induced by system-generated adaptive actions which are either appropriate and helpful or inappropriate and impedimental. The findings of the empirical study shed light into the question whether EEG, fNIRS or a hybrid approach that combines the employed methods is most reliable for affect detection during HTI.","PeriodicalId":6863,"journal":{"name":"2015 International Conference on Affective Computing and Intelligent Interaction (ACII)","volume":"1 1","pages":"726-732"},"PeriodicalIF":0.0,"publicationDate":"2015-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"77439815","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Warmth in affective mediated interaction: Exploring the effects of physical warmth on interpersonal warmth","authors":"Christian J. A. M. Willemse, D. Heylen, J. V. Erp","doi":"10.1109/ACII.2015.7344547","DOIUrl":"https://doi.org/10.1109/ACII.2015.7344547","url":null,"abstract":"Recent research suggests that physical warmth activates perceptions of metaphorical interpersonal warmth and closeness, and increases pro-social behavior. These effects are grounded in our earliest intimate experiences: being held by our loving caregivers. These findings provide reasons to incorporate warmth in devices for distant affective communication, which could simulate one's body heat. An experiment was carried out to gain a better understanding of the implications of physical warmth for mediated social interaction. Moreover, we aimed at disentangling effects of social warmth (body temperature) from effects of non-social warmth (artificial heat sources and ambient temperature). Except for an increase in perceptions of metaphorical warmth as a consequence of higher ambient temperature, no effects were found. We use our study to pinpoint the caveats and challenges that research into warmth in affective mediated interaction faces.","PeriodicalId":6863,"journal":{"name":"2015 International Conference on Affective Computing and Intelligent Interaction (ACII)","volume":"126 1","pages":"28-34"},"PeriodicalIF":0.0,"publicationDate":"2015-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"85275900","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Functional connectivity from EEG signals during perceiving pleasant and unpleasant odors","authors":"He Xu, E. Kroupi, T. Ebrahimi","doi":"10.1109/ACII.2015.7344683","DOIUrl":"https://doi.org/10.1109/ACII.2015.7344683","url":null,"abstract":"The olfactory sense is strongly related to memory and emotional processes. Studies on the effects of odor perception from brain activity have been conducted by using different neuro-imaging techniques. In this paper, we analyse electroencephalography (EEG) of 23 subjects during perceiving pleasant and unpleasant odor stimuli. We describe the construction of brain functional connectivity networks measured by most commonly used models. We discuss the network-based features of functional connectivity, and design classifiers by applying different functional connectivity network features. Finally, we show that pleasant and unpleasant emotions from olfactory perceptions can be better classified if we see the brain as a nonlinear small-world network. By extracting appropriate features from functional connectivity networks, we manage to classify pleasant and unpleasant olfactory perceptions with an average Kappa value of 0.11 ± 0.17, which is significantly non-random.","PeriodicalId":6863,"journal":{"name":"2015 International Conference on Affective Computing and Intelligent Interaction (ACII)","volume":"43 1","pages":"911-916"},"PeriodicalIF":0.0,"publicationDate":"2015-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"85481339","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Gesture mimicry in expression of laughter","authors":"H. Griffin, G. Varni, G. Volpe, Gisela Tomé Lourido, M. Mancini, N. Bianchi-Berthouze","doi":"10.1109/ACII.2015.7344642","DOIUrl":"https://doi.org/10.1109/ACII.2015.7344642","url":null,"abstract":"Mimicry and laughter are two social signals displaying affiliation among people. To date, however, their relationship remains uninvestigated and relatively unexploited in designing the behaviour of robots and virtual characters. This paper presents an experiment aimed at examining how laughter and mimicry are related. The hypothesis is that hand movements a person produces during a laughter episode are mimicked through equivalent or other hand movements other participants in the interaction produce when they laugh. To investigate this, we analysed mimicry at two levels of specificity during laughter and non-laughter periods in a playful triadic social interaction. Changes in mimicry rates over the whole interaction were analysed as well as possible leader-follower relationships. Results show that hand movement rates were varied and strongly dependent on group. Even though hand movement are more frequent during laughter, mimicry does not increase. Mimicry levels, however, increase over the course of a session indicating that familiarity and comfort may increase emotional contagion.","PeriodicalId":6863,"journal":{"name":"2015 International Conference on Affective Computing and Intelligent Interaction (ACII)","volume":"16 1","pages":"677-683"},"PeriodicalIF":0.0,"publicationDate":"2015-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90311395","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Multimodal data collection of human-robot humorous interactions in the Joker project","authors":"L. Devillers, S. Rosset, G. D. Duplessis, M. A. Sehili, Lucile Bechade, Agnès Delaborde, Clément Gossart, Vincent Letard, Fan Yang, Y. Yemez, Bekir Berker Turker, T. M. Sezgin, Kevin El Haddad, S. Dupont, Daniel Luzzati, Y. Estève, E. Gilmartin, N. Campbell","doi":"10.1109/ACII.2015.7344594","DOIUrl":"https://doi.org/10.1109/ACII.2015.7344594","url":null,"abstract":"Thanks to a remarkably great ability to show amusement and engagement, laughter is one of the most important social markers in human interactions. Laughing together can actually help to set up a positive atmosphere and favors the creation of new relationships. This paper presents a data collection of social interaction dialogs involving humor between a human participant and a robot. In this work, interaction scenarios have been designed in order to study social markers such as laughter. They have been implemented within two automatic systems developed in the Joker project: a social dialog system using paralinguistic cues and a task-based dialog system using linguistic content. One of the major contributions of this work is to provide a context to study human laughter produced during a human-robot interaction. The collected data will be used to build a generic intelligent user interface which provides a multimodal dialog system with social communication skills including humor and other informal socially oriented behaviors. This system will emphasize the fusion of verbal and non-verbal channels for emotional and social behavior perception, interaction and generation capabilities.","PeriodicalId":6863,"journal":{"name":"2015 International Conference on Affective Computing and Intelligent Interaction (ACII)","volume":"122 1","pages":"348-354"},"PeriodicalIF":0.0,"publicationDate":"2015-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90575113","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Affect-expressive movement generation with factored conditional Restricted Boltzmann Machines","authors":"Omid Alemi, William Li, Philippe Pasquier","doi":"10.1109/ACII.2015.7344608","DOIUrl":"https://doi.org/10.1109/ACII.2015.7344608","url":null,"abstract":"The expressivity of virtual, animated agents plays an important role in their believability. While the planning and goal-oriented aspects of agent movements have been addressed in the literature extensively, expressing the emotional state of the agents in their movements is an open research problem. We present our interactive animated agent model with controllable affective movements. We have recorded a corpus of affect-expressive motion capture data of two actors, performing various movements, and annotated based on their arousal and valence levels. We train a Factored, Conditional Restricted Boltzmann Machine (FCRBM) with this corpus in order to capture and control the valence and arousal qualities of movement patterns. The agents are then able to control the emotional qualities of their movements through the FCRBM for any given combination of the valence and arousal. Our results show that the model is capable of controlling the arousal level of the synthesized movements, and to some extent their valence, through manually defining the level of valence and arousal of the agent, as well as making transitions from one state to the other. We validate the expressive abilities of the model through conducting an experiment where participants were asked to rate their perceived affective state for both the generated and recorded movements.","PeriodicalId":6863,"journal":{"name":"2015 International Conference on Affective Computing and Intelligent Interaction (ACII)","volume":"83 1","pages":"442-448"},"PeriodicalIF":0.0,"publicationDate":"2015-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89641984","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Investigation of risk taking behavior and outcomes in decision making with modified BART (m-BART)","authors":"Kemal Taskin, D. Gökçay","doi":"10.1109/ACII.2015.7344587","DOIUrl":"https://doi.org/10.1109/ACII.2015.7344587","url":null,"abstract":"Responses to risky choices were collected and analyzed in a continuous, engaging and decomposable risk taking task; a slightly modified version of BART (Balloon Analog Risk Task [1]). Pupil dilation data throughout the experiment were collected and analyzed to understand participants' physiological expressions under risky choices. Participants were also administered a survey, prior to the experiment to monitor individual risk taking attitudes. A thorough analysis of responses indicated a dynamic system consisting of risk taking or aversive states. Participants' pupil dilation rates were predictable from this dynamical model abstracted from consecutive responses. These findings may lead to a model that fuses affective and cognitive aspects within risky uncertain decisions. Natural risk tendencies, extracted from the survey had no statistically significant effect on the results.","PeriodicalId":6863,"journal":{"name":"2015 International Conference on Affective Computing and Intelligent Interaction (ACII)","volume":"56 1","pages":"302-307"},"PeriodicalIF":0.0,"publicationDate":"2015-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91478446","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Web questionnaire as construction method of affect-annotated lexicon - Risks reduction strategy","authors":"A. Landowska","doi":"10.1109/ACII.2015.7344605","DOIUrl":"https://doi.org/10.1109/ACII.2015.7344605","url":null,"abstract":"The paper concerns credibility of construction methods for affect-annotated lexicons, specifically a web questionnaire is explored and evaluated. Web-based surveys are susceptible to some risks, which might influence credibility of the results, as some participants might perform random clicks or intentionally falsify the responses. The paper explores the risks and proposes some strategies to reduce them. The strategies are supported by their experimental evaluation in the real case of SentiD affect-annotated dictionary construction.","PeriodicalId":6863,"journal":{"name":"2015 International Conference on Affective Computing and Intelligent Interaction (ACII)","volume":"84 1","pages":"421-427"},"PeriodicalIF":0.0,"publicationDate":"2015-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"77392968","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Recognizing emotion from singing and speaking using shared models","authors":"Biqiao Zhang, Georg Essl, E. Provost","doi":"10.1109/ACII.2015.7344563","DOIUrl":"https://doi.org/10.1109/ACII.2015.7344563","url":null,"abstract":"Speech and song are two types of vocal communications that are closely related to each other. While significant progress has been made in both speech and music emotion recognition, few works have concentrated on building a shared emotion recognition model for both speech and song. In this paper, we propose three shared emotion recognition models for speech and song: a simple model, a single-task hierarchical model, and a multi-task hierarchical model. We study the commonalities and differences present in emotion expression across these two communication domains. We compare the performance across different settings, investigate the relationship between evaluator agreement rate and classification accuracy, and analyze the classification performance of individual feature groups. Our results show that the multi-task model classifies emotion more accurately compared to single-task models when the same set of features is used. This suggests that although spoken and sung emotion recognition tasks are different, they are related, and can be considered together. The results demonstrate that utterances with lower agreement rate and emotions with low activation benefit the most from multi-task learning. Visual features appear to be more similar across spoken and sung emotion expression, compared to acoustic features.","PeriodicalId":6863,"journal":{"name":"2015 International Conference on Affective Computing and Intelligent Interaction (ACII)","volume":"45 1","pages":"139-145"},"PeriodicalIF":0.0,"publicationDate":"2015-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79252087","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Stress is in the eye of the beholder","authors":"Yuliya Lutchyn, Paul Johns, M. Czerwinski, Shamsi T. Iqbal, G. Mark, A. Sano","doi":"10.1109/ACII.2015.7344560","DOIUrl":"https://doi.org/10.1109/ACII.2015.7344560","url":null,"abstract":"Despite a long history and a large volume of affective research, measuring affective states is still a non-trivial task that is complicated by numerous conceptual and methodological decisions that the researcher has to make. We suggest that inconsistent results reported in some areas of research can be partially explained by the choice of measurements that capture different manifestations of affective phenomena, or focus on different elements of affective processes. In the present study we examine one of such topics - a relationship between stress and individual's work role. In a 2-week, multi-method in situ study we collected affective information from 40 subjects. All participants provided continuous physiological (cardiovascular) data for the entire duration of the study, submitted multiple daily self-reports of momentary affect, and filled out a onetime assessment of the global perceived stress. We found that individuals' job role (specifically, decision-making workload) was not related to the cumulative measures of momentary affect, but was negatively correlated with the overall level of perceived stress. We further found that this negative relationship was partially mediated by individuals' coping behaviors. Our results emphasize the important difference between fleeting and global (appraised) affective states, and remind about intervening variables that can significantly modify affective processes. We suggest directions for future research and discuss practical applications for stress management.","PeriodicalId":6863,"journal":{"name":"2015 International Conference on Affective Computing and Intelligent Interaction (ACII)","volume":"4 1","pages":"119-124"},"PeriodicalIF":0.0,"publicationDate":"2015-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"81538152","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}