AFFINE '10 — Latest Publications

RANSAC-based training data selection for emotion recognition from spontaneous speech
AFFINE '10 Pub Date: 2010-10-29 DOI: 10.1145/1877826.1877831
L. Erdem, Çiğdem Eroğlu, E. Bozkurt, E. Erzin
Abstract: Training datasets containing spontaneous emotional expressions are often imperfect due to the ambiguities and difficulties of labeling such data by human observers. In this paper, we present a Random Sampling Consensus (RANSAC) based training approach for the problem of emotion recognition from spontaneous speech recordings. Our motivation is to insert a data cleaning process into the training phase of the Hidden Markov Models (HMMs) in order to remove suspicious instances of labels that may exist in the training dataset. Our experiments using HMMs with various numbers of states and Gaussian mixtures per state indicate that using RANSAC in the training phase improves unweighted recall rates on the test set by up to 2.84%. This improvement in classifier accuracy is shown to be statistically significant using McNemar's test.
Citations: 19
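The consensus idea behind the paper's data cleaning can be sketched in a deliberately simplified form. The paper fits HMMs to speech features; in the sketch below a toy nearest-class-mean classifier on 1-D features stands in, so only the RANSAC loop itself is illustrated. All names and values are invented for illustration.

```python
# Simplified RANSAC-style training-data selection: repeatedly fit a model on a
# random subset, count the training samples whose labels the model agrees with
# (the consensus set), and keep the largest consensus set for final training.
import random

def fit_means(data):
    """Fit a nearest-class-mean model: label -> mean feature value."""
    sums, counts = {}, {}
    for x, y in data:
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def predict(means, x):
    """Predict the label whose class mean is closest to x."""
    return min(means, key=lambda label: abs(x - means[label]))

def ransac_select(data, n_rounds=50, sample_size=4, seed=0):
    """Return the largest consensus set found across random rounds."""
    rng = random.Random(seed)
    best = []
    for _ in range(n_rounds):
        model = fit_means(rng.sample(data, sample_size))
        inliers = [(x, y) for x, y in data if predict(model, x) == y]
        if len(inliers) > len(best):
            best = inliers
    return best

# Toy set: class 'a' clusters near 0, class 'b' near 10; the last two
# points carry deliberately wrong labels (label noise).
data = [(0.1, 'a'), (0.3, 'a'), (-0.2, 'a'), (9.8, 'b'), (10.2, 'b'),
        (10.1, 'b'), (9.9, 'a'), (0.2, 'b')]
clean = ransac_select(data)  # consensus subset used for final training
```

A final classifier would then be trained on `clean` only, which is the "data cleaning process inserted into the training phase" described in the abstract.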
Enjoyment recognition from physiological data in a car racing game
AFFINE '10 Pub Date: 2010-10-29 DOI: 10.1145/1877826.1877830
Simone Tognetti, Maurizio Garbarino, Andrea Tommaso Bonanno, M. Matteucci, Andrea Bonarini
Abstract: In this paper we present a case study on The Open Racing Car Simulator (TORCS) video game, with the aim of developing a classifier to recognize user enjoyment from physiological signals. Three classes of enjoyment, derived from pairwise comparisons of different races, are considered for classification; the impact of artifact reduction, normalization, and feature selection is studied; and results from a protocol involving 75 gamers are discussed. The best model, obtained from a subset of physiological-signal features selected by a genetic algorithm, correctly classifies the three levels of enjoyment with a rate of 57%.
Citations: 36
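One step mentioned in the abstract — deriving discrete enjoyment classes from pairwise comparisons of races — can be illustrated with a toy win-counting scheme. This is a hypothetical reconstruction; the paper's actual derivation may differ, and all names here are invented.

```python
# Rank races by how often they were preferred in pairwise comparisons, then
# cut the ranking into three bands: class 0 (low) .. class 2 (high enjoyment).
from collections import Counter

def enjoyment_classes(pairs, races):
    """pairs: (preferred_race, other_race) tuples; returns race -> class 0..2.
    Assumes at least three races."""
    wins = Counter(winner for winner, _ in pairs)
    ranked = sorted(races, key=lambda r: wins[r])  # fewest wins first
    k = len(ranked) // 3                           # size of each class band
    return {race: min(i // k, 2) for i, race in enumerate(ranked)}

# 'B' was preferred over both others, 'C' over 'A' only:
classes = enjoyment_classes([('B', 'A'), ('B', 'C'), ('C', 'A')],
                            ['A', 'B', 'C'])
```

Each race's class label, derived this way, would then serve as the classification target for the physiological features.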
Designing affective computing learning companions with teachers as design partners
AFFINE '10 Pub Date: 2010-10-29 DOI: 10.1145/1877826.1877840
Sylvie Girard, H. Johnson
Abstract: There is growing interest in studying the potential of including models of emotion in Embodied Pedagogical Agents (EPAs) within Computer-Assisted Learning (CAL) software. Children's understanding of and response to emotions matures alongside their cognitive development. Any model of emotions embedded in an EPA will affect children's responses to, and use of, the EPA characters. EPA design should therefore account for both user characteristics and the pedagogical purposes behind the CAL system. This paper presents the participatory design of an EPA's affective responses to children's interaction with mathematical software. The participatory design sessions were performed with teachers as partners in designing the affective learning components. The results of a pilot study on children's responses to the emotional sequence defined in the model are introduced as a separate study. Finally, a plan of future research is presented to further validate the model's potential for CAL systems, including the integration of two software learning applications constructed during the design sessions with teachers.
Citations: 7
Ubiquitous social perception abilities for interaction initiation in human-robot interaction
AFFINE '10 Pub Date: 2010-10-29 DOI: 10.1145/1877826.1877841
A. Deshmukh, Ginevra Castellano, M. Lim, R. Aylett, P. McOwan
Abstract: Robots acting as assistants or companions in a social environment must be capable of sensing the location of users and of analysing and interpreting their social and affective signals in order to plan and generate an appropriate response. Social perception abilities are therefore very important for the robot to evaluate whether or not it is appropriate to initiate an interaction with the user. In this paper we present the initial steps in the design of a ubiquitous social perception system for interaction initiation: users' social signals and expressive behaviour are analysed at different spatial locations and temporal instants. We propose an approach to evaluate whether it is appropriate for a robot to initiate an interaction with the user. We describe an autonomous algorithm that regulates the inter-entity distance between the robot and a person using visual face detection, which can be used during interaction initiation, and we discuss the role of memory abilities in remembering what has happened throughout the interaction.
Citations: 2
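The distance-regulation idea in the abstract admits a minimal proportional-control sketch: the detected face's pixel height serves as a proxy for distance (a smaller face means the person is farther away). The target size, gain, and deadband below are invented values, not the paper's design.

```python
# Proportional controller: drive toward the person when the face looks too
# small, back away when it looks too large, hold still inside a deadband.
def distance_command(face_height_px, target_px=120, gain=0.004, deadband=10):
    """Forward velocity in m/s; positive drives the robot toward the person."""
    error = target_px - face_height_px      # >0 when the person is too far
    if abs(error) <= deadband:
        return 0.0                          # close enough: hold position
    return gain * error

v = distance_command(60)   # small detected face -> positive (approach)
```

In a real system the command would feed the robot's base velocity controller each frame, and the deadband prevents oscillation around the target distance.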
Augmented photoware interfaces for affective human-human interactions
AFFINE '10 Pub Date: 2010-10-29 DOI: 10.1145/1877826.1877849
Radu-Daniel Vatavu
Abstract: Watching, sharing, and discussing photographs is an important social experience with profound emotional connotations. Although digital photoware offers many opportunities for indexing, retrieval, and visualization of captured photographs, it cannot create the same affectionate, warm, and emotionally rich storytelling environment that tangible paper photos naturally induce. We describe a technique for enriching paper photographs with digital content in order to maintain the charming and desirable context of emotional photo storytelling while also bringing in new features that digital photoware does possess. A simple, easy-to-reproduce computer vision installation that employs visual markers is described, together with several interaction and visualization opportunities that digital creation brings to traditional photo-talk. The affectionate interaction space of paper photography is merged, in a non-intrusive fashion, with the colder and more impersonal but decidedly more expressive digital space. The result is an interface that mixes tangible and virtual photoware in a context that preserves and endorses the affective nature of human-human interactions.
Citations: 1
Interpretation of emotional body language displayed by robots
AFFINE '10 Pub Date: 2010-10-29 DOI: 10.1145/1877826.1877837
Aryel Beck, Antoine Hiolle, A. Mazel, L. Cañamero
Abstract: In order for robots to be socially accepted and to generate empathy, they must display emotions. For robots such as Nao, body language is the best medium available, as they cannot display facial expressions. Displaying emotional body language that can be interpreted while interacting with the robot should greatly improve its acceptance. This research investigates the creation of an "Affect Space" [1] for the generation of emotional body language that could be displayed by robots. An Affect Space is generated by "blending" (i.e., interpolating between) different emotional expressions to create new ones. An Affect Space for body language based on the Circumplex Model of emotions [2] has been created. The experiment reported in this paper investigated the perception of specific key poses from the Affect Space. The results suggest that this Affect Space for body expressions can be used to improve the expressiveness of humanoid robots. In addition, early results of a pilot study are described. It revealed that context helps human subjects improve their recognition rate during a human-robot imitation game, and that this recognition in turn leads to a better outcome of the interactions.
Citations: 69
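The "blending" operation the abstract describes can be illustrated with a minimal sketch: a new body pose is a linear mix of the joint angles of two key emotional poses. The joint names and angle values below are invented for illustration and are not Nao's actual joint set.

```python
# Linear interpolation between two key poses: alpha=0 reproduces pose_a,
# alpha=1 reproduces pose_b, intermediate values give blended expressions.
def blend_poses(pose_a, pose_b, alpha):
    """Interpolate two poses (dict: joint -> angle in degrees)."""
    return {joint: (1 - alpha) * pose_a[joint] + alpha * pose_b[joint]
            for joint in pose_a}

happy = {"head_pitch": 10.0, "arm_roll": 80.0}   # hypothetical key pose
sad   = {"head_pitch": -25.0, "arm_roll": 20.0}  # hypothetical key pose
mixed = blend_poses(happy, sad, 0.5)             # a pose halfway between them
```

Sweeping `alpha` between several such key poses is one way to populate a continuous expression space from a small set of hand-designed exemplars.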
Selecting appropriate agent responses based on non-content features
AFFINE '10 Pub Date: 2010-10-29 DOI: 10.1145/1877826.1877836
M. Maat, D. Heylen
Abstract: This paper describes work in progress on a study to create models for selecting virtual-agent responses based only on non-content features, such as prosody and facial expressions. From a corpus of human-human interactions, in which one person played the part of an agent and the other a user, we extracted the user's turns and gave them to annotators. The annotators had to select utterances from a list of phrases in our agent's repertoire that would be good responses to the user utterance. The corpus is used to train response selection models based on automatically extracted features and on human annotations of the user turns.
Citations: 2
A motivational health companion in the home as part of an intelligent health monitoring sensor network
AFFINE '10 Pub Date: 2010-10-29 DOI: 10.1145/1877826.1877842
V. Evers, Sabine Wildvuur, B. Kröse
Abstract: This paper describes our work in progress toward a personal monitoring system that can monitor the physical and emotional condition of a patient using contextual information from a sensor network, provide the patient with feedback concerning their health status, and motivate the patient to adopt behavior with a positive health impact (such as exercising or taking medication at the right moment). We will extend the capabilities of an existing robotic health buddy with a (DBN-based) sensor network. We will then carry out a series of controlled, long-term field experiments in which we identify and evaluate the effects of various agent social communicative behaviors on the user's adoption of health-improving lifestyle patterns. The findings of the experiments will inform the final design of the health buddy and its behaviors. We will also realize adaptivity of the data processing and data fusion methods, as well as adaptivity of the health buddy to the user's emotional state. The project limits itself to monitoring and motivating people who suffer from chronic cardiovascular conditions, and to the home environment.
Citations: 3
Interpreting non-linguistic utterances by robots: studying the influence of physical appearance
AFFINE '10 Pub Date: 2010-10-29 DOI: 10.1145/1877826.1877843
Robin Read, Tony Belpaeme
Abstract: This paper presents a survey in which participants were asked to interpret non-linguistic utterances made by two different types of robot: one humanoid robot and one pet-like robot. The study set out to answer the question of whether the interpretation of emotions differs across robot types, participant parameters, and classes of utterance. We found that both male and female subjects were consistently more coherent in interpreting human over animal utterances, and animal over technological utterances. This held true for the emotional and intentional interpretation, as well as for the perceived appropriateness of a particular utterance for a particular type of robot. We also found that males and females frequently differed significantly in their emotional and intentional interpretations of utterances. Finally, our results indicate that a robot's morphology influences people's judgment of which class of utterance is deemed appropriate for a particular type of robot.
Citations: 20
Automated analysis of non-verbal affective and social behaviour
AFFINE '10 Pub Date: 2010-10-29 DOI: 10.1145/1877826.1877828
A. Camurri
Abstract: This keynote introduces recent research on the automated analysis of non-verbal expressive gesture and of expressive social interaction in groups of users, for applications in novel multimodal interfaces and emerging User-Centric Media.
Citations: 0