Int. J. Synth. Emot. Latest Publications

The Social Psychology of Dialogue Simulation as Applied in Elbot
Int. J. Synth. Emot. Pub Date : 2014-07-01 DOI: 10.4018/ijse.2014070103
Frederic P. Roberts
{"title":"The Social Psychology of Dialogue Simulation as Applied in Elbot","authors":"Frederic P. Roberts","doi":"10.4018/ijse.2014070103","DOIUrl":"https://doi.org/10.4018/ijse.2014070103","url":null,"abstract":"Because of the high expectations users have on virtual assistants to interact with said systems on a human level, the rules of social interaction potentially apply and less the influence of emotion cues associated with the system responses. To this end the social psychological theories of control, reactance, schemata, and social comparison suggest strategies to transform the dialogue with a virtual assistant into an encounter with a consistent and cohesive personality, in effect using the mind-set of the user to the advantage of the conversation, provoking the user into reacting predictably while at the same time preserving the user's illusion of control. These methods are presented in an online system: Elbot.com.","PeriodicalId":272943,"journal":{"name":"Int. J. Synth. Emot.","volume":"22 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116967889","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 3
Ekman's Paradox and a Naturalistic Strategy to Escape From It
Int. J. Synth. Emot. Pub Date : 2013-07-01 DOI: 10.4018/ijse.2013070101
Jordi Vallverdú
{"title":"Ekman's Paradox and a Naturalistic Strategy to Escape From It","authors":"Jordi Vallverdú","doi":"10.4018/ijse.2013070101","DOIUrl":"https://doi.org/10.4018/ijse.2013070101","url":null,"abstract":"The purposes of this paper are two: first of all, to show that blind-following of a oversimplistic model of emotions like happens with Ekman's one is a bad situation for contemporary researchers from different disciplines. The author has called this situation, the Ekman's paradox; at the same time, the complexity and divergence of ideas, concepts, methodologies and evidences among emotion researchers makes difficult to obtain the necessary agreement to facilitate future researches. Consequently, and this is the second purpose of this text is to define an unique and very specific emotion, pain, as a fulcrum from which to start to define a clear map of emotions. Pain has been chosen due to its specific and unique hardwired body mechanisms as well as a universal agreement among experts about its primordiality. Changing a word to make this explicit, one can have a new start point for the understanding of emotions: dolet, ergo sum.","PeriodicalId":272943,"journal":{"name":"Int. J. Synth. Emot.","volume":"63 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127098949","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 12
Empathy and Human-Machine Interaction
Int. J. Synth. Emot. Pub Date : 2013-07-01 DOI: 10.4018/ijse.2013070102
Florence Gouvrit
{"title":"Empathy and Human-Machine Interaction","authors":"Florence Gouvrit","doi":"10.4018/ijse.2013070102","DOIUrl":"https://doi.org/10.4018/ijse.2013070102","url":null,"abstract":"This paper presents the framework of the author's practice and research exploring empathy and human-machine interaction in projects involving robotic art and video installations and performance. The works investigate emotions and embodiment, presence and absence, relationships and loss, and ways to implicate these ideas in encounters between technology-based artwork and the viewer.","PeriodicalId":272943,"journal":{"name":"Int. J. Synth. Emot.","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123173426","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 3
Robot Pain
Int. J. Synth. Emot. Pub Date : 2013-07-01 DOI: 10.4018/ijse.2013070103
S. V. Rysewyk
{"title":"Robot Pain","authors":"S. V. Rysewyk","doi":"10.4018/ijse.2013070103","DOIUrl":"https://doi.org/10.4018/ijse.2013070103","url":null,"abstract":"Functionalism of robot pain claims that what is definitive of robot pain is functional role, defined as the causal relations pain has to noxious stimuli, behavior and other subjective states. Here, the author proposes that the only way to theorize role-functionalism of robot pain is in terms of type-identity theory. The author argues that what makes a state pain for a neuro-robot at a time is the functional role it has in the robot at the time, and this state is type identical to a specific circuit state. Support from an experimental study shows that if the neural network that controls a robot includes a specific 'emotion circuit', physical damage to the robot will cause the disposition to avoid movement, thereby enhancing fitness, compared to robots without the circuit. Thus, pain for a robot at a time is type identical to a specific circuit state.","PeriodicalId":272943,"journal":{"name":"Int. J. Synth. Emot.","volume":"79 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121633334","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 11
The Compilation and Validation of a Collection of Emotional Expression Images Communicated by Synthetic and Human Faces
Int. J. Synth. Emot. Pub Date : 2013-07-01 DOI: 10.4018/ijse.2013070104
Louise Lawrence, D. Nabi
{"title":"The Compilation and Validation of a Collection of Emotional Expression Images Communicated by Synthetic and Human Faces","authors":"Louise Lawrence, D. Nabi","doi":"10.4018/ijse.2013070104","DOIUrl":"https://doi.org/10.4018/ijse.2013070104","url":null,"abstract":"The BARTA Bolton Affect Recognition Tri-Stimulus Approach is a unique database comprising over 400 colour images of the universally recognised basic emotional expressions and is the first compilation to include three different classes of validated face stimuli; emoticon, computer-generated cartoon and photographs of human faces. The validated tri-stimulus collection all images received =70% inter-rater child and adult consensus has been developed to promote pioneering research into the differential effects of synthetic emotion representation on atypical emotion perception, processing and recognition in autism spectrum disorders ASD and, given the recent evidence for an ASD synthetic-face processing advantage Rosset et al., 2008, provides a means of investigating the benefits associated with the recruitment of synthetic face images in ASD emotion recognition training contexts.","PeriodicalId":272943,"journal":{"name":"Int. J. Synth. Emot.","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114543690","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 6
Towards Natural Emotional Expression and Interaction: Development of Anthropomorphic Emotion Expression and Interaction Robots
Int. J. Synth. Emot. Pub Date : 2012-07-01 DOI: 10.4018/jse.2012070101
A. Takanishi, N. Endo, K. Petersen
{"title":"Towards Natural Emotional Expression and Interaction: Development of Anthropomorphic Emotion Expression and Interaction Robots","authors":"A. Takanishi, N. Endo, K. Petersen","doi":"10.4018/jse.2012070101","DOIUrl":"https://doi.org/10.4018/jse.2012070101","url":null,"abstract":"In present research the advanced fundamental mechanical capabilities of anthropomorphic robots developed in Takanishi laboratory at Waseda University are to be enhanced in order to enable these robots to interact with humans in a natural way. The anthropomorphic robot KOBIAN is able to express human-like facial expressions and whole-body gestures. It is equipped with vision and audio sensors that allow it to react to interaction input from human partners and to generate an appropriate emotional expression response. Furthermore, the anthropomorphic flute playing robot WF-4RVI is technically able to perform a musical wind-instrument performance at the level of an intermediate human player. Using this fundamental technical capability the authors implemented a musical-based interaction system MbIS that enables the robot to collaboratively play together with human musicians in a natural way. For both of the introduced interaction systems, the authors present and discuss the result of various experiments that were done to examine how well the interaction with a robot resembles realistic human-to-human interaction.","PeriodicalId":272943,"journal":{"name":"Int. J. Synth. Emot.","volume":"22 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131086874","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 6
Synthetic Emotions for Humanoids: Perceptual Effects of Size and Number of Robot Platforms
Int. J. Synth. Emot. Pub Date : 2012-07-01 DOI: 10.4018/jse.2012070104
David Grunberg, Alyssa M. Batula, Erik M. Schmidt, Youngmoo E. Kim
{"title":"Synthetic Emotions for Humanoids: Perceptual Effects of Size and Number of Robot Platforms","authors":"David Grunberg, Alyssa M. Batula, Erik M. Schmidt, Youngmoo E. Kim","doi":"10.4018/jse.2012070104","DOIUrl":"https://doi.org/10.4018/jse.2012070104","url":null,"abstract":"The recognition and display of synthetic emotions in humanoid robots is a critical attribute for facilitating natural human-robot interaction. The authors utilize an efficient algorithm to estimate the mood in acoustic music, and then use the results of that algorithm to drive movement generation systems to provide motions for the robot that are suitable for the music. This system is evaluated on multiple sets of humanoid robots to determine if the choice of robot platform or number of robots influences the perceived emotional content of the motions. Their tests verify that the authors' system can accurately identify the emotional content of acoustic music and produce motions that convey a similar emotion to that in the audio. They also determine the perceptual effects of using different sized or different numbers of robots in the motion performances.","PeriodicalId":272943,"journal":{"name":"Int. J. Synth. Emot.","volume":"117 1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127410197","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 8
Musical Robots and Interactive Multimodal Systems
Int. J. Synth. Emot. Pub Date : 2012-07-01 DOI: 10.4018/jse.2012070105
Angelica Lim
{"title":"Musical Robots and Interactive Multimodal Systems","authors":"Angelica Lim","doi":"10.4018/jse.2012070105","DOIUrl":"https://doi.org/10.4018/jse.2012070105","url":null,"abstract":"This volume describes the state-of-the-art in musical robots and interactive systems, and is divided into two sections: “Understanding Elements of Musical Performance and Expres-sion” and “Musical Robots and Automated Instruments.” These sections reflect the two main motivations for creating musical robots. The first reason is to further understand our-selves as humans by trying to recreate our mechanisms in algorithms and software. For example, what exactly makes a musical ges-ture expressive (Chapter 4)? Robots provide a controlled platform for scientific investigation to answer this question. The second motivation is to create more advanced musical robots for art, entertainment and education, and to develop platforms for the first goal. For instance, the flute-playing robot (Chapter 12) has been used as both a teaching robot and investigative plat-form for expressive play using vibrato. Topics include interfaces, human-robot interaction, synchronization, acoustic music processing, and automation.Emotion is an important element of mu-sic, and ‘emotion’ is mentioned in four of the fifteen chapters of this book. In this review, we highlight these projects and their contribution to the body of emotion research.","PeriodicalId":272943,"journal":{"name":"Int. J. Synth. Emot.","volume":"43 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130415972","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 29
Volume Control by Adjusting Wrist Moment of Violin-Playing Robot
Int. J. Synth. Emot. Pub Date : 2012-07-01 DOI: 10.4018/jse.2012070102
K. Shibuya, Hironori Ideguchi, Katsunari Ikushima
{"title":"Volume Control by Adjusting Wrist Moment of Violin-Playing Robot","authors":"K. Shibuya, Hironori Ideguchi, Katsunari Ikushima","doi":"10.4018/jse.2012070102","DOIUrl":"https://doi.org/10.4018/jse.2012070102","url":null,"abstract":"This paper introduces the details of the anthropomorphic violin-playing robot built in the authors' laboratory and an algorithm for controlling the sound volume by adjusting its wrist moment. Investigating the relationship between such sound parameters as sound volume and human impressions is an important research field in Kansei Engineering, which is a growing research field in Japan. They focused on the violin and built a violin-playing robot with two 7 DOF arms for bowing and fingering. Then they constructed an algorithm to adjust the wrist moment to control the sound volume. Based on the result of the experiments, the authors concluded that the moment-based algorithm works well to successfully control the sound volume. Finally, they analyzed the sound spectrum of the produced sounds of different wrist moment, and concluded that there is a possibility for the controlling sound spectrum, which affects human impressions.","PeriodicalId":272943,"journal":{"name":"Int. J. Synth. Emot.","volume":"649 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132096988","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 10
Composing by Listening: A Computer-Assisted System for Creating Emotional Music
Int. J. Synth. Emot. Pub Date : 2012-07-01 DOI: 10.4018/jse.2012070103
L. Quinto, W. Thompson
{"title":"Composing by Listening: A Computer-Assisted System for Creating Emotional Music","authors":"L. Quinto, W. Thompson","doi":"10.4018/jse.2012070103","DOIUrl":"https://doi.org/10.4018/jse.2012070103","url":null,"abstract":"Most people communicate emotion through their voice, facial expressions, and gestures. However, it is assumed that only \"experts\" can communicate emotions in music. The authors have developed a computer-based system that enables musically untrained users to select relevant acoustic attributes to compose emotional melodies. Nonmusicians Experiment 1 and musicians Experiment 3 were progressively presented with pairs of melodies that each differed in an acoustic attribute e.g., intensity-loud vs. soft. For each pair, participants chose the melody that most strongly conveyed a target emotion anger, fear, happiness, sadness or tenderness. Once all decisions were made, a final melody containing all choices was generated. The system allowed both untrained and trained participants to compose a range of emotional melodies. New listeners successfully decoded the emotional melodies of nonmusicians Experiment 2 and musicians Experiment 4. Results indicate that human-computer interaction can facilitate the composition of emotional music by musically untrained and trained individuals.","PeriodicalId":272943,"journal":{"name":"Int. J. Synth. Emot.","volume":"4 6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131711870","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 9