Human Robot Interaction: Identifying Resembling Emotions Using Dynamic Body Gestures of Robot

Sara Ali, Faisal Mehmood, Fahad Iqbal Khawaja, Y. Ayaz, Muhammad Sajid, M. B. Sial, Muhammad Faiq Malik, Kashif Javed
DOI: 10.1109/ICAI58407.2023.10136649
Published in: 2023 3rd International Conference on Artificial Intelligence (ICAI)
Publication date: 2023-02-22
Platform: Semantic Scholar
Citations: 0

Abstract

Natural social interaction between a human and a robot requires the robot to perceive complex social behaviors and, in turn, to display learnt emotional behavior during the interaction. Emotions expressed through body language are one of the key parameters in Human-Robot Interaction (HRI). This research extends the concept of affective non-verbal human-robot interaction by using full-body gestures to convey closely resembling emotions in the absence of intricate facial expressions. Movements of the head, torso, legs, and arms are used to link the body movements and gestures of a bipedal humanoid robot with 28 closely resembling emotions. Variations in the robot's speed, frequency, and joint angles are the key features used to express a specific emotion. The study uses Russell's circumplex model to define 8 primary emotional categories, which are further grouped into closely associated, mutually resembling emotions. 33 participants took part in the experiment to validate the different emotions expressed through the robot's body gestures. Each participant performed 28 trials in which emotions were displayed in random order and was asked to recognize the robot's state. The results showed an average accuracy of 79.69%. The study confirms that perceived emotions depend strongly on body movements, postures, and the selected features. The design model can therefore be used to convey even closely resembling emotions.
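The 8 primary categories in Russell's circumplex model sit at 45-degree intervals on the valence-arousal plane. As a minimal illustrative sketch (not code from the paper, and the paper's exact grouping of the 28 emotions is not reproduced here), mapping a valence-arousal point to its octant category might look like this:

```python
import math

# Eight primary categories, one per 45-degree octant of the valence-arousal
# plane, ordered counter-clockwise starting from positive valence
# (names follow Russell's 1980 circumplex model).
OCTANTS = ["pleasure", "excitement", "arousal", "distress",
           "misery", "depression", "sleepiness", "contentment"]

def circumplex_category(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) point in [-1, 1] x [-1, 1] to one of the
    8 primary categories by the angle of the point, measured
    counter-clockwise from the positive-valence axis."""
    angle = math.degrees(math.atan2(arousal, valence)) % 360.0
    # Shift by half an octant so each category is centred on its own axis.
    return OCTANTS[int(((angle + 22.5) % 360.0) // 45.0)]

# e.g. high valence with neutral arousal falls in the "pleasure" octant:
print(circumplex_category(1.0, 0.0))   # pleasure
print(circumplex_category(0.7, 0.7))   # excitement
print(circumplex_category(0.0, -1.0))  # sleepiness
```

Each of the paper's closely resembling emotion groups would then correspond to emotions falling in the same octant, distinguished by features such as movement speed, frequency, and joint angles.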