Biologically Inspired Controller of Human Action Behaviour for a Humanoid Robot in a Dyadic Scenario

N. Duarte, M. Raković, J. Santos-Victor
{"title":"Biologically Inspired Controller of Human Action Behaviour for a Humanoid Robot in a Dyadic Scenario","authors":"N. Duarte, M. Raković, J. Santos-Victor","doi":"10.1109/EUROCON.2019.8861629","DOIUrl":null,"url":null,"abstract":"Humans have a particular way of moving their body when interacting with the environment and with other humans. The movement of the body is commonly known and expresses the intention of the action. The express of intent by our movement is classified as non-verbal cues, and from them, it is possible to understand and anticipate the actions of humans. In robotics, humans need to understand the intention of the robot in order to efficiently and safely interact in a dyadic activity. If robots could possess the same non-verbal cues when executing the same actions, then humans would be capable of interacting with robots the way they interact with other humans.We propose a robotic controller capable of executing actions of moving objects on a table (placing) and handover objects to humans (giving) in a human-like behaviour. Our first contribution is to model the behaviour of the non-verbal cues of a human interacting with other humans while performing placing and giving actions. From the recordings of the motion of the human, we build a computational model of the trajectory of the head, torso, and arm for the different actions. Additionally, the human motion model was consolidated with the integration of a previously developed human gaze behaviour model. As a second contribution, we embedded this model in the controller of an iCub humanoid robot and compared the generated trajectories to the real human model, and additionally, compare with the existing minimum-jerk controller for the iCub (iKin).Our results show that it is possible to model the complete upper body human behaviour during placing and giving interactions, and the generated trajectories from the model give a better approximation of the human-like behaviour in a humanoid robot than the existing inverse kinematics solver. From this work, we can conclude that our controller is capable of achieving a humanlike behaviour for the robot which is a step towards robots capable of understanding and being understood by humans.","PeriodicalId":232097,"journal":{"name":"IEEE EUROCON 2019 -18th International Conference on Smart Technologies","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE EUROCON 2019 -18th International Conference on Smart Technologies","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/EUROCON.2019.8861629","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Humans have a particular way of moving their body when interacting with the environment and with other humans. This body movement is commonly recognised and expresses the intention of the action. The expression of intent through movement is classified as non-verbal cues, and from these cues it is possible to understand and anticipate the actions of humans. In robotics, humans need to understand the intention of the robot in order to interact efficiently and safely in a dyadic activity. If robots possessed the same non-verbal cues when executing the same actions, then humans would be able to interact with robots the way they interact with other humans. We propose a robotic controller capable of executing actions of moving objects on a table (placing) and handing over objects to humans (giving) in a human-like manner. Our first contribution is to model the non-verbal cues of a human interacting with other humans while performing placing and giving actions. From recordings of human motion, we build a computational model of the trajectory of the head, torso, and arm for the different actions. Additionally, the human motion model was consolidated by integrating a previously developed human gaze behaviour model. As a second contribution, we embedded this model in the controller of an iCub humanoid robot, compared the generated trajectories to the real human model, and additionally compared them with the existing minimum-jerk controller for the iCub (iKin). Our results show that it is possible to model the complete upper-body human behaviour during placing and giving interactions, and that the trajectories generated by the model give a better approximation of human-like behaviour on a humanoid robot than the existing inverse kinematics solver. From this work, we conclude that our controller is capable of achieving human-like behaviour for the robot, which is a step towards robots capable of understanding and being understood by humans.
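The abstract uses the iCub's existing minimum-jerk controller (iKin) as the baseline for comparison. As background only, the sketch below illustrates the standard point-to-point minimum-jerk profile (the Flash and Hogan quintic polynomial) that such baselines typically follow; it is an independent, assumed illustration, not the authors' implementation, and the function name and example endpoints are hypothetical.

```python
import numpy as np

def minimum_jerk(x0, xf, T, n=100):
    """Point-to-point minimum-jerk trajectory (Flash & Hogan quintic).

    x0, xf : start and goal positions (scalar or 1-D array, e.g. xyz)
    T      : movement duration in seconds
    n      : number of samples
    Returns (t, x) with x sampled along the trajectory.
    """
    x0 = np.atleast_1d(np.asarray(x0, dtype=float))
    xf = np.atleast_1d(np.asarray(xf, dtype=float))
    t = np.linspace(0.0, T, n)
    tau = (t / T)[:, None]              # normalised time in [0, 1]
    # Quintic blend with zero velocity and acceleration at both endpoints.
    s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5
    return t, x0 + (xf - x0) * s

# Hypothetical example: move a hand from (0.0, 0.1, 0.2) to (0.3, 0.1, 0.4) metres in 1.5 s.
t, x = minimum_jerk([0.0, 0.1, 0.2], [0.3, 0.1, 0.4], T=1.5)
print(x[0], x[-1])   # start and goal positions
```

A profile of this form is smooth but identical for every action type; the paper's human-derived upper-body model is compared against it precisely because human placing and giving movements differ in ways such a generic profile does not capture.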