The impact of intergroup bias on trust and approach behaviour towards a humanoid robot

Christopher Deligianis, C. Stanton, C. McGarty, C. Stevens
{"title":"群体间偏见对类人机器人信任和接近行为的影响","authors":"Christopher Deligianis, C. Stanton, C. McGarty, C. Stevens","doi":"10.5898/JHRI.6.3.Deligianis","DOIUrl":null,"url":null,"abstract":"As robots become commonplace, and for successful human-robot interaction to occur, people will need to trust them. Two experiments were conducted using the \"minimal group paradigm\" to explore whether social identity theory influences trust formation and impressions of a robot. In Experiment 1, participants were allocated to either a \"robot\" or \"computer\" group, and then they played a cooperative visual tracking game with an Aldebaran Nao humanoid robot as a partner. We hypothesised participants in the \"robot group\" would demonstrate intergroup bias by sitting closer to the robot (proxemics) and trusting the robot's suggested answers more frequently than their \"computer group\" counterparts. Experiment 2 used an almost identical procedure with a different set of participants; however, all participants were assigned to the \"robot group\" and thee different levels of anthropomorphic robot movement were manipulated. Our results suggest that intergroup bias and humanlike movement can significantly affect human-robot approach behaviour. Significant effects were found for trusting the robot's suggested answers with respect to task difficulty, but not for group membership or robot movement.","PeriodicalId":92076,"journal":{"name":"Journal of human-robot interaction","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2017-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"22","resultStr":"{\"title\":\"The impact of intergroup bias on trust and approach behaviour towards a humanoid robot\",\"authors\":\"Christopher Deligianis, C. Stanton, C. McGarty, C. 
Stevens\",\"doi\":\"10.5898/JHRI.6.3.Deligianis\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"As robots become commonplace, and for successful human-robot interaction to occur, people will need to trust them. Two experiments were conducted using the \\\"minimal group paradigm\\\" to explore whether social identity theory influences trust formation and impressions of a robot. In Experiment 1, participants were allocated to either a \\\"robot\\\" or \\\"computer\\\" group, and then they played a cooperative visual tracking game with an Aldebaran Nao humanoid robot as a partner. We hypothesised participants in the \\\"robot group\\\" would demonstrate intergroup bias by sitting closer to the robot (proxemics) and trusting the robot's suggested answers more frequently than their \\\"computer group\\\" counterparts. Experiment 2 used an almost identical procedure with a different set of participants; however, all participants were assigned to the \\\"robot group\\\" and thee different levels of anthropomorphic robot movement were manipulated. Our results suggest that intergroup bias and humanlike movement can significantly affect human-robot approach behaviour. 
Significant effects were found for trusting the robot's suggested answers with respect to task difficulty, but not for group membership or robot movement.\",\"PeriodicalId\":92076,\"journal\":{\"name\":\"Journal of human-robot interaction\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"22\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of human-robot interaction\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.5898/JHRI.6.3.Deligianis\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of human-robot interaction","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5898/JHRI.6.3.Deligianis","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 22

Abstract

As robots become commonplace, and for successful human-robot interaction to occur, people will need to trust them. Two experiments were conducted using the "minimal group paradigm" to explore whether social identity theory influences trust formation and impressions of a robot. In Experiment 1, participants were allocated to either a "robot" or "computer" group, and then they played a cooperative visual tracking game with an Aldebaran Nao humanoid robot as a partner. We hypothesised participants in the "robot group" would demonstrate intergroup bias by sitting closer to the robot (proxemics) and trusting the robot's suggested answers more frequently than their "computer group" counterparts. Experiment 2 used an almost identical procedure with a different set of participants; however, all participants were assigned to the "robot group" and three different levels of anthropomorphic robot movement were manipulated. Our results suggest that intergroup bias and humanlike movement can significantly affect human-robot approach behaviour. Significant effects were found for trusting the robot's suggested answers with respect to task difficulty, but not for group membership or robot movement.