Collaboratively framed interactions increase the adoption of intentional stance towards robots
A. Abubshait, J. Pérez-Osorio, D. D. Tommaso, A. Wykowska
2021 30th IEEE International Conference on Robot & Human Interactive Communication (RO-MAN), pp. 886-891
DOI: 10.1109/RO-MAN50785.2021.9515515
Published: 2021-08-08
Citations: 5
Abstract
When humans interact with artificial agents, they adopt various stances towards them. On one side of the spectrum, people might adopt a mechanistic stance towards an agent and explain its behavior in terms of its functional properties. On the other side, people can adopt the intentional stance towards artificial agents and explain their behavior in mentalistic terms, i.e., with reference to internal states (e.g., thoughts and feelings). While studies continue to investigate the conditions under which people adopt the intentional stance towards robots, here we report a study investigating the effect of social framing during a color-classification task with the humanoid robot iCub. One group of participants was asked to complete the task in collaboration with iCub, while the other group completed an identical task with iCub but was told that they were completing the task for themselves. Participants completed a test assessing their degree of adoption of the intentional stance (the InStance test) before and after the task. Results show that participants who "collaborated" with iCub were more likely to adopt the intentional stance towards it after the interaction. These results suggest that social framing can be a powerful method of influencing the stance that people adopt towards a robot.