{"title":"Trust Estimation for Autonomous Vehicles by Measuring Pedestrian Behavior in VR","authors":"Ryota Masuda, Shintaro Ono, T. Hiraoka, Y. Suda","doi":"10.1145/3568294.3580072","DOIUrl":"https://doi.org/10.1145/3568294.3580072","url":null,"abstract":"This study proposes a method to estimate pedestrian trust in an automated vehicle (AV) based on pedestrian behavior. It conducted experiments in a VR environment where an AV approached a crosswalk. Participants rated their trust in the AV at three levels before/while they crossed the road. The level can be estimated by deep learning using their skeletal coordinates, position, vehicle position, and speed during the past four seconds. The estimation accuracy was 61%.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":"325 1","pages":""},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"75047130","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Hey Robot, Can You Help Me Feel Less Lonely?: An Explorative Study to Examine the Potential of Using Social Robots to Alleviate Loneliness in Young Adults","authors":"Aike C. Horstmann","doi":"10.1145/3568294.3580135","DOIUrl":"https://doi.org/10.1145/3568294.3580135","url":null,"abstract":"An often-forgotten group of people which is heavily affected by loneliness are young adults. The perceived social isolation often stems from attachment insecurities and social skill deficiencies. Since robots can function as social interaction partners who exert less social pressure and display less social complexity, they may pose a promising approach to alleviate this problematic situation. The goal would not be to replace human interaction partners, but to diminish acute loneliness and accompanying detrimental effects and to function as social skills coach and practice interaction partner. To explore the potential of this approach, a preregistered quantitative online study (N = 150) incorporating a video-based interaction with a social robot and qualitative elements was conducted. First results show that young adults report less state loneliness after interacting with the robot than before. Technically affine people evaluate the robot's sociability as well as the interaction with it more positively, people with a general negative attitude towards robots less positively. Furthermore, the more trait loneliness people report to experience, the less sociable they perceive the robot.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":"34 1","pages":""},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"75079642","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Development of a University Guidance and Information Robot","authors":"A. Blair, M. Foster","doi":"10.1145/3568294.3580138","DOIUrl":"https://doi.org/10.1145/3568294.3580138","url":null,"abstract":"We are developing a social robot that will be deployed in a large, recently-built university building designed for learning and teaching. We outline the design process for this robot, which has included consultations with stakeholders including members of university services, students and other visitors to the building, as well as members of the \"Reach Out'' team who normally provide in-person support in the building. These consultations have resulted in a clear specification of the desired robot functionality, which will combine central helpdesk queries with local information about the building and the surrounding university campus. We outline the technical components that will be used to develop the robot system, and also describe how the success of the deployed robot will be evaluated.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":"24 1","pages":""},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"82191215","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Implications of AI Bias in HRI: Risks (and Opportunities) when Interacting with a Biased Robot","authors":"Tom Hitron, Noa Morag Yaar, H. Erel","doi":"10.1145/3568162.3576977","DOIUrl":"https://doi.org/10.1145/3568162.3576977","url":null,"abstract":"Social robotic behavior is commonly designed using AI algorithms which are trained on human behavioral data. This training process may result in robotic behaviors that echo human biases and stereotypes. In this work, we evaluated whether an interaction with a biased robotic object can increase participants' stereotypical thinking. In the study, a gender-biased robot moderated debates between two participants (man and woman) in three conditions: (1) The robot's behavior matched gender stereotypes (Pro-Man); (2) The robot's behavior countered gender stereotypes (Pro-Woman); (3) The robot's behavior did not reflect gender stereotypes and did not counter them (No-Preference). Quantitative and qualitative measures indicated that the interaction with the robot in the Pro-Man condition increased participants' stereotypical thinking. In the No-Preference condition, stereotypical thinking was also observed but to a lesser extent. In contrast, when the robot displayed counter-biased behavior in the Pro-Woman condition, stereotypical thinking was eliminated. Our findings suggest that HRI designers must be conscious of AI algorithmic biases, as interactions with biased robots can reinforce implicit stereotypical thinking and exacerbate existing biases in society. On the other hand, counter-biased robotic behavior can be leveraged to support present efforts to address the negative impact of stereotypical thinking.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":"64 1","pages":""},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"82703349","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Effects of Predictive Robot Eyes on Trust and Task Performance in an Industrial Cooperation Task","authors":"L. Onnasch, Paul Schweidler, Maximilian Wieser","doi":"10.1145/3568294.3580123","DOIUrl":"https://doi.org/10.1145/3568294.3580123","url":null,"abstract":"Industrial cobots can perform variable action sequences. For human-robot interaction (HRI) this can have detrimental effects, as the robot's actions can be difficult to predict. In human interaction, eye gaze intuitively directs attention and communicates subsequent actions. Whether this mechanism can benefit HRI, too, is not well understood. This study investigated the impact of anthropomorphic eyes as directional cues in robot design. 42 participants worked on two subsequent tasks in an embodied HRI with a Sawyer robot. The study used a between-subject design and presented either anthropomorphic eyes, arrows or a black screen as control condition on the robot's display. Results showed that neither directional stimuli nor the anthropomorphic design in particular led to increased trust. But anthropomorphic robot eyes improved the prediction speed, whereas this effect could not be found for non-anthropomorphic cues (arrows). Anthropomorphic eyes therefore seem to be better suitable for an implementation on an industrial robot.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":"1 1","pages":""},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"88185611","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"L2 Vocabulary Learning Through Lexical Inferencing Stories With a Social Robot","authors":"Hoi Ki Tang, Matthijs H. J. Smakman, M. De Haas, Rianne van den Berghe","doi":"10.1145/3568294.3580140","DOIUrl":"https://doi.org/10.1145/3568294.3580140","url":null,"abstract":"Vocabulary is a crucial part of second language (L2) learning. Children learn new vocabulary by forming mental lexicon relations with their existing knowledge. This is called lexical inferencing: using the available clues and knowledge to guess the meaning of the unknown word. This study explored the potential of second language vocabulary acquisition through lexical inferencing in child-robot interaction. A storytelling robot read a book to Dutch kindergartners (N = 36, aged 4-6 years) in Dutch in which a few key words were translated into French (L2), and with a robot providing additional word explanation cues or not. The results showed that the children learned the key words successfully as a result of the reading session with the storytelling robot, but that there was no significant effect of additional word explanation cues by the robot. Overall, it seems promising that lexical inferencing can act as a new and different way to teach kindergartners a second language.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":"53 1","pages":""},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"84558779","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Robots in Real Life: Putting HRI to Work","authors":"A. Thomaz","doi":"10.1145/3568162.3578810","DOIUrl":"https://doi.org/10.1145/3568162.3578810","url":null,"abstract":"This talk will be focused on the unique challenges in deploying a mobile manipulation robot into an environment where the robot is working closely with people on a daily basis. Diligent Robotics' first product, Moxi, is a mobile manipulation service robot that is at work in hospitals today assisting nurses and other front line staff with materials management tasks. This talk will dive into the computational complexity of developing a mobile manipulator with social intelligence. Dr. Thomaz will focus on how human-robot interaction theories and algorithms translate into the real-world and the impact on functionality and perception of robots that perform delivery tasks in a busy human environment. The talk will include many examples and data from the field, with commentary and discussion around both the expected and unexpected hard problems in building robots operating 24/7 as reliable teammates. BIO: Andrea Thomaz is the CEO and Co-Founder of Diligent Robotics. Her accolades include being recognized by the National Academy of Science as a Kavli Fellow, the US President's Council of Advisors on Science and Tech (PCAST), MIT Technology Review TR35 list, and TEDx as a featured keynote speaker on social robotics. Dr. Thomaz has received numerous research grants including the NSF CAREER award and the Office of Naval Research Young Investigator Award. Andrea has published in the areas of Artificial Intelligence, Robotics, and Human-Robot Interaction. Her research aims to computationally model mechanisms of human social learning and interaction, in order to build social robots and other machines that are intuitive for everyday people to teach. She earned her Ph.D. from MIT and B.S. in Electrical and Computer Engineering from UT Austin, and was a Robotics Professor at UT Austin and Georgia Tech (where she directed the Socially Intelligent Machines Lab). Andrea co-founded Diligent Robotics in 2018, to pursue her vision of creating socially intelligent robot assistants that collaborate with humans by doing their chores so humans can have more time for the work they care most about.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":"4 1","pages":""},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"87684195","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Development of a Wearable Robot that Moves on the Arm to Support the Daily Life of the User","authors":"Koji Kimura, F. Tanaka","doi":"10.1145/3568294.3579983","DOIUrl":"https://doi.org/10.1145/3568294.3579983","url":null,"abstract":"Wearable robots can maintain physical contact with the user and interact with them to assist in daily life. However, since most wearable robots operate at a single point on the user's body, the user must be constantly aware of their presence. This imposes a burden on the user, both physically and mentally, and prevents them from wearing the robot daily. One solution to this problem is for the robot to move around the user's body. When the user does not interact with the robot, it can move to an unobtrusive position and attract less attention from the user. This research aims to develop a wearable robot that reduces the burden by developing an arm movement mechanism for wearable robots and a self-localization method for autonomous movement and helps the user's daily life by providing supportive interactions.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":"35 1","pages":""},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89217416","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Internet of Robotic Cat Toys to Deepen Bond and Elevate Mood","authors":"I. X. Han, Sarah Witzman","doi":"10.1145/3568294.3580183","DOIUrl":"https://doi.org/10.1145/3568294.3580183","url":null,"abstract":"Pets provide important mental support for human beings. Recent advancements in robotics and HRI have led to research and commercial products providing smart solutions to enrich indoor pets' lives. However, most of these products focus on satisfying pets' basic needs, such as feeding and litter cleaning, rather than their mental well-being. In this paper, we present the internet of robotic cat toys, where a group of robotic agents connects to play with our furry friends. Through three iterations, we demonstrate an affordable and flexible design of clip-on robotic agents to transform a static household into an interactive wonderland for pets.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":"11 1","pages":""},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"87881234","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Persuasive Robot that Alleviates Endogenous Smartphone-related Interruption","authors":"Hanyang Hu, Mengyu Chen, Ruhan Wang, Yijie Guo","doi":"10.1145/3568294.3580097","DOIUrl":"https://doi.org/10.1145/3568294.3580097","url":null,"abstract":"The endogenous interruptions of smartphones have impacted people's everyday life in many aspects, especially in the study and work scene under a lamp. To mitigate this, we make a robot that could persuade you intrinsically by augmenting the lamp on your desk with specific posture and light. This paper will present our design considerations and the first prototype to show the possibility of alleviating people's endogenous interruptions through robots.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":"26 1","pages":""},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"75616299","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}