{"title":"Investigating the Effect of Social Cues on Social Agency Judgement","authors":"A. S. Ghazali, Jaap Ham, P. Markopoulos, E. Barakova","doi":"10.1109/HRI.2019.8673266","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673266","url":null,"abstract":"To advance the research area of social robotics, it is important to understand the effect of different social cues on the perceived social agency of a robot. This paper evaluates three sets of verbal and nonverbal social cues (emotional voice intonation, facial expression and head movement) demonstrated by a social agent delivering several messages. A convenience sample of 18 participants interacted with SociBot, a robot that can demonstrate such cues, experiencing seven combinations of social cues in sequence. After each interaction, participants rated the robot's social agency (assessing its resemblance to a real person, and the extent to which they judged it to be like a living creature). As expected, adding social cues led to higher social agency judgments; in particular, facial expression was associated with higher social agency judgments.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"222 1","pages":"586-587"},"PeriodicalIF":0.0,"publicationDate":"2019-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"76273713","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Robots in Need: Acquiring Assistance with Emotion","authors":"Joseph E. Daly, P. Bremner, U. Leonards","doi":"10.1109/HRI.2019.8673081","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673081","url":null,"abstract":"There will always be occasions where robots require assistance from humans. Understanding what motivates people to help a robot, and what effect this interaction has on an individual will be essential in successfully integrating robots into our society. Emotions are important in motivating prosocial behavior between people, and therefore may also play a large role in human-robot interaction. This research explores the role of emotion in motivating people to help a robot and some of the ethical issues that arise as a result, with the ultimate aim of developing suitable methods for robots to interact with humans to acquire assistance.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"46 1","pages":"706-708"},"PeriodicalIF":0.0,"publicationDate":"2019-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"81273665","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Dangerous HRI: Testing Real-World Robots has Real-World Consequences","authors":"P. Robinette, M. Novitzky, Brittany A. Duncan, M. Jeon, Alan R. Wagner, C. Park","doi":"10.1109/HRI.2019.8673073","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673073","url":null,"abstract":"Robotic rescuers digging through rubble, fire-fighting drones flying over populated areas, robotic servers pouring hot coffee for you, and a nursing robot checking your vitals are all examples of current or near-future situations where humans and robots are expected to interact in a dangerous situation. Dangerous HRI is an as-yet understudied area of the field. We define dangerous HRI as situations where humans experience some amount of risk of bodily harm while interacting with robots. This interaction could take many forms, with the human as a bystander (e.g. when an autonomous car waits at a crossing for a pedestrian), as a recipient of robotic assistance (rescue robots), or as a teammate (like an autonomous robot working with a SWAT team). To facilitate better study of this area, the Dangerous HRI workshop brings together researchers who run experiments involving some risk of bodily harm to participants to discuss strategies for mitigating this risk while still maintaining the validity of the experiment. This workshop does not aim to tackle the general problem of human safety around robots, but instead focuses on guidelines for and experience from experimenters.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"1 1","pages":"687-688"},"PeriodicalIF":0.0,"publicationDate":"2019-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89711486","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Social Influence in HRI with Application to Social Robots for Rehabilitation","authors":"Katie Winkle","doi":"10.1109/HRI.2019.8673292","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673292","url":null,"abstract":"Social influence refers to an individual's attitudes and/or behaviours being influenced by others, whether implicitly or explicitly, such that persuasion and compliance gaining are instances of social influence [1] [2]. In human-human interaction (HHI), the desire to understand compliance and maximise social influence for persuasion has led to the development of theory and resulting strategies one can use in an attempt to leverage social influence, e.g. Cialdini's ‘Weapons of Influence’ [3]. Whilst a number of social human-robot interaction (HRI) studies have investigated the impact of different robot behaviours on compliance gaining/persuasion (e.g. [4]–[7]), established strategies for maximising this are yet to emerge, and it is unclear to what extent theories and strategies from HHI might apply.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"12 1","pages":"754-756"},"PeriodicalIF":0.0,"publicationDate":"2019-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79910436","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Bodily Human Robot Interaction","authors":"J. Solis, A. S. Sørensen, Gitte Rasmussen","doi":"10.1109/HRI.2019.8673132","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673132","url":null,"abstract":"This workshop is dedicated to discussing and exploring the specific interdisciplinary aspects of bodily human-robot interaction, and to establishing a common ground for this area as a recognized and continued research topic. Bodily interaction with robots and robotic devices is partially established in niche applications such as exoskeletons, assistive devices and advanced machines for physical training, where bodily interaction is the application itself. Bodily interaction is expected to develop a broader role in human-robot interaction, for instance in manufacturing and in social and entertainment robotics. The direct exchange of force and motion in bodily interaction creates a range of engineering challenges, but also entwines engineering directly with topics that traditionally reside in the realm of the health and humanistic sciences, from biomechanics to humans' social responses to the prompting and responses of physical interaction.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"73 1","pages":"683-684"},"PeriodicalIF":0.0,"publicationDate":"2019-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"85744709","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Expressivity for Sustained Human-Robot Interaction","authors":"V. Charisi, S. Šabanović, Serge Thill, E. Gómez, Keisuke Nakamura, R. Gomez","doi":"10.1109/HRI.2019.8673268","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673268","url":null,"abstract":"Expressivity - the use of multiple non-verbal modalities to convey or augment the communication of internal states and intentions - is a core component of human social interactions. Studying expressivity in the context of artificial agents has led to explicit considerations of how robots can leverage these abilities in sustained social interactions. Research on this covers aspects such as animation, robot design and mechanics, as well as cognitive science and developmental psychology. This workshop provides a forum for scientists from diverse disciplines to come together and advance the state of the art in developing expressive robots. Participants will discuss methodological opportunities and limitations, with the aim of developing a shared vision for the next steps in expressive social robots.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"23 1","pages":"675-676"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"74733014","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Humanoid Therapy Robot for Encouraging Exercise in Dementia Patients","authors":"Mariah L. Schrum, C. Park, A. Howard","doi":"10.1109/HRI.2019.8673155","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673155","url":null,"abstract":"Dementia is a growing problem amongst elderly adults, and the number of dementia patients is predicted to rise considerably in the coming years. While there is no cure for dementia, recent studies have suggested that exercise may have a positive effect on the cognitive function of dementia patients. We propose that a humanoid therapy robot is an effective tool for encouraging exercise in dementia patients. Such a robot will help address problems such as the cost of care and the shortage of healthcare workers. We have developed an interactive robotic system and conducted preliminary tests with a robot that encourages a user to engage in simple dance moves. The user's heart rate is used as feedback to decide which exercise move should be demonstrated. The results are promising, and we hope to continue this work in future studies.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"94 1","pages":"564-565"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"82099050","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Eye-Gaze-Controlled Telepresence Robots for People with Motor Disabilities","authors":"Guangtao Zhang, J. P. Hansen, Katsumi Minakata, A. Alapetite, Zhongyu Wang","doi":"10.1109/HRI.2019.8673093","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673093","url":null,"abstract":"Eye-gaze interaction is a common control mode for people with limited mobility of their hands. Mobile robotic telepresence systems are increasingly used to promote social interaction between geographically dispersed people. We are interested in how gaze interaction can be applied to such robotic systems, in order to provide new opportunities for people with physical challenges. However, few studies have integrated gaze interaction into a telepresence robot, and it is still unclear how gaze interaction within these robotic systems impacts users and how the systems can be improved. This paper introduces our research project, which takes a two-phase approach towards investigating a novel interaction system we developed. Results of these two studies are discussed and future plans are described.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"498 1","pages":"574-575"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"76535170","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Reality-Virtuality Interaction Cube: A Framework for Conceptualizing Mixed-Reality Interaction Design Elements for HRI","authors":"T. Williams, D. Szafir, T. Chakraborti","doi":"10.1109/HRI.2019.8673071","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673071","url":null,"abstract":"There has recently been an explosion of work in the human-robot interaction (HRI) community on the use of mixed, augmented, and virtual reality. We present a novel conceptual framework to characterize and cluster work in this new area and identify gaps for future research. We begin by introducing the Plane of Interaction: a framework for characterizing interactive technologies in a 2D space informed by the Model-View-Controller design pattern. We then describe how Interactive Design Elements that contribute to the interactivity of a technology can be characterized within this space and present a taxonomy of mixed-reality interactive design elements. We then discuss how these elements may be rendered onto both reality- and virtuality-based environments using a variety of hardware devices and introduce the Reality-Virtuality Interaction Cube: a three-dimensional continuum representing the design space of interactive technologies formed by combining the Plane of Interaction with the Reality-Virtuality Continuum. Finally, we demonstrate the feasibility and utility of this framework by clustering and analyzing the set of papers presented at the 2018 VAM-HRI workshop.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"46 1","pages":"520-521"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"82651567","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Learning Vision-Based Quadrotor Control in User Proximity","authors":"Dario Mantegazza, Jérôme Guzzi, L. Gambardella, A. Giusti","doi":"10.1109/HRI.2019.8673022","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673022","url":null,"abstract":"We consider a quadrotor equipped with a forward-facing camera and a user freely moving in its proximity; we control the quadrotor so that it stays in front of the user, using only camera frames. To do so, we train a deep neural network to predict the drone controls given the camera image. Training data is acquired by running a simple hand-designed controller that relies on optical motion tracking data.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"118 1","pages":"369-369"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"78578952","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}