"Enhancing Robot Learning with Human Social Cues"
Akanksha Saran, Elaine Schaertl Short, A. Thomaz, S. Niekum
2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 745-747. DOI: 10.1109/HRI.2019.8673178

Imagine a learning scenario between two humans: a teacher demonstrating how to play a new musical instrument, or a craftsman teaching a novice a new skill such as pottery or knitting. Even though acquiring a skill involves a learning curve before the nuances of the technique are mastered, some basic social principles between teacher and student help the learning process eventually succeed. Several assumptions, or social priors, underlie this communication: mutual eye contact to draw attention to instructions, following the teacher's gaze to understand the skill, the teacher following the student's gaze during imitation to give feedback, the teacher pointing toward something she is about to approach or manipulate, and verbal interruptions or corrections during the learning process [1], [2]. In prior research, verbal and non-verbal social cues such as eye gaze and gestures have been shown to make human-human interactions seamless and to augment verbal, collaborative behavior [3], [4]. They serve as indicators of engagement, interest, and attention when people interact face-to-face [5], [6].
"Personalization in Long-Term Human-Robot Interaction"
Bahar Irfan, Aditi Ramachandran, Samuel Spaulding, Dylan F. Glas, Iolanda Leite, K. Koay
2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 685-686. DOI: 10.1109/HRI.2019.8673076

For practical reasons, most human-robot interaction (HRI) studies focus on short-term interactions between humans and robots. However, such studies do not capture the difficulty of sustaining engagement and interaction quality across long-term interactions. Many real-world robot applications will require repeated interactions and relationship-building over the long term, and personalization and adaptation to users will be necessary to maintain user engagement and to build rapport and trust between the user and the robot. This full-day workshop brings together perspectives from a variety of research areas, including companion robots, elderly care, and educational robots, in order to provide a forum for sharing and discussing innovations, experiences, works-in-progress, and best practices which address the challenges of personalization in long-term HRI.
"Demonstrating a Framework for Rapid Development of Physically Situated Interactive Systems"
Sean Andrist, D. Bohus, Ashley Feniello
2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), p. 668. DOI: 10.1109/HRI.2019.8673067

We demonstrate an open, extensible framework for enabling faster development and study of physically situated interactive systems. The framework provides a programming model for parallel coordinated computation centered on temporal streams of data, a set of tools for data visualization and processing, and an open ecosystem of components. The demonstration showcases an interaction toolkit of components for systems that interact with people via natural language in the open world.
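The abstract's core idea of coordinated computation over temporal streams can be illustrated generically. The sketch below is not the framework's actual API; the stream representation and the `join_nearest` helper are hypothetical, showing only how messages from two sensor streams might be paired by timestamp:

```python
from bisect import bisect_left

# A "stream" here is just a time-ordered list of (timestamp, value) messages.
# These names and values are illustrative, not part of any real framework.
audio = [(0.00, "a0"), (0.10, "a1"), (0.20, "a2"), (0.30, "a3")]
video = [(0.02, "v0"), (0.11, "v1"), (0.24, "v2")]

def join_nearest(primary, secondary, tolerance=0.05):
    """Pair each primary message with the secondary message closest in
    time, dropping pairs whose timestamps differ by more than tolerance."""
    times = [t for t, _ in secondary]
    out = []
    for t, v in primary:
        i = bisect_left(times, t)
        # Candidates: the neighbor just before t and the one at/after t.
        best = min(
            (c for c in (i - 1, i) if 0 <= c < len(times)),
            key=lambda c: abs(times[c] - t),
        )
        if abs(times[best] - t) <= tolerance:
            out.append((t, (v, secondary[best][1])))
    return out

print(join_nearest(video, audio))
# → [(0.02, ('v0', 'a0')), (0.11, ('v1', 'a1')), (0.24, ('v2', 'a2'))]
```

Real stream frameworks run such joins concurrently over live data; this synchronous version only conveys the pairing-by-time idea.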
"Using Decision Support Systems for Juries in Court: Comparing the Use of Real and CG Robots"
Yugo Hayashi, Kosuke Wakabayashi, Shigen Shimojyo, Yukoh Kida
2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 556-557. DOI: 10.1109/HRI.2019.8673298

In this report, we investigate the social presence of a robot by using a physical robot and comparing it with the computer-graphics (CG) robot from our previous study. A laboratory experiment was conducted using a simple jury decision-making task, in which participants play the role of jurors and decide the length of the sentence for a particular crime. During the task, a robot with expert knowledge provides suggestions on sentence length based on other similar cases. Results show that participants who engaged with the physical robot conformed more closely to the suggested sentence length than participants who engaged with the CG robot presented on a computer monitor. These results are consistent with previous findings that interacting with physically present robots is more engaging, and they show that physical presence also affects decision-making in a court setting.
"Human Robot Interaction Using Diver Hand Signals"
Robert Codd-Downey, M. Jenkin
2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 550-551. DOI: 10.1109/HRI.2019.8673133

Current methods for human-robot interaction in the underwater domain seem antiquated in comparison to their terrestrial counterparts. Visual tags and custom-built wired remotes are commonplace underwater, but such approaches have numerous drawbacks. Here we describe a method for underwater human-robot interaction that borrows from the long-standing tradition of diver communication using hand signals: a three-stage approach to diver-robot communication using a series of neural networks.
"Roboethics as a Research Puzzle"
K. Zawieska, B. Vermuelen
2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 612-613. DOI: 10.1109/HRI.2019.8673271

This position paper discusses the question of incorporating roboethics into the roboticists' thinking about their research. On the one hand, there has been a growing recognition of the need to develop and advance the field of roboethics. On the other hand, for different reasons, a large part of the robotics community has still been reluctant to explicitly address ethical considerations in robotics research. We argue here that in order to facilitate and foster ethical reflection in roboticists' work, roboethics should be seen as a research puzzle. This implies studying rather than only applying specific ethical principles, as well as taking highly creative and pioneering approaches towards emerging ethical challenges.
"Motion of Soft Robots with Physically Embodied Intelligence"
Kyu-Jin Cho
2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), p. 1. DOI: 10.1109/HRI.2019.8673158

Soft robotics deals with interaction with environments that are uncertain and vulnerable to change, adapting easily to the environment through soft materials. However, softness inherently brings large degrees of freedom, which greatly complicates motion generation, and there has been no underlying principle for understanding the motion generated by soft robots. A big gap between rigid and soft robots has been that the kinematics of rigid robots can be defined using analytical methods, whereas the kinematics of soft robots are hard to define. Here, I suggest using the minimum energy path to explain the kinematics of soft robots: the motion of a soft robot follows the path that requires the least energy to create deformation. Hence, by plotting an energy map of a soft robot, we can estimate its motion and its reaction to external disturbances. Although it is extremely difficult to plot the energy map of a soft robot, this framework can serve as a basis for a unified way of explaining the motion generated by soft robots as well as rigid robots. The concept of physically embodied intelligence is a way to simplify the motion generated by soft robots by embodying intelligence into the design; better performance can be achieved with simpler actuation. Nature offers a few examples of this property: the Venus flytrap closes its leaves quickly by exploiting their bistability rather than relying on actuation alone, and the inchworm achieves adaptive gripping with its prolegs by using the buckling effect.

In this talk, I will give an overview of various soft robotic technologies, and of soft robots with physically embodied intelligence being developed at the SNU Biorobotics Lab and Soft Robotics Research Center. These examples will show that the concept of physically embodied intelligence simplifies design and enables better performance by exploiting the characteristics of the material, and that the minimum energy path can be a powerful tool to explain the motion these robots generate.
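The minimum-energy-path idea above can be made concrete with a toy example. The sketch below uses a made-up one-dimensional double-well energy function (two stable states, loosely analogous to the bistable flytrap leaf); the energy function and step sizes are illustrative assumptions, not from the talk:

```python
def energy(x):
    # Double-well energy with two stable states at x = -1 and x = +1,
    # a toy stand-in for a bistable soft structure (e.g., a flytrap leaf).
    return (x**2 - 1) ** 2

def grad(x):
    # Analytic derivative of the energy above.
    return 4 * x * (x**2 - 1)

def settle(x, step=0.01, iters=5000):
    """Follow the steepest-descent (minimum-energy) path from x:
    the configuration slides downhill on the energy landscape."""
    for _ in range(iters):
        x -= step * grad(x)
    return x

# A small perturbation to either side of the unstable point x = 0
# snaps the system into the corresponding stable state.
print(round(settle(0.5), 3))   # ≈ 1.0
print(round(settle(-0.5), 3))  # ≈ -1.0
```

The bistability is visible in the result: which energy minimum the system settles into depends only on which side of the energy barrier it starts from, mirroring how the flytrap leaf needs only a small trigger, not sustained actuation, to snap closed.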
"Gender Effects in Perceptions of Robots and Humans with Varying Emotional Intelligence"
Meia Chita-Tegmark, Monika Lohani, Matthias Scheutz
2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 230-238. DOI: 10.1109/HRI.2019.8673222

Robots are machines and as such do not have gender. However, many of the gender-related perceptions and expectations formed in human-human interactions may be inadvertently and unreasonably transferred to interactions with social robots. In this paper, we investigate how gender effects in people's perception of robots and humans depend on their emotional intelligence (EI), a crucial component of successful human social interactions. Our results show that participants perceive different levels of EI in robots just as they do in humans. Also, their EI perceptions are affected by gender-related expectations both when judging humans and when judging robots with minimal gender markers, such as voice or even just a name. We discuss the implications for human-robot interactions (HRI) and propose further explorations of EI for future HRI studies.
"Micbot: A Peripheral Robotic Object to Shape Conversational Dynamics and Team Performance"
Hamish Tennent, Solace Shen, Malte F. Jung
2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 133-142. DOI: 10.1109/HRI.2019.8673013

Many of the problems we face are solved in small groups. Drawing on decades of research in psychology, HRI research is increasingly trying to understand how robots impact the dynamics and outcomes of these small groups. Current work almost exclusively uses humanoid robots that take on the role of an active group participant to influence interpersonal dynamics. We argue that this has limitations and propose an alternative design approach: a peripheral robotic object. This paper presents Micbot, a peripheral robotic object designed to promote participant engagement, and ultimately performance, using nonverbal implicit interactions. The robot is evaluated in a three-condition (no movement, engagement behavior, random movement) laboratory experiment with 36 three-person groups (N = 108). Results showed that the robot was effective in promoting not only increased group engagement but also improved problem-solving performance. In the engagement condition, participants displayed more even backchanneling toward one another compared to the no-movement condition, though not compared to the random-movement condition. This more even distribution of backchanneling predicted better problem-solving performance.
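The abstract does not specify how evenness of backchanneling is quantified; one common choice for such a measure is normalized Shannon entropy over per-member counts. The sketch below is a hypothetical illustration of that idea, not necessarily the authors' metric:

```python
from math import log

def evenness(counts):
    """Normalized Shannon entropy of backchannel counts across group
    members: 1.0 means perfectly even, values near 0 mean one member
    receives almost all backchannels. Assumes at least two members."""
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    entropy = -sum(p * log(p) for p in probs)
    return entropy / log(len(counts))

# A three-person group with evenly distributed backchannels vs. one
# dominated by a single member (counts are made-up examples).
print(evenness([10, 10, 10]))  # perfectly even → 1.0
print(evenness([28, 1, 1]))    # heavily skewed → well below 0.5
```

Any measure of this family (entropy, Gini coefficient, variance of shares) would capture the same qualitative contrast the study reports between conditions.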
"A Bayesian Theory of Mind Approach to Nonverbal Communication"
Jin Joo Lee, Fei Sha, C. Breazeal
2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 487-496. DOI: 10.1109/HRI.2019.8673023

This paper defines a dual computational framework for nonverbal communication in human-robot interactions. We use a Bayesian Theory of Mind approach to model dyadic storytelling interactions in which the storyteller and the listener have distinct roles. The storyteller's role is to influence and infer the attentive state of the listener using speaker cues, which we model computationally as a POMDP planning problem. The listener's role is to convey attentiveness by influencing perceptions through listener responses, which we model computationally as a DBN with a myopic policy. Through a comparison of state estimators trained on human-human interaction data, we validate our storyteller model by demonstrating that it outperforms current approaches to attention recognition. Then, through a human-subjects experiment in which children told stories to robots, we demonstrate that a social robot using our listener model communicates attention more effectively than alternative approaches based on signaling.
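The storyteller's inference over the listener's hidden attentive state can be illustrated with a minimal two-state Bayes filter, the recursive update underlying both POMDP belief tracking and DBN inference. All probabilities below are made-up for illustration, not the paper's trained model:

```python
# Hidden state: the listener is "attentive" or "distracted".
# Observations: discrete listener responses such as a nod or averted gaze.
# Transition and observation probabilities are illustrative assumptions.

TRANSITION = {  # P(next state | current state)
    "attentive":  {"attentive": 0.9, "distracted": 0.1},
    "distracted": {"attentive": 0.3, "distracted": 0.7},
}
OBSERVATION = {  # P(observation | state)
    "attentive":  {"nod": 0.7, "averted_gaze": 0.3},
    "distracted": {"nod": 0.2, "averted_gaze": 0.8},
}

def bayes_filter(belief, obs):
    """One predict-update step of a discrete Bayes filter."""
    # Predict: propagate the belief through the transition model.
    predicted = {
        s2: sum(belief[s1] * TRANSITION[s1][s2] for s1 in belief)
        for s2 in belief
    }
    # Update: weight by the observation likelihood and normalize.
    unnorm = {s: predicted[s] * OBSERVATION[s][obs] for s in predicted}
    z = sum(unnorm.values())
    return {s: p / z for s, p in unnorm.items()}

belief = {"attentive": 0.5, "distracted": 0.5}
for obs in ["averted_gaze", "averted_gaze", "nod"]:
    belief = bayes_filter(belief, obs)
    print(obs, round(belief["attentive"], 3))
```

Running this, the belief in attentiveness falls after each averted gaze and recovers after a nod; the paper's full models extend this kind of recursive inference with planning (POMDP) on the storyteller side and response generation (DBN) on the listener side.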