{"title":"Pronunciation-Based Child-Robot Game Interactions to Promote Literacy Skills","authors":"Samuel Spaulding, C. Breazeal","doi":"10.1109/HRI.2019.8673296","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673296","url":null,"abstract":"In this paper we present additional results from a prior study of speech-based games to promote early literacy skills through child-robot interaction [6]. The additional data and results support our original conclusion, that pronunciation analysis software can be an effective enabler of speech child-robot interactions. We also include a comparison of other pronunciation services, an updated version of the SpeechAce API and a new technology from Soapbox Labs. We reflect on some lessons learned and introduce a redesigned version of the game interaction called ‘RhymeRacer’ based on the results and observations from both data collections.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"20 1","pages":"554-555"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"74912417","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Promoting STEAM Education with Child-Robot Musical Theater","authors":"Jaclyn A. Barnes, S. M. Fakhrhosseini, Eric Vasey, Joseph D. Ryan, C. Park, M. Jeon","doi":"10.1109/HRI.2019.8673311","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673311","url":null,"abstract":"In an eight-week STEAM education program for elementary school children, kids worked on musical theater projects with a variety of robots. The program included 4 modules about acting, dancing, music & sounds, and drawing. Twenty-five children grades K-5 participated in this program. Children were excited by the program and they demonstrated collaboration and peer-to-peer interactive learning. In the future, we plan to add more robust interaction and more science and engineering experiences to the program. This program is expected to promote STEM education in the informal learning environment by combining it with arts and design.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"13 1","pages":"366-366"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"73133154","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"BabeBay-A Companion Robot for Children Based on Multimodal Affective Computing","authors":"Meimei Zheng, Yingying She, Fang Liu, Jin Chen, Yang Shu, J. Xiahou","doi":"10.1109/HRI.2019.8673163","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673163","url":null,"abstract":"The BabeBay is a children companion robot which has the ability of real-time multimodal affective computing. Accurate and effective affective fusion computing makes BabeBay own adaptability and capability during interaction according to different children in different emotion. Furthermore, the corresponding cognitive computing and robots behavior can be enhanced to personalized companionship.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"1 1","pages":"604-605"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89592507","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Is Human-Robot Interaction More Competitive Between Groups Than Between Individuals?","authors":"Marlena R. Fraune, Steven Sherrin, S. Šabanović, Eliot R. Smith","doi":"10.1109/HRI.2019.8673241","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673241","url":null,"abstract":"As robots, both individually and in groups, become more prevalent in everyday contexts (e.g., schools, workplaces, educational and caregiving institutions), it is possible that they will be perceived as outgroups, or come into competition for resources with humans. Research indicates that some of the psychological effects of intergroup interaction common in humans translate to human-robot interaction (HRI). In this paper, we examine how intergroup competition, like that among humans, translates to HRI. Specifically, we examined how Number of Humans (1, 3) and Number of Robots (1, 3) affect behavioral competition on dilemma tasks and survey ratings of perceived threat, emotion, and motivation (fear, greed, and outperformance). We also examined the effect of perceived group entitativity (i.e., cohesiveness) on competition motivation. Like in social psychological literature, these results indicate that groups of humans (especially entitative groups) showed more greed-based motivation and competition toward robots than individual humans did. However, we did not find evidence that number of robots had an effect on fear-based motivation or competition against them unless the robot groups were perceived as highly entitative. Our data also show the intriguing finding that participants displayed more fear of and competed slightly more against robots that matched their number. Future research should more deeply examine this novel pattern of results compared to one-on-one HRI and typical group dynamics in social psychology.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"11 1","pages":"104-113"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"86563504","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Robots for Learning - R4L: Adaptive Learning","authors":"W. Johal, A. Sandygulova, J. D. Wit, M. Haas, B. Scassellati","doi":"10.1109/HRI.2019.8673109","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673109","url":null,"abstract":"The Robots for Learning workshop series aims at advancing the research topics related to the use of social robots in educational contexts. This year's half-day workshop follows on previous events in Human-Robot Interaction conferences focusing on efforts to design, develop and test new robotics systems that help learners. This 5th edition of the workshop will be dealing in particular on the potential use of robots for adaptive learning. Since the past few years, inclusive education have been a key policy in a number of countries, aiming to provide equal changes and common ground to all. In this workshop, we aim to discuss strategies to design robotics system able to adapt to the learners' abilities, to provide assistance and to demonstrate long-term learning effects.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"48 1","pages":"693-694"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79260312","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Demo: Pointing Gestures for Proximity Interaction","authors":"B. Gromov, Jérôme Guzzi, L. Gambardella, A. Giusti","doi":"10.1109/HRI.2019.8673329","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673329","url":null,"abstract":"We demonstrate a system to control robots in the users proximity with pointing gestures—a natural device that people use all the time to communicate with each other. Our setup consists of a miniature quadrotor Crazyflie 2.0, a wearable inertial measurement unit MetaWearR+ mounted on the user's wrist, and a laptop as the ground control station. The video of this demo is available at https://youtu.be/yafy-HZMk_U [1].","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"15 1","pages":"665-665"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91146151","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"DataDrawingDroid: A Wheel Robot Drawing Planned Path as Data-Driven Generative Art","authors":"Yasuto Nakanishi","doi":"10.1109/HRI.2019.8673122","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673122","url":null,"abstract":"This paper introduces DataDrawingDroid, a wheel robot to visualize data and draw data-driven generative art onto a floor. In our user study, 24 participants watched videos of three types of data drawing. T-tests for the results on five-point Likert scales indicated that it attracted them and suggested the importance of balancing functionality and aesthetics.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"11 1","pages":"536-537"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"72666464","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Infrasound for HRI: A Robot Using Low-Frequency Vibrations to Impact How People Perceive its Actions","authors":"Raquel Thiessen, Daniel J. Rea, Diljot S. Garcha, Cheng Cheng, J. Young","doi":"10.1109/HRI.2019.8673172","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673172","url":null,"abstract":"We investigate robots using infrasound, low-frequency vibrational energy at or near the human hearing threshold, as an interaction tool for working with people. Research in psychology suggests that the presence of infrasound can impact a person's emotional state and mood, even when the person is not acutely aware of the infrasound. Although often not noticed, infrasound is commonly present in many situations including factories, airports, or near motor vehicles. Further, a robot itself can produce infrasound. Thus, we examine if infrasound may impact how people interpret a robot's social communication: if the presence of infrasound makes a robot seem more or less happy, energetic, etc., as a result of impacting a person's mood. We present the results from a series of experiments that investigate how people rate a social robot's emotionally-charged gestures, and how varied levels and sources of infrasound impact these ratings. Our results show that infrasound does have a psychological effect on the person's perception of the robot's behaviors, supporting this as a technique that a robot can use as part of its interaction design toolkit. We further provide a comparison of infrasound generation methods.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"29 1","pages":"11-18"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"81959010","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An Antisocial Social Robot: Using Negative Affect to Reinforce Cooperation in Human-Robot Interactions","authors":"Hideki Garcia Goo, Jaime Alvarez Perez, Virginia Contreras","doi":"10.1109/HRI.2019.8673264","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673264","url":null,"abstract":"Inspired by prior work with robots that physically display positive emotion (e.g., [1]), we were interested to see how people might interact with a robot capable of communicating cues of negative affect such as anger. Based in particular on [2], we have prototyped an anti-social, zoomorphic robot equipped with a spike mechanism to nonverbally communicate anger. The robot's embodiment involves a simple dome-like morphology with a ring of inflatable spikes wrapped around its circumference. Ultrasonic sensors engage the robot's antisocial cuing (e.g., “spiking” when a person comes too close). To evaluate people's perceptions of the robot and the impact of the spike mechanism on their behavior, we plan to deploy the robot in social settings where it would be inappropriate for a person to approach (e.g., in front of a door with a “do not disturb” sign). We expect that exploration of robot antisociality, in addition to prosociality, will help inform the design of more socially complex human-robot interactions.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"29 1","pages":"763-764"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89345302","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Face-to-Face Contact Method for Humanoid Robots Using Face Position Prediction","authors":"Yuki Okafuji, Jun Baba, Junya Nakanishi","doi":"10.1109/HRI.2019.8673175","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673175","url":null,"abstract":"It is an important functional behavior for humanoid robots to have face-to-face contact with humans. We predict future face position to achieve natural behavior that is similar to the communication between people. Robots gaze at a prediction point for reducing mechanical delay. The proposed system for robots to have face-to-face contact can reduce delay.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"77 1","pages":"666-666"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"77404693","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}