{"title":"When you are young, (robot's) looks matter. Developmental changes in the desired properties of a robot friend","authors":"A. Sciutti, F. Rea, G. Sandini","doi":"10.1109/ROMAN.2014.6926313","DOIUrl":"https://doi.org/10.1109/ROMAN.2014.6926313","url":null,"abstract":"Seeing the world through the eyes of a child is always difficult. Designing a robot that might be liked and accepted by young users is therefore particularly complicated. We investigated children's opinions on which features are most important in an interactive robot during a popular science event where we exhibited the iCub humanoid robot to a mixed public of various ages. From observing the participants' reactions to various robot demonstrations and from a dedicated ranking game, we found that children's requirements for a robot companion change noticeably with age. Before 9 years of age, children give more weight to a human-like appearance, while older kids and adults pay more attention to robot action skills. Additionally, the opportunity to see and interact with a robot influences children's judgments, in particular convincing the youngest to also consider a robot's perceptual and motor abilities, rather than just its shape. These results suggest that robot design needs to take into account the different prior beliefs that children and adults may have when they see a robot with a human-like shape.","PeriodicalId":235810,"journal":{"name":"The 23rd IEEE International Symposium on Robot and Human Interactive Communication","volume":"477 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116688547","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"On human performance in tactile language learning and tactile memory","authors":"R. Velázquez, E. Pissaloux","doi":"10.1109/ROMAN.2014.6926236","DOIUrl":"https://doi.org/10.1109/ROMAN.2014.6926236","url":null,"abstract":"This paper reports the findings of an experiment on human performance in tactile language learning and tactile memory. A set of vibrotactile patterns representing verbal words was presented to a group of 20 voluntary subjects. Upon learning, subjects were capable of recognizing the patterns with high accuracy. Patterns were then combined with the aim of constructing sentences that gradually represent more complex ideas. Recognition rates remained satisfactory. A novel tactile stimulation approach was also explored: podotactile stimulation. For this study, a prototype wearable electronic tactile display that stimulates the mechanoreceptors in the foot sole with vibrations was used. The results obtained suggest that it is possible to construct tactile languages that could be useful in human-computer interaction and wearable/mobile computing.","PeriodicalId":235810,"journal":{"name":"The 23rd IEEE International Symposium on Robot and Human Interactive Communication","volume":"72 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115067952","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Consumer perceptions of Interactive Service Robots: A Value-Dominant Logic perspective","authors":"W. Barnett, Adrienne Foos, Thorsten Gruber, D. Keeling, K. Keeling, L. Nasr","doi":"10.1109/ROMAN.2014.6926404","DOIUrl":"https://doi.org/10.1109/ROMAN.2014.6926404","url":null,"abstract":"We propose a “Value-Dominant Logic” approach to complement HRI research by integrating two well-known user-centric methodologies from the field of marketing. From the results of laddering interviews accompanied by a visual projective technique, we show that consumer value perceptions of robots in a retail service environment are paradoxical in nature: behavioral and social norms are expected of the robot, yet not of the user. Our consumer-oriented, value-based approach can contribute to the field of HRI by providing a complementary means of user-centered design, methodology, and requirements gathering, as well as additional multidisciplinary collaborations.","PeriodicalId":235810,"journal":{"name":"The 23rd IEEE International Symposium on Robot and Human Interactive Communication","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115548449","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Experience of using a haptic interface to follow a robot without visual feedback","authors":"Ayan Ghosh, J. Penders, P. Jones, H. Reed","doi":"10.1109/ROMAN.2014.6926274","DOIUrl":"https://doi.org/10.1109/ROMAN.2014.6926274","url":null,"abstract":"Search and rescue operations are often undertaken in smoke-filled and noisy environments in which rescue teams must rely on haptic feedback for navigation and safe exit. In this paper, we discuss designing and evaluating a haptic interface that enables a human being to follow a robot through an environment with no visibility. We first discuss the considerations that have led to our current interface design. The second part of the paper describes our testing procedure and the results of our first tests. Based on these results, we discuss future improvements of our design.","PeriodicalId":235810,"journal":{"name":"The 23rd IEEE International Symposium on Robot and Human Interactive Communication","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114258741","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Negative Attitudes toward minimalistic Robots with intragroup communication styles","authors":"Marlena R. Fraune, S. Šabanović","doi":"10.1109/ROMAN.2014.6926401","DOIUrl":"https://doi.org/10.1109/ROMAN.2014.6926401","url":null,"abstract":"With robots becoming more prevalent in daily life, it is important to understand human attitudes toward robots not only when humans interact with them directly, as most research examines, but also when people are indirectly exposed to robots performing nonsocial tasks (e.g., cleaning) in their vicinity. Because minimalistic robots are at present more likely to be found in households than human-like robots, this study examined human reactions to nonsocial, nonanthropomorphic robots. The specific focus of this study was on how robot communication style during human-robot co-location affects human perceptions of a group of robots. This paper also evaluates the relationship between participants' scores on the Negative Attitudes toward Robots Scale (NARS) and their behavioral response to and perceptions of robots in their environment. Our results suggest that robot communication style did not affect perceptions of robots and that responses on the NARS may not translate directly to behavior toward robots.","PeriodicalId":235810,"journal":{"name":"The 23rd IEEE International Symposium on Robot and Human Interactive Communication","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122117617","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Towards companion robots behaving with style","authors":"W. Johal, S. Pesty, Gaëlle Calvary","doi":"10.1109/ROMAN.2014.6926393","DOIUrl":"https://doi.org/10.1109/ROMAN.2014.6926393","url":null,"abstract":"Sociability of companion robots is one of the challenges that the field of human-robot interaction faces. Inspired by research in psychology and sociology dealing with interpersonal relationships, we aim to render robots capable of behaviour suited to being among humans. In the context of a companion robot for children, we propose different parenting styles (namely authoritative and permissive) and evaluate their effectiveness and their acceptability to parents. We implemented behaviours in different styles, played out by two robots, Nao and Reeti, which communicate through body and facial channels respectively. Ninety-four parents watched videos of the robots and replied to a questionnaire about the authoritativeness, effectiveness and acceptability of the robots. The results showed that robots can be perceived as dominant and authoritative; however, their effectiveness as an authoritative figure is limited to young children and is correlated with the style played when giving an order. When given a choice between authoritative and permissive styles, the parents did not always choose a parenting style similar to their own. This work contributes to formalising the context-dependent personalisation of a children's companion robot to parents' expectations using the concept of styles.","PeriodicalId":235810,"journal":{"name":"The 23rd IEEE International Symposium on Robot and Human Interactive Communication","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129077617","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Robotic gaming companion to facilitate social interaction among children","authors":"Junya Hirose, Masakazu Hirokawa, Kenji Suzuki","doi":"10.1109/ROMAN.2014.6926231","DOIUrl":"https://doi.org/10.1109/ROMAN.2014.6926231","url":null,"abstract":"This study proposes a gaming companion robot to facilitate social interaction among children. Games have long been a popular social communication tool, and we propose a novel approach to gaming communication using a robot. We have implemented a robot that is capable of playing a video game to facilitate social interaction in a game environment. We used autonomous control to allow the robot to play during the video game, and the Wizard of Oz (WOZ) framework for interaction between human and robot before and after the game. The results show that the robot was able to play the game in a manner similar to how a human plays it. A preliminary study with children showed that this new approach was useful for facilitating interaction. We describe and evaluate the robotic system in this paper.","PeriodicalId":235810,"journal":{"name":"The 23rd IEEE International Symposium on Robot and Human Interactive Communication","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124247815","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Online adaptation of dialog strategies based on probabilistic planning","authors":"Steffen Müller, Sina Sprenger, H. Groß","doi":"10.1109/ROMAN.2014.6926333","DOIUrl":"https://doi.org/10.1109/ROMAN.2014.6926333","url":null,"abstract":"In this paper, a dialog modeling approach for long-term interaction between a service robot and a single user is presented, which enables a user-adaptive interaction behavior of the robot. The central element of the dialog system is a probabilistic model of the user's reactions to the robot's behavior, which is learned online and used for a probabilistic planning process based on message passing in a dynamic factor graph. The suggested approach has been applied to implement a complex application on a mobile service robot, which was tested in a 10-day evaluation study with 16 users in order to get feedback on the usability of the interaction design, the adaptation skills, and the feasibility of rapid application development. Results and findings of that study are briefly presented here.","PeriodicalId":235810,"journal":{"name":"The 23rd IEEE International Symposium on Robot and Human Interactive Communication","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114688973","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Upper extremity assist exoskeleton robot","authors":"A. M. Khan, Deok-won Yun, Jung-Soo Han, K. Shin, Chang-Soo Han","doi":"10.1109/ROMAN.2014.6926366","DOIUrl":"https://doi.org/10.1109/ROMAN.2014.6926366","url":null,"abstract":"The need for robots that support the human body's posture has pushed researchers toward dexterous designs of exoskeleton robots. This requires quantitative techniques to assess motor function and to generate commands that let the robot act in accordance with the complex structure of the human body. In this paper, we focus on developing a new technique for an upper-limb power exoskeleton in which the load is handled by the human subject and not by the robot. The main challenge, along with the design complexity, is to identify the desired human motion intention and to develop an algorithm that provides assistance as needed. For this purpose, we used a newly developed Muscle Circumference Sensor (MCS) instead of electromyogram (EMG) sensors. The MCS, together with load cells, is used to estimate the desired human intention, from which the desired trajectory is generated. The desired trajectory is then tracked by a passivity-based adaptive control technique. The developed upper-limb power exoskeleton has seven degrees of freedom (DOF), of which five are passive and two are active. The active joints, the shoulder and elbow, are powered by electric motors and move in the sagittal plane, while abduction and adduction of the shoulder joint are provided by a passive joint. The performance of the exoskeleton was evaluated experimentally with a neurologically intact subject. The results show that, after adjusting the motion intention recognition algorithm for the subject, the robot assisted effectively and the subject felt only a nominal load regardless of the weight in hand.","PeriodicalId":235810,"journal":{"name":"The 23rd IEEE International Symposium on Robot and Human Interactive Communication","volume":"33 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126509220","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Accessible interfaces for robot assistants","authors":"Daniel A. Lazewatsky, W. Smart","doi":"10.1109/ROMAN.2014.6926238","DOIUrl":"https://doi.org/10.1109/ROMAN.2014.6926238","url":null,"abstract":"Currently, high-level task control of robots is generally performed by using a graphical interface on a desktop or laptop computer. This type of mediated interaction is not natural, and can be problematic and cumbersome for persons with certain types of motor disabilities, and for people interacting with the robot when there are no computer displays present. In this work, we present a framework which enables the removal of such obvious intermediary devices and allows users to assign tasks to robots using interfaces embedded directly in the world, by projecting these interfaces directly onto surfaces and objects. We describe the implementation of the projected interface framework, and give several examples of tasks which can be performed with such an interface.","PeriodicalId":235810,"journal":{"name":"The 23rd IEEE International Symposium on Robot and Human Interactive Communication","volume":"21 2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130803568","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}