Crowdsourcing human-robot interaction: Application from virtual to physical worlds
S. Chernova, N. DePalma, Elisabeth Morant, C. Breazeal
2011 RO-MAN. Published: 2011-08-30. DOI: 10.1109/ROMAN.2011.6005284
Abstract: The ability for robots to engage in interactive behavior with a broad range of people is critical for the future development of social robotic applications. In this paper, we propose the use of online games as a means of generating large-scale data corpora for human-robot interaction research in order to create robust and diverse interaction models. We describe a data collection approach based on a multiplayer game that was used to collect movement, action and dialog data from hundreds of online users. We then study how these records of human-human interaction collected in a virtual world can be used to generate contextually correct social and task-oriented behaviors for a robot collaborating with a human in a similar real-world environment. We evaluate the resulting behavior model using a physical robot in the Boston Museum of Science, and show that the robot successfully performs the collaborative task and that its behavior is strongly influenced by patterns in the crowdsourced dataset.
Blinking light patterns as artificial subtle expressions in human-robot speech interaction
Kazuki Kobayashi, Kotaro Funakoshi, S. Yamada, Mikio Nakano, T. Komatsu, Yasunori Saito
2011 RO-MAN. Published: 2011-08-30. DOI: 10.1109/ROMAN.2011.6005289
Abstract: Users' impressions of blinking light expressions used as artificial subtle expressions were investigated. In a preliminary experiment, thirteen blinking patterns were used to assess participants' impressions of their agreeableness, and the highest- and lowest-rated patterns were selected for a speech interaction experiment. In that experiment, 52 participants tried to reserve hotel rooms with a spoken dialogue system coupled with an interface robot that used a blinking light expression. A sine wave, a random wave, a rectangular wave, and a no-blinking condition were used as artificial subtle expressions of the robot's internal state of "processing" or "recognizing". Questionnaire results showed no significant difference in agreeableness across conditions, but the sine wave and the rectangular wave were rated as more useful than the no-blinking condition. Factor analyses suggested that the rectangular wave gives a comfortable impression of the dialogue.
Composition of behaviour primitives for entertainment humanoid robots
A. Salerno, Fabio Viziano, F. Mastrogiovanni, A. Sgorbissa, R. Zaccaria
2011 RO-MAN. Published: 2011-08-30. DOI: 10.1109/ROMAN.2011.6005267
Abstract: This article introduces a model for representing motion primitives for entertainment humanoid robots using Generalized Hierarchical AND/OR graphs. On the one hand, the goal is to drive robots with scripts, as if they were on a stage. On the other hand, the approach allows a minimal set of behaviours to be stored, thereby reducing on-board memory and computational requirements. Standard ontology-based reasoning mechanisms operate on this representation in a fully hierarchical fashion. Experimental validation was carried out using Kondo toy robots.
Human-like action segmentation for option learning
Jaeeun Shim, A. Thomaz
2011 RO-MAN. Published: 2011-08-30. DOI: 10.1109/ROMAN.2011.6005277
Abstract: Interactive robot learning with a human partner raises several open questions, one of which is how to increase the efficiency of learning. One approach to this problem in the reinforcement learning domain is to use options (temporally extended actions) instead of primitive actions. In this paper, we aim to develop a robot system that can discover meaningful options from observations of human use of low-level primitive actions. Our approach is inspired by psychological findings on human action parsing, which suggest that people attend to low-level statistical regularities when determining action boundaries. We implement a human-like action segmentation system for automatic option discovery, and our evaluation shows that option-based learning converges to the optimal solution faster than primitive-action-based learning.
Following route graphs in urban environments
R. D. Nijs, Miguel Juliá, N. Mitsou, Barbara Gonsior, D. Wollherr, K. Kühnlenz, M. Buss
2011 RO-MAN. Published: 2011-08-30. DOI: 10.1109/ROMAN.2011.6005196
Abstract: In this paper, an approach is presented that allows a robot to navigate an urban environment by following natural language route instructions. In this situation, neither maps nor GPS information are available to the robot, so it must rely solely on the human-given route description and the observations from its sensors. An architecture is presented for solving the problems that arise in this setting, such as navigating on the sidewalk, inferring street direction, and labeling the environment. Our initial experiments indicate that the proposed methods enable a robot to safely navigate urban environments by following abstract route descriptions and to reach previously unknown points in a city.
Real-time 3D hand gesture interaction with a robot for understanding directions from humans
M. Bergh, Daniel Carton, R. D. Nijs, N. Mitsou, Christian Landsiedel, K. Kühnlenz, D. Wollherr, L. Gool, M. Buss
2011 RO-MAN. Published: 2011-08-30. DOI: 10.1109/ROMAN.2011.6005195
Abstract: This paper presents a real-time hand gesture recognition algorithm based on the inexpensive Kinect sensor. The use of a depth sensor allows for complex 3D gestures, and the system is robust to distracting objects or persons in the background. A Haarlet-based hand gesture recognition system detects hand gestures in any orientation, in particular pointing gestures, while extracting the 3D pointing direction. The system is integrated on an interactive robot (based on ROS), allowing for real-time hand gesture interaction with the robot. Pointing gestures are translated into navigation goals that tell the robot where to go. A demo scenario is presented in which the robot looks for persons to interact with, asks for directions, and detects a 3D pointing direction. The robot then explores its vicinity in the given direction and looks for a new person to interact with.
MOTORE: A mobile haptic interface for neuro-rehabilitation
C. Avizzano, Massimo Satler, G. Cappiello, Andrea Scoglio, E. Ruffaldi, M. Bergamasco
2011 RO-MAN. Published: 2011-08-30. DOI: 10.1109/ROMAN.2011.6005238
Abstract: This paper introduces a novel type of haptic interface that is fully portable and employs only onboard sensors and electronics for accurate localization and force-feedback generation. The device offers 2-DOF force control while sliding on a plane, maintaining an orientation comfortable for the user, and generates force feedback without any intermediate link between the motion wheels and the grasping handle. The device was designed for use in neuro-rehabilitation protocols and adopts specific mechanical, electrical and control solutions to cope with patient requirements. The paper describes the mechanical and electronic solutions as well as the most relevant control implementation issues addressed in the system design.
Investigation of brain activity during interaction with seal robot by fNIRS
Yukitaka Kawaguchi, K. Wada, M. Okamoto, T. Tsujii, T. Shibata, K. Sakatani
2011 RO-MAN. Published: 2011-08-30. DOI: 10.1109/ROMAN.2011.6005290
Abstract: Brain activity during interaction with the seal robot "Paro" was measured by functional near-infrared spectroscopy (fNIRS). The pilot study comprised two experimental tasks, interaction with Paro switched on (Paro ON) and switched off (Paro OFF), each with one minute of rest before and after. The results showed significant activation around the pre-motor area and the supplementary motor area (SMA) during the Paro OFF task compared with the pre-task rest condition. These areas are thought to be involved in initiating action, which suggests that when participants touched and interacted with the switched-off Paro, they may have done so deliberately. In contrast, there was significantly increased activation on both sides of the Sylvian fissure during the Paro ON task compared with the pre-task rest condition. This suggests that when participants interacted with the switched-on Paro, they may have recognized its emotional gesture expressions and communicated with it naturally, without a deliberate intention to interact.
An approach to promote social and communication behaviors in children with autism spectrum disorders: Robot based intervention
S. Costa, F. Soares, C. Santos, Manuel J. Ferreira, M. F. Moreira, Ana Paula Pereira Vieira, Fernanda Cunha
2011 RO-MAN. Published: 2011-08-30. DOI: 10.1109/ROMAN.2011.6005244
Abstract: Most people with autism have difficulty developing social behavior and tend to withdraw into their own world. This study aims to improve the social life of children with autism, with a main focus on promoting their social interaction and communication. It is necessary to capture the children's attention and encourage their collaboration; here a robot, LEGO Mindstorms, acts as a mediator and promoter of this interaction. A set of experiments involving sharing objects and fulfilling simple orders, carried out by an 11-year-old child with autism during daily routine work and in games with the robot, is described. The generalization of the acquired skills to new contexts and environments is also tested, and the outcomes of the experiments are reported.
A friendly gesture: Investigating the effect of multimodal robot behavior in human-robot interaction
Maha Salem, K. Rohlfing, S. Kopp, F. Joublin
2011 RO-MAN. Published: 2011-08-30. DOI: 10.1109/ROMAN.2011.6005285
Abstract: Gesture is an important feature of social interaction, frequently used by human speakers to illustrate what speech alone cannot convey, e.g. referential, spatial or iconic information. Accordingly, humanoid robots intended to engage in natural human-robot interaction should produce speech-accompanying gestures for comprehensible and believable behavior. But how does a robot's non-verbal behavior influence human evaluation of communication quality and of the robot itself? To address this research question we conducted two experimental studies. Using the Honda humanoid robot, we investigated how humans perceive various gestural patterns performed by the robot as they interact in a situational context. Our findings suggest that the robot is evaluated more positively when non-verbal behaviors such as hand and arm gestures are displayed along with speech. This effect was amplified when participants were explicitly asked to direct their attention toward the robot during the interaction.