{"title":"Gestalt-based action segmentation for robot task learning","authors":"M. Pardowitz, R. Haschke, Jochen J. Steil, H. Ritter","doi":"10.1109/ICHR.2008.4756003","DOIUrl":"https://doi.org/10.1109/ICHR.2008.4756003","url":null,"abstract":"In programming by demonstration (PbD) systems, the problems of task segmentation and task decomposition have received comparatively little attention. In this article we propose a method that draws on psychological gestalt theories originally developed for visual perception and applies them to the domain of action segmentation. We propose a computational model for gestalt-based segmentation called the competitive layer model (CLM). The CLM relies on features mutually supporting or inhibiting each other to form segments by competition. We analyze how gestalt laws for actions can be learned from human demonstrations and how they can benefit the CLM segmentation method. We validate our approach with two experiments on action sequences and present the results obtained.","PeriodicalId":402020,"journal":{"name":"Humanoids 2008 - 8th IEEE-RAS International Conference on Humanoid Robots","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129177612","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Kinematic and dynamic analogies between planar biped robots and the reaction mass pendulum (RMP) model","authors":"A. Goswami","doi":"10.1109/ICHR.2008.4755943","DOIUrl":"https://doi.org/10.1109/ICHR.2008.4755943","url":null,"abstract":"In order to simplify dynamic analysis, humanoid robots are often abstracted with various versions of the inverted pendulum model. However, most of these models do not explicitly characterize the robot's rotational inertia, a critical component of its dynamics and especially of its balance. To remedy this, we earlier introduced the reaction mass pendulum (RMP), an extension of the inverted pendulum, which models the rotational inertia and angular momentum of a robot through its centroidal composite rigid body (CCRB) inertia. However, we presented only the kinematic mapping between a robot and its corresponding RMP. Focusing in depth on planar mechanisms, here we derive the dynamic equations of the RMP and explicitly compute the parameters it must possess in order to establish equivalence with a planar compass-gait robot. In particular, we show that (a) an angular momentum equality between the robot and the RMP does not necessarily guarantee kinetic energy equality, and (b) a cyclic robot gait may not result in a cyclic RMP movement. The work raises the broader question of how quantitatively similar the simpler models of a humanoid robot must be in order to be of practical use.","PeriodicalId":402020,"journal":{"name":"Humanoids 2008 - 8th IEEE-RAS International Conference on Humanoid Robots","volume":"40 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127675145","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A control law for human like walking biped robot SHERPA based on a control and a ballistic phase - application on the cart-table model","authors":"Marc Bachelier, A. Chemori, S. Krut","doi":"10.1109/ICHR.2008.4755953","DOIUrl":"https://doi.org/10.1109/ICHR.2008.4755953","url":null,"abstract":"This work proposes a new control approach for biped walking robots. Its purpose is to make human-like robots walk more smoothly and more energy-efficiently. It is based on the decomposition of a step into two phases: a control phase that prepares a ballistic phase. As a first step towards more complex studies, the tools are simple and efficient: a Lagrangian model, Newton's impact law, non-linear quadratic optimization for trajectory planning, and partial feedback linearization for trajectory tracking. Although the final prototype will be the biped robot SHERPA, this control law has been implemented and tested on a simpler system: the cart-table. Numerous simulation results are presented with two concrete examples.","PeriodicalId":402020,"journal":{"name":"Humanoids 2008 - 8th IEEE-RAS International Conference on Humanoid Robots","volume":"59 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124548919","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Human-humanoid interaction by an intentional system","authors":"Ignazio Infantino, C. Lodato, S. Lopes, Filippo Vella","doi":"10.1109/ICHR.2008.4756007","DOIUrl":"https://doi.org/10.1109/ICHR.2008.4756007","url":null,"abstract":"The paper describes a framework for developing an intentional vision system oriented to human-humanoid interaction. Such a system is able to recognize user faces and to recognize and track human postures through visual perception. The framework is organized in two modules mapped onto the corresponding outputs: intentional perception of faces, and intentional perception of human body movements. Moreover, a possible integration of the intentional vision module into a complete cognitive architecture is proposed, and knowledge management and reasoning are enabled by a suitable OWL-DL ontology. In particular, the ontological knowledge approach is employed for understanding human behaviour and expressions, while stored user habits are used to build a semantically meaningful structure for perceiving human intentions. A semantic description of user intentions is formulated in terms of the symbolic features produced by the intentional vision system. Sequences of symbolic features belonging to a domain-specific ontology are employed to infer human intentions and to perform suitable actions.","PeriodicalId":402020,"journal":{"name":"Humanoids 2008 - 8th IEEE-RAS International Conference on Humanoid Robots","volume":"100 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117331006","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A beat-tracking robot for human-robot interaction and its evaluation","authors":"Kazumasa Murata, K. Nakadai, Ryu Takeda, HIroshi G. Okuno, Toyotaka Torii, Yuji Hasegawa, H. Tsujino","doi":"10.1109/ICHR.2008.4755935","DOIUrl":"https://doi.org/10.1109/ICHR.2008.4755935","url":null,"abstract":"Human-robot interaction through music in real environments is essential for humanoids, because such a robot makes interaction enjoyable for people. We thus developed a beat-tracking robot which steps, sings, and scats in time with musical beats predicted using a robot-embedded microphone, as a first step towards a robot that can hold a music session with people. This paper first describes the beat-tracking robot and then evaluates it in detail on three points: adaptation to tempo changes; robustness to environmental noise, including the periodic noise generated by stepping, singing, and scatting; and human-robot interaction using a clapping sound. The results showed that our beat-tracking robot drastically improves noise robustness and adaptation to tempo changes, so that it can hold a simple sound session with people.","PeriodicalId":402020,"journal":{"name":"Humanoids 2008 - 8th IEEE-RAS International Conference on Humanoid Robots","volume":"72 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115356332","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Multi-modal integration for personalized conversation: Towards a humanoid in daily life","authors":"S. Fujie, D. Watanabe, Yuhi Ichikawa, Hikaru Taniyama, Kosuke Hosoya, Yoichi Matsuyama, Tetsunori Kobayashi","doi":"10.1109/ICHR.2008.4756014","DOIUrl":"https://doi.org/10.1109/ICHR.2008.4756014","url":null,"abstract":"A humanoid with spoken-language communication ability is proposed and developed. To make a humanoid live with people, spoken-language communication is fundamental, because we use this kind of communication every day. However, owing to the difficulties of speech recognition itself and of its implementation on a robot, a robot with such an ability has not yet been developed. In this study, we propose a robot that implements techniques to overcome these problems. The proposed system includes three key features: image processing, sound-source separation, and turn-taking timing control. Processing the images captured by the cameras mounted in the robot's eyes enables the robot to find and identify the person it should talk to. Sound-source separation enables distant speech recognition, so that people need no special device such as a head-set microphone. Turn-taking timing control is often lacking in conventional spoken dialogue systems, but it is fundamental because conversation proceeds in real time. The effectiveness of these elements, as well as an example conversation, is shown in experiments.","PeriodicalId":402020,"journal":{"name":"Humanoids 2008 - 8th IEEE-RAS International Conference on Humanoid Robots","volume":"17 1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132348975","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Study on humanoid robot systems: An energy approach","authors":"L. Michieli, F. Nori, A. P. Prato, G. Sandini","doi":"10.1109/ICHR.2008.4755948","DOIUrl":"https://doi.org/10.1109/ICHR.2008.4755948","url":null,"abstract":"This work concerns the energy analysis of a humanoid robotic arm, seen as a complex energy chain. The problem of energy efficiency in robotics is becoming crucial in order to make robots achieve an increasing number of tasks in cooperation with humans or in place of them. Our approach consists in representing the humanoid robot as an isolated energy system. We developed a simulation platform suitable for modelling the kinematics, dynamics, and energy balances of a real humanoid robotic arm. The model was validated by careful comparison with the real robot. We then performed a first comparative study of the motion dynamics of the simulated robot arm and of the energy flows crossing its energy converters, under a set of different motion control strategies. Moreover, we conducted a preliminary investigation into the possibility of saving and recovering energy during robot motion.","PeriodicalId":402020,"journal":{"name":"Humanoids 2008 - 8th IEEE-RAS International Conference on Humanoid Robots","volume":"69 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131239538","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Manipulation planning with caging grasps","authors":"Rosen Diankov, S. Srinivasa, D. Ferguson, J. Kuffner","doi":"10.1109/ICHR.2008.4755966","DOIUrl":"https://doi.org/10.1109/ICHR.2008.4755966","url":null,"abstract":"We present a novel motion planning algorithm for performing constrained tasks such as opening doors and drawers with robots such as humanoid robots or mobile manipulators. Previous work on constrained manipulation transfers the rigid constraints imposed by the target object's motion directly into the robot configuration space. This often unnecessarily restricts the allowable robot motion, which can prevent the robot from performing even simple tasks, particularly if the robot has limited reachability or a low number of joints. Our method computes \"caging grasps\" specific to the object and uses efficient search algorithms to produce motion plans that satisfy the task constraints. Our technique significantly increases the range of possible robot motions by not enforcing rigid constraints between the end-effector and the target object. We illustrate our approach with experimental results and examples running on two robot platforms.","PeriodicalId":402020,"journal":{"name":"Humanoids 2008 - 8th IEEE-RAS International Conference on Humanoid Robots","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131295554","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A new anthropomorphic robotic hand","authors":"I. Gaiser, S. Schulz, A. Kargov, H. Klosek, A. Bierbaum, C. Pylatiuk, R. Oberle, T. Werner, T. Asfour, G. Bretthauer, R. Dillmann","doi":"10.1109/ICHR.2008.4755987","DOIUrl":"https://doi.org/10.1109/ICHR.2008.4755987","url":null,"abstract":"This paper presents the new robotic FRH-4 hand. The FRH-4 hand constitutes a new hybrid concept combining an anthropomorphic five-fingered hand and a three-jaw robotic gripper. The hand has a humanoid appearance while maintaining the precision of a robotic gripper. Since it is actuated with flexible fluidic actuators, it exhibits an excellent power-to-weight ratio. These elastic actuators also ensure that the hand is safe for interacting with humans. In order to fully control the joints, the hand is equipped with position sensors on all 11 joints. It is also fitted with tactile sensors based on cursor-navigation sensor elements, which provide grasp feedback and enable exploration.","PeriodicalId":402020,"journal":{"name":"Humanoids 2008 - 8th IEEE-RAS International Conference on Humanoid Robots","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125327982","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Grasping and guiding a human with a humanoid robot","authors":"N. Gorges, A. Schmid, Dirk Göger, H. Wörn","doi":"10.1109/ICHR.2008.4756015","DOIUrl":"https://doi.org/10.1109/ICHR.2008.4756015","url":null,"abstract":"This paper presents a novel approach to tightly-coupled human-robot interaction in which a robot actively grasps and guides a human being. We propose a multi-stage procedure that particularly considers exceptional conditions, allowing the human to quit the interaction at any stage. The system combines different sensor modalities to supervise the grasping and guiding procedure and to guarantee safe human-robot interaction. In particular, a capacitive sensor in the palm of the robot hand allows contactless detection of the human during the approach. Finally, experimental results are presented.","PeriodicalId":402020,"journal":{"name":"Humanoids 2008 - 8th IEEE-RAS International Conference on Humanoid Robots","volume":"17 6","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"113976365","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}