{"title":"Efficient body part tracking using ridge data and data pruning","authors":"Yeonho Kim, Daijin Kim","doi":"10.1109/HUMANOIDS.2015.7363523","DOIUrl":"https://doi.org/10.1109/HUMANOIDS.2015.7363523","url":null,"abstract":"This paper proposes a model-based human pose estimation from a sequence of monocular depth images using ridge data and data pruning. The proposed method uses the ridge data that is defined as the local maxima in the distance map because it estimates the human pose robustly and fast due to its selective representation of body skeletons. The proposed method performs four functional subtasks sequentially: (1) it segments human depth silhouettes from depth images by executing floor removal, object segmentation, human detection and human identification, (2) it extracts ridge data from each segmented human depth silhouette by finding the local maxima over the distance map, (3) it generates initial human model parameters such as the lengths between two neighboring joints, and (4) it estimates the human pose by tracking the body joints in a hierarchical order of head, torso, and limbs and pruning illegal ridge data based on the joint length constraints. In pose estimation experiments on the benchmark dataset, SMMC-10, the proposed method achieved 0.9671 mean Average Precision (mAP) and 280 frames per second (fps). The experimental results over the SMMC-10 dataset show that the proposed method estimates the human pose fast and tracks the body joints accurately under various self-occlusion and fast moving condition.","PeriodicalId":417686,"journal":{"name":"2015 IEEE-RAS 15th International Conference on Humanoid Robots (Humanoids)","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126427226","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An investigation into the social acceptance of using contact for inducing an obstructing human","authors":"Moondeep C. Shrestha, Ayano Kobayashi, Tomoya Onishi, Erika Uno, Hayato Yanagawa, Yuta Yokoyama, Mitsuhiro Kamezaki, A. Schmitz, S. Sugano","doi":"10.1109/HUMANOIDS.2015.7363482","DOIUrl":"https://doi.org/10.1109/HUMANOIDS.2015.7363482","url":null,"abstract":"In densely populated scenarios with cramped spaces, it is very difficult to achieve safe and efficient navigation without cooperation from humans. One way in which we can seek cooperation from humans is by using contact to influence them to give way. However, such a method may incur certain psychological implications and therefore requires an acceptability check to ensure whether such action is acceptable or not. For this purpose, we investigate the participant's subjective response towards robot-initiated touch during the course of navigation. We conducted a 2 (robotic experience vs. none) x 2 (warning vs. none) between-subject experiment with 44 people in which a mobile robotic platform exerted contact on an unaware and obstructing participant to make way towards its goal. Our results show that prior experience with robots produces slightly better response even though the results are not statistically significant. However, a verbal warning prior to contact yielded much more favorable response. In general, the participants did not find contact to be uncomfortable and were not opposed to robot-initiated contact if deemed necessary.","PeriodicalId":417686,"journal":{"name":"2015 IEEE-RAS 15th International Conference on Humanoid Robots (Humanoids)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125971063","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Control strategies for a humanoid robot to drive and then egress a utility vehicle for remote approach","authors":"Hyobin Jeong, Jaesung Oh, Mingeuk Kim, Kyungdon Joo, In-So Kweon, Jun-Ho Oh","doi":"10.1109/HUMANOIDS.2015.7363447","DOIUrl":"https://doi.org/10.1109/HUMANOIDS.2015.7363447","url":null,"abstract":"This paper proposes strategies for the driving and egress of a vehicle with a humanoid robot. To drive the vehicle, the RANSAC method was used to detect obstacles, and the Wagon model was used to control the steering and velocity of the vehicle with only a limited number of sensors which were installed on the humanoid robot. Additionally, a manual tele-operating method was used with the lane projection technique. For the egress motion, gain override and the Cartesian position/force control technique were used to interact with the vehicle structure. To overcome the disadvantages of a highly geared manipulator, a special technique was used that included modelled friction compensation and a non-complementary switching mode. DRC-HUBO+ used the proposed method to perform a vehicle driving and egress task in the DRC finals 2015.","PeriodicalId":417686,"journal":{"name":"2015 IEEE-RAS 15th International Conference on Humanoid Robots (Humanoids)","volume":"168 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121714669","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"See what i mean-Probabilistic optimization of robot pointing gestures","authors":"Khurram Gulzar, V. Kyrki","doi":"10.1109/HUMANOIDS.2015.7363484","DOIUrl":"https://doi.org/10.1109/HUMANOIDS.2015.7363484","url":null,"abstract":"Humans use gestures such as pointing extensively in order to anchor linguistic expressions to objects in the physical world. Similarly gestures can be valuable in decentralized robotic systems, allowing communication between agents and transfer of symbolic meanings. Pointing gestures are especially valuable in crowded scenes where multiple possible matches are present. However, pointing in crowded scenes can itself remain ambiguous if the pointing direction is not carefully chosen. This paper proposes a probabilistic model for pointing and gesture detection accuracy. The model allows planning optimal pointing actions by minimizing the probability of pointing errors due to ambiguities and limited accuracy. We also describe how to measure the accuracy of an agent's pointing gesture and to calibrate the model for that agent. Experimental results suggest that the proposed model captures the qualitative behavior of pointing success well.","PeriodicalId":417686,"journal":{"name":"2015 IEEE-RAS 15th International Conference on Humanoid Robots (Humanoids)","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132504211","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Independent joint learning in practice: Local error estimates to improve inverse dynamics control","authors":"Ken Caluwaerts, Jochen J. Steil","doi":"10.1109/HUMANOIDS.2015.7363439","DOIUrl":"https://doi.org/10.1109/HUMANOIDS.2015.7363439","url":null,"abstract":"Independent Joint Learning (IJL) was recently introduced as a learning-based approach to account for inverse dynamics (ID) model errors. The fundamental idea is to combine an ID model with learned torque error estimators that only rely on joint-local information. This approach improves task-to-task generalization and reduces learning times as each torque error estimators depends only on the state of a single joint instead of the global configuration. Herein, we adapt the IJL method to a real robotic platform, namely the COMAN compliant humanoid robot. We test the algorithm under different loading conditions in open and closed loop control (PD, forward non-linear control, and ID control). In our implementation, IJL becomes a flexible component that fits in between the output of an existing computed-torque controller and low-level motor drivers. Our results show that IJL reduces torque estimation errors in the open loop case and improves tracking performance in the closed loop case. Under varying loading conditions, IJL's performance is on par with and in some cases exceeds the adapted model (i.e. a modified ID model with updated inertial parameters). Finally, the compartmented design and limited number of assumptions of the algorithm allow it to be easily integrated into existing platforms.","PeriodicalId":417686,"journal":{"name":"2015 IEEE-RAS 15th International Conference on Humanoid Robots (Humanoids)","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133885325","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Bio-inspired walking for humanoid robots using feet with human-like compliance and neuromuscular control","authors":"L. Colasanto, N. V. D. Noot, A. Ijspeert","doi":"10.1109/HUMANOIDS.2015.7363518","DOIUrl":"https://doi.org/10.1109/HUMANOIDS.2015.7363518","url":null,"abstract":"The human foot plays a key role in human walking providing, among others, body support and propulsion, stability of the movement and impact absorption. These fundamental functionalities are accomplished by an extraordinarily rich bio-mechanical design. Nonetheless, humanoid robots follow different approaches to walk, hence, they generally implement rigid feet. In this study, we target the gap existing between the human foot and traditional humanoid-robot feet. More specifically, we evaluate the resulting advantages and draw-backs by implementing on a humanoid robot some of the properties and functionalities embedded in the human foot. To this end, we extract the physical characteristics of a prosthetic foot to develop a human-like foot model. This foot model is systematically tested in simulation in human-like walking tasks on flat ground and on uneven terrain. The movement of the limbs is generated by a muscle-reflex controller based on a simplified model of the human limbs. The gait features and the walking stability are evaluated for the human-like foot and compared with the results produced using rigid feet.","PeriodicalId":417686,"journal":{"name":"2015 IEEE-RAS 15th International Conference on Humanoid Robots (Humanoids)","volume":"73 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132841502","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Learning and reproduction of valence-related communicative gesture","authors":"Ju-Hwan Seo, Jeong-Yean Yang, D. Kwon","doi":"10.1109/HUMANOIDS.2015.7363541","DOIUrl":"https://doi.org/10.1109/HUMANOIDS.2015.7363541","url":null,"abstract":"This paper proposes a robotic system capable of learning and reproducing robot gestures based on the Learning by Demonstration (LbD) approach. We focused on those gestures that are used for communicative purposes in human-human interaction. These gestures appear in various motions and this variation causes a delicate difference in the meaning and feeling that is delivered. While some (psychology and ethology) studies have shown that these variations are related to factors such as emotion, intimacy, and intensity, the best way to achieve robotic learning of these variations to allow for the reproduction of these motions remains unclear. With this motivation, we used the term `valence' from psychology as a causal factor and tried to build a system capable of representing and learning relations between `valence' factor and motion variation. Though there are many variations, we especially focus on the number of repetitions in this work. The system can segment a given motion into a set of unit motions by using states constructed by Gaussian Mixture Model(GMM) and Bayesian Network(BN) model is used to represent transition probabilities between states. In the model, transition probabilities are affected by `valence' value and appropriate motion corresponding to given `valence' value can be reproduced. Proposed system is applied to waving-hand motion of humanoid robot DARwIn-OP and we evaluate the validity of the system.","PeriodicalId":417686,"journal":{"name":"2015 IEEE-RAS 15th International Conference on Humanoid Robots (Humanoids)","volume":"60 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134486800","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Adaptation of bimanual assembly tasks using iterative learning framework","authors":"Nejc Likar, B. Nemec, L. Žlajpah, Shingo Ando, A. Ude","doi":"10.1109/HUMANOIDS.2015.7363457","DOIUrl":"https://doi.org/10.1109/HUMANOIDS.2015.7363457","url":null,"abstract":"The paper deals with the adaptation of bimanual assembly tasks. First, the desired policy is shown by human demonstration using kinesthetic guidance, where both trajectories and interaction forces are captured. Captured entities are portioned to absolute and relative coordinates. During the execution, small discrepancies in object geometry as well as the influence of an imperfect control can result in large contact forces. Force control can diminish the above mentioned problems only to some extent. Therefore, we propose a framework that iteratively modifies the original demonstrated trajectory in order to increase the performance of the typical assembly tasks. The approach is validated on bimanual peg in a hole task using two KUKA LWR robots.","PeriodicalId":417686,"journal":{"name":"2015 IEEE-RAS 15th International Conference on Humanoid Robots (Humanoids)","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114340094","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An optimal control approach to reconstruct human gait dynamics from kinematic data","authors":"Martin L. Felis, K. Mombaur, A. Berthoz","doi":"10.1109/HUMANOIDS.2015.7363490","DOIUrl":"https://doi.org/10.1109/HUMANOIDS.2015.7363490","url":null,"abstract":"A common approach to record full-body human movement data is by using marker based motion capture systems. To obtain dynamic gait data such as joint torques and ground reaction forces additional measurement devices have to be employed that pose restrictions on where feet have to be placed during the recording. In this paper we use articulated rigid multibody models and optimal control methods to recover dynamic gait data solely from kinematic data. Our approach is independent from the used marker set and creates the rigid multibody model and computes all controls for the model such that when applied to the model, it closely reproduces the originally recorded motion. To achieve this there are two steps involved: i) create a subject-specific rigid multibody model of the recorded person and used marker set and compute the joint kinematics using inverse kinematics. ii) reconstruct the gait dynamics by solving an optimal control problem. For step i) we created a parameterize human model HEIMAN and a graphical user interface PUPPETEER that facilitates creation of the subject specific model and the motion capture mapping. For ii) we use MUSCOD-II, which implements the direct multiple-shooting method. We apply our method on 15 emotional human walking motions to compare joint angle and torque patterns of different emotions.","PeriodicalId":417686,"journal":{"name":"2015 IEEE-RAS 15th International Conference on Humanoid Robots (Humanoids)","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114730157","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Low-cost force sensors for small size humanoid robot","authors":"G. Passault, Q. Rouxel, L. Hofer, S. N'Guyen, O. Ly","doi":"10.1109/HUMANOIDS.2015.7363498","DOIUrl":"https://doi.org/10.1109/HUMANOIDS.2015.7363498","url":null,"abstract":"Summary form only given. We present a new design of foot pressure sensors in the context of small low-cost robots. We design the robot in the context of RoboCup kid-size humanoid league (i.e. the robot size must be lower than 90 cm). A new challenge is to walk on artificial grass of 3 cm height, which essentially means a soft irregular floor. In this context, one cannot use full 6-axis force sensors for cost and mechanical integration reasons. Classically, FSR sensors are used to handle foot pressure with unclear efficiency. Instead of that, we propose to use low cost strain gauge originally designed for scales. The strain gauge is made of resistors network glued on a aluminium beam. In our design, the aluminium beams are themselves full parts of the mechanical structure of the foot. Therefore, we measure directly the deformation of the foot and thus the force applied on it in a robust way. We use this system in order to measure the position of the centre of pressure under the foot, in order to balance motor primitives of the robot: perturbation rejection, locomotion, shoot. First experiments are promising and we plan to set up a rigorous comparison with the ZMP.","PeriodicalId":417686,"journal":{"name":"2015 IEEE-RAS 15th International Conference on Humanoid Robots (Humanoids)","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114904344","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}