{"title":"Robust Affordable 3D Haptic Sensation via Learning Deformation Patterns","authors":"Huanbo Sun, G. Martius","doi":"10.1109/HUMANOIDS.2018.8625064","DOIUrl":"https://doi.org/10.1109/HUMANOIDS.2018.8625064","url":null,"abstract":"Haptic sensation is an important modality for interacting with the real world. This paper proposes a general framework of inferring haptic forces on the surface of a 3D structure from internal deformations using a small number of physical sensors instead of employing dense sensor arrays. Using machine learning techniques, we optimize the sensor number and their placement and are able to obtain high-precision force inference for a robotic limb using as few as 9 sensors. For the optimal and sparse placement of the measurement units (strain gauges), we employ data-driven methods based on data obtained by finite element simulation. We compare data-driven approaches with model-based methods relying on geometric distance and information criteria such as Entropy and Mutual Information. We validate our approach on a modified limb of the “Poppy” robot [1] and obtain 8 mm localization precision.","PeriodicalId":433345,"journal":{"name":"2018 IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126104788","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Benchmarking Framework for Systematic Evaluation of Compliant Under-Actuated Soft End Effectors in an Industrial Context","authors":"P. Sotiropoulos, M. Roa, M. Martins, W. Friedl, H. Mnyusiwalla, P. Triantafyllou, G. Deacon","doi":"10.1109/HUMANOIDS.2018.8624924","DOIUrl":"https://doi.org/10.1109/HUMANOIDS.2018.8624924","url":null,"abstract":"This paper presents an approach for systematic evaluation of robotic end effectors for an industrial use case that handles delicate, deformable, non-regular objects such as fruits and vegetables. To handle these objects, soft under-actuated hands are the most promising technology so far. However, the approach directions suitable for grasping objects are not usually easy to establish due to the under-actuation effect; therefore, we propose an experimental protocol to serve as framework for data collection, which aims to assess the best directions for grasping using a particular end effector. This protocol, focused on reproducibility and comparability, allows for a better understanding of how a particular hand embodiment influences grasping success for individual products. The protocol can also be used as an effective tool for redesign, and it was applied to evaluate two well-known soft hands (RBO Hand and Pisa/IIT SoftHand) for handling groceries.","PeriodicalId":433345,"journal":{"name":"2018 IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129734300","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Whole-Body End-Pose Planning for Legged Robots on Inclined Support Surfaces in Complex Environments","authors":"Henrique Ferrolho, W. Merkt, Yiming Yang, V. Ivan, S. Vijayakumar","doi":"10.1109/HUMANOIDS.2018.8625026","DOIUrl":"https://doi.org/10.1109/HUMANOIDS.2018.8625026","url":null,"abstract":"Planning balanced whole-body reaching configurations is a fundamental problem in humanoid robotics on which manipulation and locomotion planners depend. While finding valid whole-body configurations in free space and on flat terrains is relatively straightforward, the problem becomes extremely challenging when obstacle avoidance is taken into account, and when balancing on more complex terrains, such as inclined supports or steps. Previous work using Paired Forward-Inverse Dynamic Reachability Maps demonstrated fast end-pose planning on flat terrains at different heights by decomposing the kinematic structure and leveraging combinatorics. In this paper, we present an efficient whole-body end-pose planning framework capable of finding collision-free whole-body configurations in complex environments and on sloped support regions. The main contributions of this paper are twofold: (i) the integration of contact property information of support regions into both the precomputation and online planning stages, including whole-body static equilibrium robustness, and (ii) the proposal of a more informed and meaningful sampling strategy for the lower body. We focus on humanoid robots throughout the paper, but all the principles can be applied to legged platforms other than bipedal robots. We demonstrate our method on the NASA Valkyrie humanoid platform with 38 Degrees of Freedom (DoF) over inclined supports. Analysis of the results indicates both higher success rates – greater than 95% and 80% in obstacle-free and highly cluttered environments, respectively – and shorter computation times compared to previous methods.","PeriodicalId":433345,"journal":{"name":"2018 IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids)","volume":"60 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127149447","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Balance Stabilization with Angular Momentum Damping Derived from the Reaction Null-Space","authors":"Ryotaro Hinata, D. Nenchev","doi":"10.1109/HUMANOIDS.2018.8624933","DOIUrl":"https://doi.org/10.1109/HUMANOIDS.2018.8624933","url":null,"abstract":"A balance stabilizer is proposed that has the capability of absorbing high-energy collisions without reactive stepping. The stabilizer is based on the spatial dynamics formulation and has the unique feature that the trunk rotation can be specified independently of the desired rate of change of the system (centroidal) angular momentum. The formulation is based on the momentum equilibrium principle for floating-base robots and the relativity of angular momentum revealed in the companion paper [1]. The stabilizer injects angular momentum damping via the so-called relative angular acceleration (RAA) derived from the reaction null-space (RNS) of the system. The damping is used to increase the robustness of the balance stabilizer at critical states such as foot roll. It is shown how to embed the RAA stabilizer into a joint-torque controller whereby the motion and force optimization tasks are solved in a single step, yielding a formulation that does not rely upon a general solver. The performance of the controller is examined via simulations whereby external impact-type disturbances are applied to the robot. One part of the impact energy is accommodated via the trunk rotations by lowering the respective PD feedback gains immediately after impact onset. It is then dissipated with higher gains, while recovering the stability of the posture. Another part of the impact energy yields foot roll; this part is dissipated with the angular momentum damping realized through an appropriate arm motion. When in single stance, the angular momentum damping control yields a movement in the swing leg in addition to that in the arms. The motion in the leg injects additional angular momentum damping, such that a high-energy impact can be accommodated that would otherwise require reactive stepping.","PeriodicalId":433345,"journal":{"name":"2018 IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids)","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126385099","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Classifying Obstacles and Exploiting Knowledge About Classes for Efficient Humanoid Navigation","authors":"P. Regier, Andres Milioto, P. Karkowski, C. Stachniss, Maren Bennewitz","doi":"10.1109/HUMANOIDS.2018.8625036","DOIUrl":"https://doi.org/10.1109/HUMANOIDS.2018.8625036","url":null,"abstract":"In this paper, we propose a new approach to humanoid navigation through cluttered environments that exploits knowledge about different obstacle classes and selects appropriate robot actions. To classify objects from RGB images and decide whether an obstacle can be overcome by the robot with a corresponding action, e.g., by pushing or carrying it aside or stepping over or onto it, we train a convolutional neural network (CNN). Based on the associated action costs, we compute a cost grid of the environment on which a 2D path can be efficiently planned. This path encodes the necessary actions that need to be carried out to reach the goal. We implemented our framework in ROS and tested it in various scenarios with a Nao robot. As the experiments demonstrate, using the CNN the robot can robustly classify the observed obstacles into the different classes and exploit this information to efficiently compute solution paths. Our system also finds paths through regions where traditional planning methods cannot compute a solution or require substantially more time.","PeriodicalId":433345,"journal":{"name":"2018 IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128757998","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Natural Oscillation and Optimal Gaits for Humanoid Biped Models","authors":"U. Khan, Zhiyong Chen","doi":"10.1109/HUMANOIDS.2018.8625067","DOIUrl":"https://doi.org/10.1109/HUMANOIDS.2018.8625067","url":null,"abstract":"Gait design for a biped robot is an intriguing problem. The objective is to replicate, in a biped robot, an efficient gait according to the jogging dynamics of a human. This paper aims to find an optimal gait for the jogging dynamics of a biped robot based on a continuous-time nonlinear mathematical model. The nonlinear model is approximated using the describing function method, which requires the gait to be sinusoidal. It is revealed that the natural oscillation of an undamped biped robot is also an optimal gait. The optimal frequency reduces to compensate for damping. The characteristics of the optimal gait are further studied in extensive simulations.","PeriodicalId":433345,"journal":{"name":"2018 IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids)","volume":"102 3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130387105","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Tool-Use Model Considering Tool Selection by a Robot Using Deep Learning","authors":"Namiko Saito, Kitae Kim, Shingo Murata, T. Ogata, S. Sugano","doi":"10.1109/HUMANOIDS.2018.8625048","DOIUrl":"https://doi.org/10.1109/HUMANOIDS.2018.8625048","url":null,"abstract":"We propose a tool-use model that can select tools while requiring neither labeling nor modeling of the environment and actions. With this model, a robot can choose a tool by itself and perform the operation that matches a human command and the environmental situation. To realize this, we use deep learning to train on sensory-motor data recorded during tool selection and tool use as experienced by a robot. The experience includes two types of selection, namely according to function and according to size, thereby allowing the robot to handle both situations. For evaluation, the robot is required to generate motion either in an untrained situation or using an untrained tool. We confirm that the robot can choose and use a tool that is suitable for achieving the target task.","PeriodicalId":433345,"journal":{"name":"2018 IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids)","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116789922","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Shared-Autonomy Control for Intuitive Bimanual Tele-Manipulation","authors":"Marco Laghi, Michele Maimeri, Mathieu Marchand, Clara Leparoux, M. Catalano, A. Ajoudani, A. Bicchi","doi":"10.1109/HUMANOIDS.2018.8625047","DOIUrl":"https://doi.org/10.1109/HUMANOIDS.2018.8625047","url":null,"abstract":"Existing dual-arm teleoperation systems rely on one-to-one coupling of the human and robotic arms to fully exploit the user's dexterity during bimanual tele-manipulation. While individual coordination of the robot end-effectors can be necessary for complex and asymmetric tasks, it may result in a cumbersome user experience during symmetric bimanual tasks (e.g., manipulating and carrying objects). In this paper we propose a novel framework that includes the one-to-one direct control and a new shared-autonomy strategy. The user can autonomously choose between the two, and if the latter is selected, the robots move in a coordinated way, in which desired positions are extrapolated from the movements and gestures of just one user's arm. These gesture commands are interpreted and handled by the controller, with the purpose of unloading the user's cognitive burden. Lastly, the tele-impedance paradigm, i.e., the remote control of robot impedance and position references, is applied to both controls, to improve remote physical interaction performance. The paper reports on the overall proposed architecture, its implementation, and its preliminary validation through a multi-subject experimental campaign.","PeriodicalId":433345,"journal":{"name":"2018 IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids)","volume":"39 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132349960","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Downsizing the Motors of a Biped Robot Using a Hydraulic Direct Drive System","authors":"J. Shimizu, T. Otani, K. Hashimoto, A. Takanishi","doi":"10.1109/HUMANOIDS.2018.8624941","DOIUrl":"https://doi.org/10.1109/HUMANOIDS.2018.8624941","url":null,"abstract":"Biped robots require high power to be delivered alternately to their two legs while walking, hopping, and running. However, mounting high-power, large electric motors is challenging in conventional mechanical transmission systems because of space limitations. To address this issue, we herein employ a combination of hydraulic and mechanical transmission systems with an independent driving mode and a power-shared driving mode. In the independent driving mode, an actuator can be independently controlled based on flow control, and pressure loss can be reduced. In the power-shared driving mode, actuators can also be controlled based on flow control, and this mode allows the motor power of the left and right legs to be shared. We evaluate the proposed system in simulation and confirm that the motor power can be reduced by 35.6% for the hopping movement. This result shows that the rated output of the required motor can be reduced, and smaller, lighter motors can be selected for installation in biped robots.","PeriodicalId":433345,"journal":{"name":"2018 IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids)","volume":"70 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132561981","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Disambiguating Affective Stimulus Associations for Robot Perception and Dialogue","authors":"Henrique Siqueira, A. Sutherland, Pablo V. A. Barros, Matthias Kerzel, S. Magg, S. Wermter","doi":"10.1109/HUMANOIDS.2018.8625012","DOIUrl":"https://doi.org/10.1109/HUMANOIDS.2018.8625012","url":null,"abstract":"Effectively recognising and applying emotions to interactions is a highly desirable trait for social robots. Implicitly understanding how subjects experience different kinds of actions and objects in the world is crucial for natural HRI, with the possibility to perform positive actions and avoid negative actions. In this paper, we utilize the NICO robot's appearance and capabilities to give it the ability to model a coherent affective association between a perceived auditory stimulus and a temporally asynchronous emotion expression. This is done by combining evaluations of emotional valence from vision and language. NICO uses this information to make decisions about when to extend conversations in order to accrue more affective information if the representation of the association is not coherent. Our primary contribution is providing a NICO robot with the ability to learn the affective associations between a perceived auditory stimulus and an emotional expression. NICO is able to do this for both individual subjects and specific stimuli, with the aid of an emotion-driven dialogue system that rectifies incoherences in emotional expression. The robot is then able to use this information to determine a subject's enjoyment of perceived auditory stimuli in a real HRI scenario.","PeriodicalId":433345,"journal":{"name":"2018 IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids)","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129146160","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}