Bahareh Abbasi, M. Sharifzadeh, E. Noohi, S. Parastegari, M. Žefran
Grasp Taxonomy for Robot Assistants Inferred from Finger Pressure and Flexion
2019 International Symposium on Medical Robotics (ISMR)
DOI: 10.1109/ISMR.2019.8710191
Published: 2019-04-03
Citations: 2
Abstract
Grasping is an integral part of manipulation actions in activities of daily living, and programming by demonstration is a powerful paradigm for teaching assistive robots how to perform a grasp. Since finger configuration and finger force are the fundamental quantities that must be controlled during a grasp, these variables are a natural choice for learning by demonstration. An important question then becomes whether the existing grasp taxonomies remain appropriate when one considers these modalities. The goal of our paper is to answer this question by investigating grasp patterns that can be inferred from a static analysis of the grasp data, recorded while the object is securely grasped. Human grasp data is measured using a newly developed data glove. The data includes pressure sensor measurements from eighteen areas of the hand and measurements from bend sensors placed at the finger joints. The pressure sensor measurements are calibrated and mapped to force using a novel data-driven approach. Unsupervised learning is used to identify patterns for different grasp types, with multiple clustering algorithms applied to partition the data. When the results are taken in aggregate, 25 human grasp types are reduced to 9 distinct clusters.
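The clustering step described in the abstract can be illustrated with a minimal sketch: feature vectors combining pressure and flexion readings are standardized and partitioned with k-means. Everything below is hypothetical, as the paper's actual pipeline is not given here: the data is synthetic, the bend-sensor count is assumed, and k = 9 is taken only from the reported cluster count (the paper itself compares multiple clustering algorithms, not just k-means).

```python
# Hypothetical sketch of the unsupervised clustering step.
# Feature vectors: 18 pressure readings (per the abstract) plus an
# assumed 10 bend-sensor readings per grasp sample. Data is synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

n_grasps, n_pressure, n_bend = 250, 18, 10  # bend-sensor count is an assumption

# Synthetic samples drawn around 9 hypothetical grasp prototypes
prototypes = rng.uniform(0.0, 1.0, size=(9, n_pressure + n_bend))
labels_true = rng.integers(0, 9, size=n_grasps)
X = prototypes[labels_true] + 0.05 * rng.standard_normal(
    (n_grasps, n_pressure + n_bend)
)

# Standardize so force and flexion features contribute on the same scale
X_std = StandardScaler().fit_transform(X)

# Partition the standardized features into 9 clusters
km = KMeans(n_clusters=9, n_init=10, random_state=0).fit(X_std)
print(sorted(np.bincount(km.labels_)))
```

In practice one would run several algorithms (e.g. hierarchical or spectral clustering alongside k-means) and aggregate their partitions, as the abstract indicates, rather than rely on a single k-means run.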