Tactile Grasp Stability Classification Based on Graph Convolutional Networks

Tingting Mi, D. Que, Senlin Fang, Zhenning Zhou, Chaoxiang Ye, Chengliang Liu, Zhengkun Yi, Xinyu Wu

2021 IEEE International Conference on Real-time Computing and Robotics (RCAR), published 2021-07-15. DOI: 10.1109/RCAR52367.2021.9517085
One of the challenges in robotic grasping of unknown objects is predicting, at the onset of the grasp, whether the object will be dropped. Evaluating the robotic grasp state accurately and efficiently is therefore a key step toward addressing this issue. In this paper, building on different fusion strategies for multi-sensor tactile signals, we propose two novel methods based on Graph Convolutional Networks (GCN) for robotic grasp stability classification: GCN with data-level fusion (GCN-DF) and GCN with feature-level fusion (GCN-FF). We also explore the optimal parameters for transforming the sensor signals into a graph structure. We verify the effectiveness of the proposed methods on the BioTac Grasp Stability (BiGS) dataset, and the experimental results show that both approaches achieve higher classification accuracy than Support Vector Machine (SVM) and Long Short-Term Memory (LSTM) baselines.
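To make the general idea concrete, below is a minimal sketch of a GCN-based grasp stability classifier in PyTorch. This is an illustrative reconstruction, not the authors' implementation: the graph construction (nodes as tactile electrodes with a hand-picked adjacency), the feature dimensions, and the network depth are all assumptions chosen for demonstration, and the real GCN-DF/GCN-FF variants differ in how the multi-sensor signals are fused before or after the graph convolutions.

```python
# Minimal sketch of a GCN-based grasp stability classifier (binary: stable / slip).
# NOT the paper's implementation: graph construction, dimensions, and depth are
# assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


def normalized_adjacency(adj: torch.Tensor) -> torch.Tensor:
    """Symmetrically normalize A + I, as in the standard GCN formulation."""
    a_hat = adj + torch.eye(adj.size(0))
    deg = a_hat.sum(dim=1)
    d_inv_sqrt = torch.diag(deg.pow(-0.5))
    return d_inv_sqrt @ a_hat @ d_inv_sqrt


class GCNLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor, a_norm: torch.Tensor) -> torch.Tensor:
        # Propagate node features over the graph, then apply a shared linear map.
        return F.relu(self.linear(a_norm @ x))


class GraspStabilityGCN(nn.Module):
    """Two GCN layers, global mean pooling, and a linear read-out (2 classes)."""

    def __init__(self, in_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.gcn1 = GCNLayer(in_dim, hidden_dim)
        self.gcn2 = GCNLayer(hidden_dim, hidden_dim)
        self.classifier = nn.Linear(hidden_dim, 2)

    def forward(self, x: torch.Tensor, a_norm: torch.Tensor) -> torch.Tensor:
        h = self.gcn2(self.gcn1(x, a_norm), a_norm)
        return self.classifier(h.mean(dim=0))  # graph-level prediction


if __name__ == "__main__":
    # Assumed toy setup: 19 electrode nodes, each with a 50-step signal window.
    num_nodes, feat_dim = 19, 50
    adj = (torch.rand(num_nodes, num_nodes) > 0.7).float()
    adj = ((adj + adj.t()) > 0).float()  # toy symmetric adjacency; real edges would be spatial neighbors
    x = torch.randn(num_nodes, feat_dim)  # per-node tactile signal features
    model = GraspStabilityGCN(in_dim=feat_dim)
    logits = model(x, normalized_adjacency(adj))
    print(logits.shape)  # torch.Size([2])
```

In such a setup, data-level fusion would roughly correspond to merging signals from multiple sensors into a single graph before the convolutions, whereas feature-level fusion would process each sensor's graph separately and concatenate the pooled embeddings before the classifier; the exact constructions used in the paper are described in the full text.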