Morphology-Specific Convolutional Neural Networks for Tactile Object Recognition with a Multi-Fingered Hand

Satoshi Funabashi, Gang Yan, A. Geier, A. Schmitz, T. Ogata, S. Sugano
{"title":"Morphology-Specific Convolutional Neural Networks for Tactile Object Recognition with a Multi-Fingered Hand","authors":"Satoshi Funabashi, Gang Yan, A. Geier, A. Schmitz, T. Ogata, S. Sugano","doi":"10.1109/ICRA.2019.8793901","DOIUrl":null,"url":null,"abstract":"Distributed tactile sensors on multi-fingered hands can provide high-dimensional information for grasping objects, but it is not clear how to optimally process such abundant tactile information. The current paper explores the possibility of using a morphology-specific convolutional neural network (MS-CNN). uSkin tactile sensors are mounted on an Allegro Hand, which provides 720 force measurements (15 patches of uSkin modules with 16 triaxial force sensors each) in addition to 16 joint angle measurements. Consecutive layers in the CNN get input from parts of one finger segment, one finger, and the whole hand. Since the sensors give 3D (x, y, z) vector tactile information, inputs with 3 channels (x, y and z) are used in the first layer, based on the idea of such inputs for RGB images from cameras. Overall, the layers are combined, resulting in the building of a tactile map based on the relative position of the tactile sensors on the hand. Seven different combination variations were evaluated, and an over-95% object recognition rate with 20 objects was achieved, even though only one random time instance from a repeated squeezing motion of an object in an unknown pose within the hand was used as input.","PeriodicalId":6730,"journal":{"name":"2019 International Conference on Robotics and Automation (ICRA)","volume":"48 9 1","pages":"57-63"},"PeriodicalIF":0.0000,"publicationDate":"2019-05-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"16","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 International Conference on Robotics and Automation (ICRA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICRA.2019.8793901","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 16

Abstract

Distributed tactile sensors on multi-fingered hands can provide high-dimensional information for grasping objects, but it is not clear how to optimally process such abundant tactile information. This paper explores the use of a morphology-specific convolutional neural network (MS-CNN). uSkin tactile sensors are mounted on an Allegro Hand, providing 720 force measurements (15 patches of uSkin modules with 16 triaxial force sensors each) in addition to 16 joint angle measurements. Consecutive layers in the CNN receive input from parts of one finger segment, one finger, and the whole hand. Since the sensors provide 3D (x, y, z) force vectors, the first layer uses 3-channel inputs (x, y, and z), analogous to the RGB channels of camera images. The layers are then combined hierarchically, building a tactile map based on the relative positions of the tactile sensors on the hand. Seven different combination variants were evaluated, and an object recognition rate of over 95% on 20 objects was achieved, even though only a single random time instance from a repeated squeezing motion of an object in an unknown pose within the hand was used as input.
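To make the hierarchy described in the abstract concrete, below is a minimal sketch of such a morphology-specific network, written in PyTorch. It is not the authors' implementation: the patch-to-finger grouping (4, 4, 4, 3 patches across four fingers), the layer widths, and the kernel sizes are illustrative assumptions not given in the abstract. It only shows the three-stage structure: a shared convolution over each 4x4 taxel patch with the (x, y, z) force components as input channels, a per-finger fusion of that finger's patch features, and a whole-hand fusion that also takes the 16 joint angles before classifying 20 objects.

```python
# Minimal sketch (not the authors' code) of a morphology-specific CNN.
# Assumptions: 15 uSkin patches of 4x4 triaxial taxels, 4 fingers with a
# hypothetical (4, 4, 4, 3) patch grouping, illustrative layer sizes.
import torch
import torch.nn as nn

class MorphologySpecificCNN(nn.Module):
    def __init__(self, patches_per_finger=(4, 4, 4, 3), num_classes=20):
        super().__init__()
        self.patches_per_finger = patches_per_finger
        # Stage 1: per-patch convolution over the 4x4 taxel grid; the three
        # force components (x, y, z) are treated as input channels, like RGB.
        self.patch_conv = nn.Conv2d(in_channels=3, out_channels=8,
                                    kernel_size=3, padding=1)
        # Stage 2: per-finger fusion of that finger's patch features.
        self.finger_fcs = nn.ModuleList(
            [nn.Linear(8 * 4 * 4 * p, 64) for p in patches_per_finger]
        )
        # Stage 3: whole-hand fusion, concatenating all finger features
        # with the 16 joint angles.
        self.hand_fc = nn.Linear(64 * len(patches_per_finger) + 16, 128)
        self.classifier = nn.Linear(128, num_classes)
        self.act = nn.ReLU()

    def forward(self, tactile, joints):
        # tactile: (batch, 15, 3, 4, 4) triaxial readings per patch
        # joints:  (batch, 16) joint angles
        b = tactile.shape[0]
        feats = self.act(self.patch_conv(tactile.flatten(0, 1)))  # (b*15, 8, 4, 4)
        feats = feats.view(b, 15, -1)                             # (b, 15, 128)
        finger_feats, start = [], 0
        for fc, p in zip(self.finger_fcs, self.patches_per_finger):
            group = feats[:, start:start + p].flatten(1)          # one finger's patches
            finger_feats.append(self.act(fc(group)))
            start += p
        hand = self.act(self.hand_fc(torch.cat(finger_feats + [joints], dim=1)))
        return self.classifier(hand)

# Example forward pass with random data shaped like one time instance of a grasp.
model = MorphologySpecificCNN()
logits = model(torch.randn(2, 15, 3, 4, 4), torch.randn(2, 16))
print(logits.shape)  # torch.Size([2, 20])
```

The key design choice the sketch mirrors is that each layer only sees sensors that are physically close on the hand (patch, then finger, then whole hand), so the learned features follow the hand's morphology rather than treating the 720 readings as an unstructured vector.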