Superhuman Performance in Tactile Material Classification and Differentiation with a Flexible Pressure-Sensitive Skin

A. Tulbure, B. Bäuml
{"title":"Superhuman Performance in Tactile Material Classification and Differentiation with a Flexible Pressure-Sensitive Skin","authors":"A. Tulbure, B. Bäuml","doi":"10.1109/HUMANOIDS.2018.8624987","DOIUrl":null,"url":null,"abstract":"In this paper, we show that a robot equipped with a flexible and commercially available tactile skin can exceed human performance in the challenging tasks of material classification, i.e., uniquely identifying a given material by touch alone, and of material differentiation, i.e., deciding if the materials in a given pair of materials are the same or different. For processing the high dimensional spatio-temporal tactile signal, we use a new tactile deep learning network architecture TactNet-II which is based on TactNet [1] and is significantly extended with recently described architectural enhancements and training methods. TactN et- Iireaches an accuracy for the material classification task as high as 95.0 %. For the material differentiation a new Siamese network based architecture is presented which reaches an accuracy as high as 95.4 %. All the results have been achieved on a new challenging dataset of 36 everyday household materials. In a thorough human performance experiment with 15 subjects we show that the human performance is significantly lower than the robot's performance for both tactile tasks.","PeriodicalId":433345,"journal":{"name":"2018 IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids)","volume":"27 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/HUMANOIDS.2018.8624987","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 8

Abstract

In this paper, we show that a robot equipped with a flexible and commercially available tactile skin can exceed human performance in the challenging tasks of material classification, i.e., uniquely identifying a given material by touch alone, and of material differentiation, i.e., deciding whether the materials in a given pair are the same or different. For processing the high-dimensional spatio-temporal tactile signal, we use a new tactile deep learning network architecture, TactNet-II, which is based on TactNet [1] and is significantly extended with recently described architectural enhancements and training methods. TactNet-II reaches an accuracy for the material classification task as high as 95.0%. For the material differentiation task, a new Siamese-network-based architecture is presented, which reaches an accuracy as high as 95.4%. All results have been achieved on a new challenging dataset of 36 everyday household materials. In a thorough human performance experiment with 15 subjects, we show that human performance is significantly lower than the robot's performance on both tactile tasks.
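The abstract describes two components: a deep network (TactNet-II) that classifies a material from its spatio-temporal pressure signal, and a Siamese architecture that compares two such signals to decide "same" or "different". The sketch below illustrates this general structure only; it is not the authors' TactNet-II. The layer sizes, the 16x16 taxel grid, the 50-frame window, and all class/module names are illustrative assumptions.

```python
# Hedged sketch of the two tactile tasks: a shared spatio-temporal encoder,
# a classification head (36 material classes as in the paper's dataset),
# and a Siamese head that compares two touch recordings by embedding distance.
# This is NOT the published TactNet-II architecture; all sizes are assumptions.
import torch
import torch.nn as nn


class TactileEncoder(nn.Module):
    """Small 3D-CNN over a tactile sequence shaped (batch, 1, frames, rows, cols)."""

    def __init__(self, embed_dim: int = 64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),  # global pooling over time and taxel grid
            nn.Flatten(),
        )
        self.embed = nn.Linear(32, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.embed(self.features(x))


class MaterialClassifier(nn.Module):
    """Classification head: one logit per material class."""

    def __init__(self, encoder: TactileEncoder, num_classes: int = 36):
        super().__init__()
        self.encoder = encoder
        self.head = nn.Linear(encoder.embed.out_features, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.encoder(x))


class SiameseDifferentiator(nn.Module):
    """Differentiation head: shared encoder, distance between the two embeddings."""

    def __init__(self, encoder: TactileEncoder):
        super().__init__()
        self.encoder = encoder

    def forward(self, x_a: torch.Tensor, x_b: torch.Tensor) -> torch.Tensor:
        # Small distance suggests "same material", large distance "different".
        return torch.norm(self.encoder(x_a) - self.encoder(x_b), dim=1)


if __name__ == "__main__":
    enc = TactileEncoder()
    clf = MaterialClassifier(enc)
    sia = SiameseDifferentiator(enc)
    a = torch.randn(2, 1, 50, 16, 16)  # 2 recordings, 50 frames, 16x16 taxels (assumed shape)
    b = torch.randn(2, 1, 50, 16, 16)
    print(clf(a).shape)   # (2, 36) class logits
    print(sia(a, b).shape)  # (2,) pairwise embedding distances
```

In such a setup the classifier would typically be trained with cross-entropy over the 36 classes, and the Siamese head with a pair-based loss (e.g. contrastive loss) on same/different pairs; the paper's actual training procedure is not reproduced here.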